The tanh activation function is:
$$\tanh \left( x \right) = 2 \cdot \sigma \left( 2 x \right) - 1$$
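For reference, this identity can be spot-checked numerically (a minimal sketch; `sigmoid` here is just the logistic function, defined inline):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^{-x})."""
    return 1.0 / (1.0 + math.exp(-x))

# Spot-check the identity tanh(x) = 2*sigmoid(2x) - 1 at a few points.
for x in (-3.0, -0.5, 0.0, 1.7):
    assert math.isclose(math.tanh(x), 2 * sigmoid(2 * x) - 1)
```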
Is there a corresponding formula for variable pairs?
$$ \tanh(x,y) =?$$
My uneducated guess is that it would somehow involve a bivariate covariance matrix $\Sigma$ in place of the univariate $\sigma$.
A function $a\in\Bbb R^{\Bbb R}$ is used as a neural network's activation function via the multivariate extension $z_i=a((Wx+b)_i)=a\left(\sum_jW_{ij}x_j+b_i\right)$, where the parameters $b_i,\,W_{ij}$ are updated by backpropagation. So $\tanh(x,\,y)$ would mean $\tanh(W_{ix}x+W_{iy}y+b_i)$.
In particular, $\tanh q=2\sigma(2q)-1$ remains true; it's just that the scalar $q$ is now obtained from the dot product above. Regarding $a$ as acting componentwise to return the vector $z$, and writing $e$ for a vector of $1$s ($\mathbf 1$ is also common but might be confusing here), the vector version is $\tanh q=2\sigma(2q)-e$.
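A minimal sketch of this in NumPy, assuming a small hypothetical layer with two inputs $(x,y)$ and three units (the shapes and random weights are illustrative, not from the question):

```python
import numpy as np

def sigmoid(q):
    """Logistic sigmoid, applied elementwise."""
    return 1.0 / (1.0 + np.exp(-q))

# Hypothetical layer: 2 inputs (x, y) -> 3 units.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))   # weights W_ij
b = rng.normal(size=3)        # biases b_i
xy = np.array([0.5, -1.2])    # the input pair (x, y)

q = W @ xy + b                # pre-activation: q_i = W_ix * x + W_iy * y + b_i
z = np.tanh(q)                # activation applied componentwise

# Vector identity: tanh(q) = 2*sigmoid(2q) - e, with e a vector of ones.
e = np.ones_like(q)
assert np.allclose(z, 2 * sigmoid(2 * q) - e)
```

So "$\tanh(x,y)$" is not a new bivariate function: the pair is first collapsed to a scalar per unit by the affine map, and the ordinary $\tanh$ is applied to each resulting component.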