Obviously, planar rotation matrices satisfy this requirement since they can be parameterized as $$\begin{pmatrix}\cos(\theta)&\sin(\theta)\\-\sin(\theta)&\cos(\theta)\end{pmatrix}$$
However, "hyperbolic rotation" matrices also satisfy this requirement since $$\begin{pmatrix}\cosh(\theta)&\sinh(\theta)\\\sinh(\theta)&\cosh(\theta)\end{pmatrix}\begin{pmatrix}\cosh(\theta)&-\sinh(\theta)\\-\sinh(\theta)&\cosh(\theta)\end{pmatrix}=I$$
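As a quick numerical sanity check (a NumPy sketch, not part of the argument), both families satisfy the property that negating the off-diagonal entries yields the inverse:

```python
import numpy as np

theta = 0.7

# Planar rotation: negating the off-diagonal entries yields the inverse.
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
R_tilde = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta), np.cos(theta)]])
assert np.allclose(R @ R_tilde, np.eye(2))

# Hyperbolic rotation: same property.
H = np.array([[np.cosh(theta), np.sinh(theta)],
              [np.sinh(theta), np.cosh(theta)]])
H_tilde = np.array([[np.cosh(theta), -np.sinh(theta)],
                    [-np.sinh(theta), np.cosh(theta)]])
assert np.allclose(H @ H_tilde, np.eye(2))
```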
Are there more matrices in this class? Does this sort of matrix have a name?
Some digging about this question:
In general, the examples $A:=\left[\begin{smallmatrix}0&-1\\1&0\end{smallmatrix}\right]$ and $B:=\left[\begin{smallmatrix}1&0\\0&-1\end{smallmatrix}\right]$ show that these matrices do not form a group under matrix multiplication or matrix addition: both $A$ and $B$ satisfy the condition, but $AB$ does not, and $A+B$ is singular. I don't know whether these matrices have a name (probably not, since they don't form a group under either operation), but the condition for $n\times n$ matrices can be stated as
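The failure of closure can be checked numerically; here is a small NumPy sketch using the two matrices above:

```python
import numpy as np

def in_class(M):
    """Check the condition M(2D - M) = I, where D is the diagonal part of M."""
    D = np.diag(np.diag(M))
    return np.allclose(M @ (2 * D - M), np.eye(M.shape[0]))

A = np.array([[0., -1.], [1., 0.]])
B = np.array([[1., 0.], [0., -1.]])

assert in_class(A) and in_class(B)
assert not in_class(A @ B)   # not closed under multiplication
assert not in_class(A + B)   # A + B is even singular here
```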
$$A(2D-A)=2AD-A^2=I\tag1$$
where $D$ is the diagonal part of $A$, i.e. the diagonal matrix with the same diagonal entries as $A$. Since $(1)$ shows that $A$ is invertible, multiplying by $A^{-1}$ gives
$$2D=A+A^{-1}\implies AD=DA\implies a_{k,k}a_{j,k}=a_{j,j}a_{j,k},\quad\forall j,k\in\{1,\ldots,n\}\tag2$$
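The identity $2D=A+A^{-1}$ (and hence the commutation $AD=DA$) can be sanity-checked on the hyperbolic example, as a NumPy sketch:

```python
import numpy as np

theta = 0.3
A = np.array([[np.cosh(theta), np.sinh(theta)],
              [np.sinh(theta), np.cosh(theta)]])
D = np.diag(np.diag(A))  # diagonal part of A

# 2D = A + A^{-1}
assert np.allclose(2 * D, A + np.linalg.inv(A))
# hence A commutes with D (A commutes with A + A^{-1})
assert np.allclose(A @ D, D @ A)
```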
From here we can distinguish two cases:
1. $A$ is a diagonal matrix: if $A$ is diagonal then $D=A$, so equation $(1)$ reduces to $D^2=I$, i.e. every diagonal entry is $\pm1$, which is easy to handle and analyze.
2. $A$ is not a diagonal matrix: then there is some $a_{j,k}\neq 0$ with $j\neq k$, and from $(2)$ this implies that $a_{j,j}=a_{k,k}$. Some special cases that are easier to handle are the following:
2.1. Simple non-zero diagonal: if there is an $a_{j,j}\neq 0$ for some $j\in\{1,\ldots,n\}$ and a collection of $n-1$ coefficients $a_{j,k}\neq 0$ such that the pairs $(j,k)$ define a connected graph on $n$ vertices, then $D=\lambda I$ for some $\lambda\neq 0$. Proof: the connected graph of coefficients propagates the condition stated in $(2)$, forcing all the $a_{k,k}$ to be equal. For example, this condition is fulfilled when $A$ has some row or column with all coefficients non-zero.
2.2. Zero diagonal: if $D=0$ then $(1)$ simplifies to $A^2=-I$, which is equivalent to saying that $A^{-1}=-A$, and is easier to handle and analyze.
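Case 2.2 is illustrated by the matrix $A$ from the non-closure example above; a minimal NumPy check:

```python
import numpy as np

A = np.array([[0., -1.], [1., 0.]])  # zero diagonal, so D = 0

assert np.allclose(A @ A, -np.eye(2))      # A^2 = -I
assert np.allclose(np.linalg.inv(A), -A)   # equivalently, A^{-1} = -A
```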
All that follows is for case 2.1. From $(1)$ we find the following characterizations for this subset of matrices:
$$A^2+I=\eta A\quad\text{ or }\quad (A+ I)^2=\mu A\tag3$$
where $\eta:=2\lambda$ and $\mu:=2(1+\lambda)$. Also from $(3)$ we find that $$p(x):=x^2-2\lambda x+1=(x-\lambda-\sqrt{\lambda^2-1})(x-\lambda+\sqrt{\lambda^2-1})\tag4$$ is a multiple of the minimal polynomial of $A$ because $p(A)=0$.
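For the hyperbolic rotation (which falls under case 2.1 with $\lambda=\cosh\theta$), the annihilation $p(A)=0$ and the two roots of $p$ can be verified numerically; a NumPy sketch:

```python
import numpy as np

theta = 0.5
lam = np.cosh(theta)  # constant diagonal, D = lam * I
A = np.array([[np.cosh(theta), np.sinh(theta)],
              [np.sinh(theta), np.cosh(theta)]])

# p(A) = A^2 - 2*lam*A + I vanishes
assert np.allclose(A @ A - 2 * lam * A + np.eye(2), 0)

# the roots lam ± sqrt(lam^2 - 1) = e^{±theta} are the eigenvalues of A
s = np.sqrt(lam**2 - 1)
assert np.allclose(sorted([lam - s, lam + s]), sorted(np.exp([-theta, theta])))
```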
Then we also know that $A$ has at most two distinct eigenvalues (none of which is zero, since otherwise $A$ would not be invertible). Also, from the trace of $A$ we know that:
if $A$ has only one eigenvalue then its multiplicity is obviously $n$ and its value is $\lambda$, because $\operatorname{trace}(A)=\lambda n$. In this case, from $(4)$, we find that $\lambda=1$ or $\lambda=-1$.
if $A$ has two distinct eigenvalues then $$\operatorname{trace}(A)=\lambda n=(n-k)r_1+kr_2$$ for the eigenvalues $r_1$ and $r_2$ and some $k\in\{1,\ldots,n-1\}$, where from $(4)$ it can be seen that $r_1=\lambda+\sqrt{\lambda^2-1}$, $r_2=\lambda-\sqrt{\lambda^2-1}$, and $r_1\cdot r_2=1$, so $r_1=r_2^{-1}$.
Putting it all together we find that $$\operatorname{trace}(A)=\lambda n=\lambda n+n\sqrt{\lambda^2-1}-2k\sqrt{\lambda^2-1}\implies n=2k$$ which implies that the two eigenvalues $r_1$ and $r_2$ have the same multiplicity, and consequently $\det(A)=r_1^k\cdot r_2^k=1$. From here we also conclude that if $n$ is odd then $A$ necessarily has only one eigenvalue, because $n/2\notin\Bbb N$.
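The equal-multiplicity and determinant claims can be sanity-checked on a $4\times4$ example built from two hyperbolic blocks (a NumPy sketch; the block-diagonal construction is just a convenient way to get a matrix in the class with $n=4$):

```python
import numpy as np

theta = 0.4
H = np.array([[np.cosh(theta), np.sinh(theta)],
              [np.sinh(theta), np.cosh(theta)]])
A = np.kron(np.eye(2), H)  # n = 4, constant diagonal lam = cosh(theta)
D = np.diag(np.diag(A))
assert np.allclose(A @ (2 * D - A), np.eye(4))  # A is in the class

vals = np.sort(np.linalg.eigvals(A).real)
# two eigenvalues e^{-theta} and e^{theta}, each with multiplicity n/2 = 2
assert np.allclose(vals, [np.exp(-theta)] * 2 + [np.exp(theta)] * 2)
assert np.isclose(np.linalg.det(A), 1.0)  # det(A) = (r1*r2)^k = 1
```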
Also, if $\lambda\in(-1,1)\setminus\{0\}$ then $\lambda^2-1<0$, so $r_1=\overline{r_2}$ and consequently $r_k\overline{r_k}=|r_k|^2=r_1r_2=1$, hence $r_1,r_2\in\Bbb S^1$.
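The planar rotation from the question is exactly this situation, with $\lambda=\cos\theta\in(-1,1)$; a NumPy check:

```python
import numpy as np

theta = 1.0
lam = np.cos(theta)  # lam in (-1, 1) \ {0}
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

vals = np.linalg.eigvals(R)  # cos(theta) ± i*sin(theta)
assert np.allclose(np.abs(vals), 1.0)           # eigenvalues on the unit circle
assert np.isclose(np.sum(vals).real, 2 * lam)   # trace = 2*lam
```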
Finally, from $(4)$ we can see that, in any case, at least one eigenvalue can be chosen freely from $\Bbb C\setminus\{0\}$: given any $r\neq0$, setting $\lambda=\frac12(r+r^{-1})$ makes $r$ a root of $p$.
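For a real eigenvalue $r>1$ this choice of $\lambda$ is realized concretely by a hyperbolic rotation with $\cosh\theta=\lambda$; a NumPy sketch:

```python
import numpy as np

r = 3.0                  # eigenvalue chosen freely (a real example)
lam = (r + 1 / r) / 2    # then r is a root of p(x) = x^2 - 2*lam*x + 1
assert np.isclose(r**2 - 2 * lam * r + 1, 0)

# a concrete matrix in the class with eigenvalue r
theta = np.arccosh(lam)
A = np.array([[np.cosh(theta), np.sinh(theta)],
              [np.sinh(theta), np.cosh(theta)]])
assert np.isclose(np.max(np.linalg.eigvals(A).real), r)
```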