Starting from Prove $\det(I+xy^T+uv^T)=(1+y^Tx)(1+v^Tu)-(x^Tv)(y^Tu)$, for example, is it possible to generalize as follows?
If $w_i$, $1\leq i\leq n$, is a basis of $\mathbb{R}^n$, is there a closed formula for $\det\left(nI_n-\sum_{i=1}^n w_i\otimes w_i\right)$?
For example, it is very easy if $w_i$ is the standard basis.
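As a quick numerical sanity check of the easy case (a sketch, assuming NumPy): when the $w_i$ are the standard basis vectors, $\sum_i w_iw_i^T=I_n$, so the determinant is $\det((n-1)I_n)=(n-1)^n$.

```python
import numpy as np

# Easy case: w_i = e_i (standard basis), so sum_i w_i w_i^T = I_n
# and det(n I_n - I_n) = det((n-1) I_n) = (n-1)^n.
n = 4
W = np.eye(n)                      # columns are the standard basis vectors
M = n * np.eye(n) - W @ W.T        # n I_n - sum_i w_i w_i^T
print(np.isclose(np.linalg.det(M), (n - 1) ** n))  # True
```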
The formula you linked to is just Leibniz's formula in disguise. If you put $U=\pmatrix{y&v},\,V=\pmatrix{x&u}$ and $A=U^TV=\pmatrix{a&b\\ c&d}$, then \begin{aligned} \det(I_n+xy^T+uv^T) &=\det(I_n+VU^T)=\det(I_2+U^TV)=\det(I_2+A)\\ &=(1+a)(1+d)-bc\\ &=(1+y^Tx)(1+v^Tu)-(y^Tu)(v^Tx). \end{aligned} This, of course, can be generalised to higher dimensions because Leibniz's formula also works in higher dimensions, but from a computational point of view, the practical usefulness of this formula diminishes quickly as the matrices grow larger.
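The rank-two identity above is easy to verify numerically (a minimal sketch, assuming NumPy and random test vectors):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x, y, u, v = (rng.standard_normal(n) for _ in range(4))

# Left side: det(I_n + x y^T + u v^T)
lhs = np.linalg.det(np.eye(n) + np.outer(x, y) + np.outer(u, v))
# Right side: (1 + y^T x)(1 + v^T u) - (y^T u)(v^T x)
rhs = (1 + y @ x) * (1 + v @ u) - (y @ u) * (v @ x)
print(np.isclose(lhs, rhs))  # True
```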
In general, suppose $U$ and $V$ are two $m\times n$ matrices. Denote the $j$-th columns of $U$ and $V$ by $u_j$ and $v_j$ respectively, and let $A=U^TV$. Since $A$ has rank at most $\min(m,n)$, all principal minors of order greater than $\min(m,n)$ vanish, and we get \begin{aligned} \det(xI_n-U^TV)&=\det(xI-A)\\ &=\sum_{r=0}^n\sum_{|J|=r}(-1)^r\det\left(A(J,J)\right)x^{n-r}\\ &=\sum_{r=0}^{\min(m,n)}\sum_{|J|=r}(-1)^r\det\left(A(J,J)\right)x^{n-r}\\ &=\sum_{r=0}^{\min(m,n)}\sum_{|J|=r}(-1)^r \sum_{\sigma\in S_r}\operatorname{sign}(\sigma)\prod_{i=1}^r a_{J(i),J(\sigma(i))} x^{n-r}\\ &=\sum_{r=0}^{\min(m,n)}\sum_{|J|=r}(-1)^r \sum_{\sigma\in S_r}\operatorname{sign}(\sigma)\prod_{i=1}^r u_{J(i)}^Tv_{J(\sigma(i))} x^{n-r}. \end{aligned} Using the identity $x^m\det(xI_n-U^TV)=x^n\det(xI_m-VU^T)$, one obtains $\det(xI_m-VU^T)$ as well.
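Both the principal-minor expansion and the companion identity can be checked numerically. Here is a sketch, assuming NumPy and random $U$, $V$; the truncation at $\min(m,n)$ works because larger principal minors of $A=U^TV$ vanish:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
m, n = 3, 5
U = rng.standard_normal((m, n))
V = rng.standard_normal((m, n))
A = U.T @ V                        # n x n, rank <= min(m, n)
x = 1.7

# Sum over principal minors A(J, J) with |J| = r <= min(m, n).
poly = 0.0
for r in range(min(m, n) + 1):
    for J in combinations(range(n), r):
        J = list(J)
        # np.linalg.det of a 0x0 matrix is 1.0, covering the r = 0 term.
        poly += (-1) ** r * np.linalg.det(A[np.ix_(J, J)]) * x ** (n - r)

lhs = np.linalg.det(x * np.eye(n) - A)
print(np.isclose(poly, lhs))       # True

# Companion identity: x^m det(xI_n - U^T V) = x^n det(xI_m - V U^T)
rhs = np.linalg.det(x * np.eye(m) - U @ V.T)
print(np.isclose(x ** m * lhs, x ** n * rhs))  # True
```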