This question is related to the previous one. Consider $n$ variables $x_1,x_2,\ldots,x_n$ and the following $n\times n$ matrix:
$$ A=\begin{bmatrix} 1 & \cdots & 1 \\ x_2 + x_3 + \dots + x_n & \cdots & x_1 + x_2 + \dots + x_{n-1} \\ x_2 x_3 + x_2 x_4 + \dots + x_{n-1}x_n & \cdots & x_1 x_2 + x_1 x_3 + \dots + x_{n-2}x_{n-1} \\ \vdots & & \vdots\\ x_2 x_3 \cdots x_n & \cdots & x_1 x_2 \cdots x_{n-1} \\ \end{bmatrix}. $$ For $i>1$, the entry $a_{ij}$ is the elementary symmetric polynomial of degree $i-1$ in the variables $x_k$, with $x_j$ omitted; that is, $x_j$ does not appear in any term of column $j$. Formally, $$ a_{ij}=\sum_{\substack{k_1<\cdots<k_{i-1}\\ k_t\ne j \text{ for all } t}} x_{k_1}x_{k_2}\cdots x_{k_{i-1}}. $$
Of course, when $x_i=x_j$ for some $i\ne j$, two columns of $A$ are equal and $A$ becomes singular. But is this the only way to get $\det A=0$?
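For experimentation, the matrix can be generated symbolically; here is a small sketch (assuming SymPy, with a hypothetical helper `build_A`) that builds $A$ from the definition above and confirms that setting $x_1=x_2$ makes it singular:

```python
# Build A: a_{ij} is the elementary symmetric polynomial of degree i-1
# in x_1, ..., x_n with x_j omitted (indices here are 0-based).
import sympy as sp
from itertools import combinations

def build_A(n):
    xs = sp.symbols(f"x1:{n + 1}")  # x1, ..., xn

    def e(deg, vs):
        # Elementary symmetric polynomial of degree `deg` in the variables `vs`.
        return sp.Add(*[sp.Mul(*c) for c in combinations(vs, deg)])

    A = sp.Matrix(n, n, lambda i, j: e(i, [v for k, v in enumerate(xs) if k != j]))
    return A, xs

A, xs = build_A(3)
# Substituting x2 = x1 makes two columns equal, so the determinant vanishes.
print(A.det().subs(xs[1], xs[0]).simplify())  # → 0
```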
The Vee is right: this is a Vandermonde determinant, but I think there is a simpler derivation. To stress the dimension of $A$ and its dependence on $x_1,\ldots,x_n$, denote the matrix by $A_n(x_1,\ldots,x_n)$. Note that for $i,j>1$, $$ a_{ij}-a_{i1}=(x_1-x_j)\sum_{\substack{k_1<\cdots<k_{i-2}\\ k_t\ne 1,j \text{ for all } t}} x_{k_1}x_{k_2}\cdots x_{k_{i-2}}. $$ Therefore, subtracting the first column from every other column gives $$ \begin{bmatrix} 1 & 0\\ \ast & A_{n-1}(x_2,\ldots,x_n)\operatorname{diag}(x_1-x_2,\ldots,x_1-x_n) \end{bmatrix} $$ and hence $\det A_n(x_1,\ldots,x_n)=(x_1-x_2)\cdots(x_1-x_n)\det A_{n-1}(x_2,\ldots,x_n)$. Proceeding recursively, we obtain $$ \det A_n(x_1,\ldots,x_n)=\prod_{i<j}(x_i-x_j), $$ so the determinant vanishes if and only if $x_i=x_j$ for some $i\ne j$.
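The closed form can be sanity-checked symbolically for small $n$; a sketch (assuming SymPy, with a hypothetical helper `build_A`) that expands both sides of $\det A_n=\prod_{i<j}(x_i-x_j)$:

```python
# Verify det A_n = prod_{i<j} (x_i - x_j) for n = 2, 3, 4 by expanding
# the difference of the two sides and checking that it is identically zero.
import sympy as sp
from itertools import combinations

def build_A(xs):
    n = len(xs)

    def e(deg, vs):
        # Elementary symmetric polynomial of degree `deg` in the variables `vs`.
        return sp.Add(*[sp.Mul(*c) for c in combinations(vs, deg)])

    # Column j uses all variables except xs[j] (0-based indices).
    return sp.Matrix(n, n, lambda i, j: e(i, xs[:j] + xs[j + 1:]))

for n in range(2, 5):
    xs = list(sp.symbols(f"x1:{n + 1}"))
    lhs = build_A(xs).det()
    rhs = sp.Mul(*[xs[i] - xs[j] for i in range(n) for j in range(i + 1, n)])
    assert sp.expand(lhs - rhs) == 0
print("verified for n = 2, 3, 4")
```

Note the sign convention: the product here is $\prod_{i<j}(x_i-x_j)$, which differs from the usual Vandermonde convention $\prod_{i<j}(x_j-x_i)$ by a factor $(-1)^{\binom{n}{2}}$.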