and for every $v_i$, $i=1,\ldots,n$, the sum of the coordinates of $v_i$ is $0$, then $v_1,\ldots,v_n$ are linearly dependent.
What I tried to do: This question was given after I learned about matrix similarity, eigenvectors and eigenvalues, so I tried to think in that direction. I couldn't really make sense of the phrase "the sum of the coordinates of $v_i$ is $0$", because I have always seen coordinates given with respect to a basis, so which basis am I supposed to use? (I would appreciate it if someone explained how to understand that sentence.)
Still, I proceeded in that direction and got somewhere: I chose $A$ to be the all-ones matrix ($a_{ij}=1 \space \forall i,j$) and let $T(v)=Av$ be a linear transformation. Here I got stuck, but my idea was to assume for contradiction that the vectors are linearly independent, show (using the fact that the coordinate sums are $0$) that $A$ is similar to the zero matrix, and then argue that two matrices of different rank cannot be similar, which would prove the statement. I have no idea how to continue, and my shaky understanding of the question makes me worry whether I'm even on the right track, so I came here to ask.
I appreciate all the help; thanks in advance to everyone.
On 2026-03-25, regarding: Prove or disprove: If $v_1,v_2,...,v_n$ are vectors in $\mathbb{R}^n$...

425 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail).

There are 3 solutions below.
There's a far simpler proof. Note that if the vectors were linearly independent, then, being $n$ vectors in the $n$-dimensional space $\mathbb R^n$, they would span $\mathbb R^n$.
I claim that $\bar 1$ is not in the span of these vectors.
To see why, note that $\bar1\cdot\bar1=n$, but $$\bar1\cdot (a_1\cdot v_1+\ldots +a_n\cdot v_n)=a_1\cdot(\bar1\cdot v_1)+\ldots+a_n\cdot(\bar1\cdot v_n)=0+\ldots+0=0$$ for any choice of coefficients $a_1,\ldots,a_n$.
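As a quick numerical sanity check of this orthogonality argument (a plain-Python sketch, not part of the original answer; the helper `zero_sum_vector` is a name I made up):

```python
import random

n = 5

def zero_sum_vector(n):
    """Random integer vector whose coordinates sum to 0 (last entry balances the rest)."""
    v = [random.randint(-10, 10) for _ in range(n - 1)]
    return v + [-sum(v)]

vectors = [zero_sum_vector(n) for _ in range(n)]
coeffs = [random.randint(-10, 10) for _ in range(n)]

# Linear combination a_1 v_1 + ... + a_n v_n, computed coordinate by coordinate.
combo = [sum(a * v[i] for a, v in zip(coeffs, vectors)) for i in range(n)]

ones = [1] * n
# 1·1 = n, yet 1·(a_1 v_1 + ... + a_n v_n) = 0 for any coefficients,
# so the all-ones vector can never lie in the span of the v_i.
assert sum(x * y for x, y in zip(ones, ones)) == n
assert sum(x * y for x, y in zip(ones, combo)) == 0
```

Running this with any choice of coefficients always yields a dot product of $0$, matching the identity displayed above.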
When talking about the Euclidean space $\mathbb{R}^n$, the default basis is the standard basis.
Let $A$ be the matrix whose column vectors are $v_1,\cdots,v_n$. Then if you consider the row vector $$ w=(\underbrace{1,\cdots,1}_{\textrm{$n$ terms}}) $$ it follows from matrix multiplication that $wA=0$, which implies that $A$ is not of full rank and hence the column vectors are linearly dependent.
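This rank argument can also be checked numerically. The sketch below (plain Python with exact rational arithmetic; the `rank` helper is an illustrative Gaussian-elimination routine, not from the answer) builds $A$ column by column from zero-sum vectors and verifies both $wA=0$ and the rank drop:

```python
from fractions import Fraction
import random

n = 4

# Columns v_1, ..., v_n, each with coordinate sum 0 (last entry balances the rest).
cols = []
for _ in range(n):
    v = [random.randint(-5, 5) for _ in range(n - 1)]
    cols.append(v + [-sum(v)])

# A[i][k] is the i-th coordinate of v_k, so the k-th column of A is v_k.
A = [[cols[k][i] for k in range(n)] for i in range(n)]

w = [1] * n
# Each entry of wA is the coordinate sum of one column, hence 0.
wA = [sum(w[i] * A[i][k] for i in range(n)) for k in range(n)]
assert wA == [0] * n

def rank(M):
    """Rank via exact Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# w is a nonzero vector in the left null space, so A cannot have full rank.
assert rank(A) < n
```

Since $w\neq 0$ lies in the left null space of $A$, the assertion `rank(A) < n` holds for every random draw.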
Notes.
If you write $$ v_k=(a_{1k},\cdots,a_{nk})^T $$ then the matrix $A$ is
$$ \begin{pmatrix} a_{11}&\cdots& a_{1n}\\ \vdots&\ddots&\vdots\\ a_{n1}&\cdots &a_{nn} \end{pmatrix} $$
Note that the $k$-th column of $A$ is the vector $v_k$.
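With this notation, spelling out the product $wA$ entrywise makes the key identity transparent: for each $k$,
$$ (wA)_k=\sum_{i=1}^{n}1\cdot a_{ik}=\sum_{i=1}^{n}a_{ik}=0, $$
since the coordinates of $v_k=(a_{1k},\cdots,a_{nk})^T$ sum to $0$ by assumption.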