I am trying to derive the relation $\mathbf A=\mathbf V^T\boldsymbol\Lambda\mathbf V=\sum_{i=1}^n\lambda_i\mathbf v_i\mathbf v_i^T$, where $\mathbf A$ is symmetric, $\mathbf V$ is orthogonal (with $\mathbf v_i$ the $i$-th column of $\mathbf V$), and $\boldsymbol\Lambda=\text{diag}(\lambda_1,\dots,\lambda_n)$ is the diagonal matrix with the eigenvalues of $\mathbf A$ on its main diagonal.
To do that, I am using the equivalence $\boldsymbol\Lambda=\sum_{i=1}^n\lambda_i\mathbf e_i\mathbf e_i^T$, where $\mathbf e_i$ is the $i$-th column (or row) of an identity matrix. Hence
$\mathbf A=\mathbf V^T\boldsymbol\Lambda\mathbf V=\mathbf V^T\left(\sum_{i=1}^n\lambda_i\mathbf e_i\mathbf e_i^T\right)\mathbf V=\sum_{i=1}^n\lambda_i\mathbf V^T\mathbf e_i\mathbf e_i^T\mathbf V=\sum_{i=1}^n\lambda_i\left(\mathbf V^T\mathbf e_i\right)\left(\mathbf V^T\mathbf e_i\right)^T=\sum_{i=1}^n\lambda_i\mathbf w_i\mathbf w_i^T$
... where $\mathbf w_i$ is the $i$-th row of $\mathbf V$. So my question is: am I doing something wrong, or does it in fact not matter whether we take a column or a row of an orthogonal matrix, since both are associated with the same eigenvalue $\lambda_i$?
You got the decomposition the wrong way around. For the statement in your first sentence, with the eigenvectors as columns of $V$, it should be $A = V\Lambda V^T$. Of course, you can also write a decomposition $A = V^T\Lambda V$, but then the eigenvectors are the rows rather than the columns of $V$, which is exactly what your derivation shows.
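A quick numerical sanity check illustrates the point. This is just a sketch using NumPy's `eigh`, whose convention is to return eigenvectors as the *columns* of the orthogonal matrix it produces:

```python
import numpy as np

# Build a random symmetric matrix.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T

# eigh returns eigenvalues lam and an orthogonal V whose COLUMNS
# are eigenvectors, so A = V @ diag(lam) @ V.T holds.
lam, V = np.linalg.eigh(A)
L = np.diag(lam)

assert np.allclose(A, V @ L @ V.T)       # correct orientation
assert not np.allclose(A, V.T @ L @ V)   # the other order gives a different matrix (generically)

# Rank-one sum over the COLUMNS of V reproduces A ...
A_cols = sum(lam[i] * np.outer(V[:, i], V[:, i]) for i in range(4))
assert np.allclose(A, A_cols)

# ... while the ROWS of V play that role for V.T @ L @ V instead,
# matching the sum derived in the question.
A2 = V.T @ L @ V
A_rows = sum(lam[i] * np.outer(V[i, :], V[i, :]) for i in range(4))
assert np.allclose(A2, A_rows)
```

So rows and columns are not interchangeable: each orientation of the sandwich product pairs with one of them, even though both sums use the same eigenvalues $\lambda_i$.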