We have vectors $v_1,\dots,v_n$ in $\mathbb{R}^m$ with $n \leq m$. Each $v_i$ is a "probability vector" in the sense that its entries are all non-negative and sum to one.
Suppose that $\{v_i\}$ is linearly dependent. Does it follow that some $v_i$ is a convex combination of the other vectors?
I think I can get some sort of geometric intuition going for this, but I have no idea how to settle whether it's true in general.
Consider in $\Bbb R^4$ the following four (column) vectors:
$$[v_1,v_2,v_3,v_4]=\begin{bmatrix}1&0&\frac12&0\\ 0&1&0&\frac12\\ 0&0 &\frac12&\frac12\\ 0&0&0&0\end{bmatrix}$$
They are linearly dependent, yet any three of them are linearly independent. Hence for any permutation $(i,j,k,l)$ of $(1,2,3,4)$ there is exactly one way to write $v_i$ as a linear combination $v_i=\alpha_{ij}v_j+\alpha_{ik}v_k+\alpha_{il}v_l$ of the other three. Your claim would require some $i$ with $\alpha_{ij}+\alpha_{ik}+\alpha_{il}=1$ and $\alpha_{ij},\alpha_{ik},\alpha_{il}\ge0$. This is easily disproved by inspection: \begin{align}v_1&=v_2+2v_3-2v_4\\ v_2&=v_1-2v_3+2v_4\\ v_3&=\frac12v_1-\frac12v_2+v_4\\ v_4&=-\frac12v_1+\frac12v_2+v_3\end{align} In each case the coefficients do sum to $1$ (as they must: summing the entries on both sides gives $1=\alpha_{ij}+\alpha_{ik}+\alpha_{il}$, since every column sums to $1$), but one coefficient is always negative, so no $v_i$ is a convex combination of the other three.
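If you want to double-check the counterexample numerically, here is a small sketch (using NumPy, with a least-squares solve to recover each vector's unique expansion in the other three):

```python
import numpy as np

# The four probability vectors from the answer, as columns of a 4x4 matrix.
V = np.array([
    [1, 0, 0.5, 0  ],
    [0, 1, 0,   0.5],
    [0, 0, 0.5, 0.5],
    [0, 0, 0,   0  ],
])

# Rank 3 < 4 confirms the columns are linearly dependent.
print("rank:", np.linalg.matrix_rank(V))

# For each i, express v_i in terms of the other three columns.
# Since those three are linearly independent and v_i lies in their span,
# the least-squares solution is the exact (unique) coefficient vector.
for i in range(4):
    others = np.delete(V, i, axis=1)
    coeffs, *_ = np.linalg.lstsq(others, V[:, i], rcond=None)
    print(f"v_{i+1}:", np.round(coeffs, 6), " sum =", round(coeffs.sum(), 6))
```

Each printed coefficient vector sums to $1$ but contains a negative entry, which is exactly why none of the $v_i$ is a convex combination of the others.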