Let $S =\{u_1, u_2,\ldots,u_n\}$ be a finite set of vectors. Prove that $S$ is linearly dependent if and only if $u_1 = 0$ or $u_{k+1} \in \text{span}(\{u_1, u_2,\ldots,u_k\})$ for some $k$ $(1 \leq k < n)$.
I solved the problem as follows: If $u_1=0$ then $S$ is linearly dependent, since $1\cdot u_1=1\cdot 0=0$ is a nontrivial linear relation.
If $u_{k+1} \in \text{span}(\{u_1, u_2,\ldots,u_k\})$ then there exist scalars $c_1,\ldots,c_k$ such that $$u_{k+1}=c_1u_1+\cdots+c_ku_k\implies u_{k+1}-c_1u_1-\cdots-c_ku_k=0,$$ and since the coefficient of $u_{k+1}$ is $1\neq 0$, this is a nontrivial relation, so $S$ is linearly dependent.
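As a quick numeric sanity check of this direction (my own made-up vectors, not part of the proof): if, say, $u_3=3u_1+2u_2$, then $3u_1+2u_2-u_3=0$ is a nontrivial relation.

```python
# Hypothetical vectors in R^2 with u3 = 3*u1 + 2*u2.
u1, u2 = (1, 0), (0, 1)
c1, c2 = 3, 2
u3 = (c1 * u1[0] + c2 * u2[0], c1 * u1[1] + c2 * u2[1])

# The relation c1*u1 + c2*u2 + (-1)*u3 = 0 is nontrivial
# (the coefficient of u3 is -1), witnessing dependence of {u1, u2, u3}.
coeffs = (c1, c2, -1)
vecs = (u1, u2, u3)
total = tuple(sum(c * v[j] for c, v in zip(coeffs, vecs)) for j in range(2))
assert total == (0, 0)
```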
Now, for the other direction we assume that $S$ is linearly dependent. If $u_1=0\in S$ then we are done. Otherwise, from the definition of linearly dependent sets we can find vectors $a_1,\ldots,a_{k+1}$ and scalars $c_1,\ldots,c_{k+1}$, not all zero, such that $c_1a_1+\cdots+c_{k+1}a_{k+1}=0$. Assuming $c_{k+1}\neq 0$ we have $$a_{k+1}=-c_{k+1}^{-1}c_1a_1-c_{k+1}^{-1}c_2a_2-\cdots-c_{k+1}^{-1}c_ka_k\implies a_{k+1}\in\text{span}(\{a_1, a_2,\ldots,a_k\}).$$ Let $u_1=a_1,u_2=a_2,\ldots,u_k=a_k,u_{k+1}=a_{k+1}$, and then we have $u_{k+1} \in \text{span}(\{u_1, u_2,\ldots,u_k\}).$
This completes the proof.
I think the question is ambiguous: if the claim (when $u_1\neq 0$) is that $u_{k+1} \in \text{span}(\{u_1, u_2,\ldots,u_k\})$, then it seems to implicitly assume that the coefficient of $u_1$ is always $0$ whenever $u_1\neq 0$. For it could also happen that $u_2$ is a linear combination of the vectors $$u_1,\ldots,u_n\ (\implies u_2\in \text{span}(\{u_1,u_3,\ldots,u_n\}))$$ while at the same time $u_2\notin \text{span}(\{u_1\}).$ In that case the claim in the question would be incorrect. So the question seems wrong if we examine it carefully. Am I correct?
You are really overthinking this. This answer is inspired by the comment of the user ancient mathematician.
Your first part, which shows that if $u_1=0$, or if $u_{k+1}\in\text{span}(\{u_1,u_2,\ldots,u_k\})$ for some $k\in [1,n)$, then $S$ is linearly dependent, is correct. Everything is also fine up to the point where you showed that if $S$ is linearly dependent and $u_1=0$ then we are done.
The problem arises with the way you tried to prove the statement:
If $S=\{u_1,u_2,\ldots,u_n\}$ is a linearly dependent set with $u_1\neq 0$, then there exists $k$ satisfying $1\leq k\lt n$ such that $u_{k+1}\in\text{span}(\{u_1,u_2,\ldots,u_k\}).$
We give a proof of the above statement below:
Let $S=\{u_1,u_2,\ldots,u_n\}$ with $u_1\neq 0.$ Since $S$ is linearly dependent, there exist scalars $a_1,\ldots,a_n$, not all zero, such that $$a_1u_1+a_2u_2+\cdots+a_nu_n=\sum_{i=1}^na_iu_i=0.$$ We choose the largest $k$ with $1\leq k\lt n$ such that $a_{k+1}\neq 0.$ (Such a $k$ exists: if $a_2=\cdots=a_n=0$, then $a_1\neq 0$ and $a_1u_1=0$, forcing $u_1=0$, contrary to assumption.) Since $k$ is the largest such index, $a_{k+2}=a_{k+3}=\cdots=a_n=0$, and the relation reduces to $$a_1u_1+\cdots+a_ku_k+a_{k+1}u_{k+1}=0\implies a_{k+1}u_{k+1}=-(a_1u_1+\cdots+a_ku_k)\implies u_{k+1}=-a_{k+1}^{-1}a_1u_1-\cdots-a_{k+1}^{-1}a_ku_k\implies u_{k+1}\in\text{span}(\{u_1,u_2,\ldots,u_k\}).$$
That is all there is to it! With this modification, the proof in the original post is fixed.
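For concreteness, here is a minimal Python sketch of the argument (a hypothetical dependent set of my own choosing, not part of the proof): given a nontrivial relation, we pick the largest index with a nonzero coefficient and check that the corresponding vector lies in the span of its predecessors.

```python
from fractions import Fraction as F

# Hypothetical dependent set in R^3: u3 = 2*u1 - u2, so
# 2*u1 - 1*u2 - 1*u3 + 0*u4 = 0 is a nontrivial relation.
u = [
    [F(1), F(0), F(0)],   # u1
    [F(0), F(1), F(0)],   # u2
    [F(2), F(-1), F(0)],  # u3 = 2*u1 - u2
    [F(0), F(0), F(1)],   # u4
]
a = [F(2), F(-1), F(-1), F(0)]  # coefficients of the relation

# The relation really sums to the zero vector.
assert all(sum(a[i] * u[i][j] for i in range(4)) == 0 for j in range(3))

# Largest index (0-based) with a nonzero coefficient; here it is 2,
# i.e. k+1 = 3 in the 1-based notation of the proof.
kp1 = max(i for i, c in enumerate(a) if c != 0)

# u_{k+1} = -a_{k+1}^{-1} * (a_1*u_1 + ... + a_k*u_k)
combo = [-sum(a[i] * u[i][j] for i in range(kp1)) / a[kp1] for j in range(3)]
assert combo == u[kp1]  # u3 lies in span({u1, u2})
```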