Let $V$ be a vector space over a field $F$ and let $A\subseteq V$ be a nonempty subset. It is to be proven that $A$ is an affine subset of $V$ if and only if $\lambda v+(1-\lambda)w\in A$ for all $v,w\in A$ and all $\lambda \in F$.
I tried to prove it like this:
$(\Rightarrow)$ If $A$ is an affine subset of $V$, then by definition there exist a subspace $U$ and a vector $v\in V$ such that $A=v+U$. Hence for any $a,a'\in A$ there exist $u,u'\in U$ with $a=v+u$ and $a'=v+u'$. For any $\lambda \in F$ it follows that $\lambda a+(1-\lambda)a'=\lambda(v+u)+(1-\lambda)(v+u')=v+\lambda u+(1-\lambda)u'\in v+U$, since $\lambda u+(1-\lambda)u'\in U$ ($U$ being a subspace). This proves the claim in this direction.
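As a numeric sanity check of this direction (illustration only, not part of the proof), one can take a concrete affine subset of $\mathbb R^2$, say $A=v+U$ with $v=(3,0)$ and $U=\operatorname{span}\{(1,2)\}$ (both chosen arbitrarily here), and verify that affine combinations of points of $A$ stay in $A$:

```python
import numpy as np

# Example data (assumptions, not from the problem): A = v + U in R^2,
# where v = (3, 0) and U = span{(1, 2)}.
v = np.array([3.0, 0.0])
d = np.array([1.0, 2.0])  # direction vector spanning U

def in_A(x, tol=1e-9):
    # x lies in A = v + U  iff  x - v is a scalar multiple of d;
    # test this via the 2x2 determinant (cross product) being ~0.
    w = x - v
    return abs(w[0] * d[1] - w[1] * d[0]) < tol

a  = v + 2.5 * d   # two arbitrary points of A
a2 = v - 1.0 * d

# lam*a + (1-lam)*a2 should remain in A for every scalar lam
for lam in [-2.0, 0.0, 0.5, 1.0, 3.7]:
    assert in_A(lam * a + (1 - lam) * a2)
print("all sampled affine combinations stayed in A")
```

Of course this only checks finitely many scalars; the algebra above is what proves the general statement.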
For the other direction: $(\Leftarrow)$ It follows (by taking $\lambda =\frac 12$) that for any $u,w\in A$, $\frac{u+w}2\in A$.
Also $u+\lambda (w-u)\in A$, and from here, fixing $u$ and $w$, one gets the subspace $W=\operatorname{span}(w-u)$, but I don't know how to proceed.
Any help is much appreciated. Thanks.
Let's fix $v \in A$. Set $U:=\{w-v \mid w \in A\}$. I claim that $U$ is a subspace of $V$.
If we let $u \in U$ with $u=w-v$ for some $w\in A$, and $\lambda \in F$, then $\lambda u =\lambda w - \lambda v=\lambda w +(1-\lambda )v-v$. By hypothesis $\lambda w+(1-\lambda)v\in A$, so $\lambda u \in U$, and $U$ is closed under scalar multiplication.
Let $u,u' \in U$ with $u=w-v$, $u'=w'-v$; then $\frac{u+u'}{2}=\frac{w+w'}{2}-v$. As you noted, taking $\lambda =\frac{1}{2}$ yields $\frac{w+w'}{2} \in A$ because $w,w' \in A$ (here we assume $2$ is invertible in $F$, i.e. $\operatorname{char} F \neq 2$). So $\frac{u+u'}{2}=\frac{w+w'}{2}-v \in U$. Since we already know $U$ is closed under scalar multiplication, we also get $u+u'=2\cdot\frac{u+u'}{2} \in U$.
Also $U$ is non-empty, since $0=v-v\in U$.
Thus $U$ is a subspace of $V$. By definition of $U$, we have $A=v+U$, so $A$ is an affine subset of $V$.
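To illustrate the construction numerically (example data only, not part of the proof), take the same kind of affine set $A=p+\operatorname{span}\{(1,2)\}$ in $\mathbb R^2$, fix some $v\in A$, let $U:=\{w-v \mid w\in A\}$, and check the two closure steps exactly as in the argument above:

```python
import numpy as np

# Example data (assumptions): A = p + span{(1, 2)} with p = (3, 0).
p = np.array([3.0, 0.0])
d = np.array([1.0, 2.0])

def in_A(x, tol=1e-9):
    w = x - p
    return abs(w[0] * d[1] - w[1] * d[0]) < tol  # determinant test

v = p + 4.0 * d            # a fixed element of A
def in_U(u):               # u in U := A - v  iff  u + v in A
    return in_A(u + v)

w, w2 = p + 1.0 * d, p - 2.0 * d   # two elements of A
u, u2 = w - v, w2 - v              # corresponding elements of U

# closure under scalars: lam*u = (lam*w + (1-lam)*v) - v
for lam in [-1.0, 0.0, 2.5]:
    assert in_U(lam * u)

# closure under addition via the midpoint trick: u + u' = 2 * (u+u')/2
assert in_U(0.5 * (u + u2))
assert in_U(u + u2)
print("U behaves like a subspace on these samples")
```

The two assertion groups mirror the proof: scalar closure comes from the affine-combination hypothesis applied to $w$ and $v$, and additive closure from the $\lambda=\frac12$ midpoint step followed by doubling.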