Let $g(w)$ be a differentiable convex function. The Frank-Wolfe algorithm over a compact convex set $C \subseteq \mathbb{R}^n$ is defined so as to find a minimizer of $g$ over $C$ (for a convex $g$, any local minimum is global):
$$ s_{t}=\arg\min_{s \in C} \langle s, \nabla g(w_t) \rangle \tag{1} $$ $$ w_{t+1} = (1-\eta_t)w_{t}+\eta_ts_t \tag{2} $$
where $w_0 \in C$ is the starting point, $\eta_t \in [0,1]$ is the step size, and $t=0,1,\cdots,T$.
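To make the two steps concrete, here is a minimal sketch of the iteration on the probability simplex, where the linear subproblem $(1)$ has a closed-form solution: the minimizing vertex is the coordinate vector $e_j$ with the smallest gradient entry. The objective $g(w) = \frac12\|w-c\|_2^2$ and the target $c$ are illustrative choices, not part of the original text.

```python
def frank_wolfe_simplex(c, T=1000):
    """Frank-Wolfe for g(w) = 0.5*||w - c||^2 over the probability simplex."""
    n = len(c)
    w = [1.0 / n] * n                              # w_0: a point in C
    for t in range(T):
        grad = [w[i] - c[i] for i in range(n)]     # grad g(w_t) = w_t - c
        # Step (1): argmin_{s in simplex} <s, grad> is the vertex e_j
        # whose gradient coordinate is smallest.
        j = min(range(n), key=lambda i: grad[i])
        s = [1.0 if i == j else 0.0 for i in range(n)]
        # Step (2): convex combination, so w_{t+1} stays in the simplex.
        eta = 2.0 / (t + 2)                        # standard step-size choice
        w = [(1 - eta) * w[i] + eta * s[i] for i in range(n)]
    return w
```

Because each update is a convex combination of points in $C$, the iterates never leave the feasible set, which is the main appeal of the method (no projection step is needed).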
Step $(1)$ is already posed as an optimization problem (a linear minimization over $C$).
How can we express $(2)$ as the solution of an optimization problem?
Try the following optimization: $$ w_{t+1}=\arg\min_{w \in C} -\beta_t\langle w,s_t -w_t \rangle + \frac{\gamma_t}{2} \|w - w_t\|_2^2 \tag{3} $$ where $\beta_t$ and $\gamma_t$ are positive with $\beta_t/\gamma_t \in [0,1]$.
Since the objective, i.e., $f(w) = -\beta_t\langle w,s_t -w_t \rangle + \frac{\gamma_t}{2} \|w - w_t\|_2^2$, is convex, the necessary and sufficient condition for $w_{t+1}$ to be the minimizer of $(3)$ is the following variational inequality:
$$ \langle -\beta_t (s_t -w_t) + \gamma_t (w_{t+1}-w_t), w - w_{t+1} \rangle \geq 0 \,\,\,\, \forall w \in C $$
$$ \langle w_{t+1} - ( \frac{\beta_t}{\gamma_t}s_t + (1-\frac{\beta_t}{\gamma_t})w_t), w - w_{t+1} \rangle \geq 0 \,\,\,\, \forall w \in C $$
The point $\frac{\beta_t}{\gamma_t}s_t + (1-\frac{\beta_t}{\gamma_t})w_t$ lies in $C$ (it is a convex combination of $s_t, w_t \in C$ since $\beta_t/\gamma_t \in [0,1]$) and makes the left-hand side vanish for every $w$, so it satisfies the condition. Hence $w_{t+1} = (1-\frac{\beta_t}{\gamma_t})w_t + \frac{\beta_t}{\gamma_t}s_t$, which is exactly the update $(2)$ with $\eta_t = \beta_t/\gamma_t$.