In E. Oeljeklaus & R. Remmert's *Linear Algebra* they prove the following little lemma:
Lemma: Let $V$ be a finite-dimensional vector space over a field $F$ and let $U \subseteq V$ be a subspace. Then $$U = \bigcap_{\lambda \in U^0} \ker(\lambda),$$ where $U^0 := \lbrace \lambda \in V^* : \lambda_{|U} = 0 \rbrace$ is the annihilator (the 'vanishing space') of $U$, and $V^*$ denotes the dual space.
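For concreteness, a minimal instance (my own illustration, not from the book): take $F = \mathbb{R}$, $V = \mathbb{R}^3$ and $U$ the $xy$-plane, i.e.
$$ U = \lbrace (x, y, 0) : x, y \in \mathbb{R} \rbrace, \qquad U^0 = \mathrm{span}(\lambda) \ \text{ with } \ \lambda(x, y, z) = z, $$
and indeed $\bigcap_{\mu \in U^0} \ker(\mu) = \ker(\lambda) = U$.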
The proof of the lemma is rather straightforward, except that they use contraposition to show the $\supseteq$ inclusion.
As an exercise they suggest proving the lemma by showing that there exists a finite family $(\lambda_j)_{j=1}^n \in (V^*)^n$ of linear forms on $V$ such that $$ U = \bigcap_{j=1}^n \ker(\lambda_j), $$ and here I struggle. It makes sense that the lemma can be shown for such a finite family, because $$ U^0 \subseteq V^* \implies \dim(U^0) \leq \dim(V^*) = \dim(V) < \infty. $$ Hence $U^0$ is finite-dimensional, and we can choose a basis $\lambda_1, \lambda_2, \dots, \lambda_k$ of $U^0$ for some $k \in \mathbb{N}$, with each $\lambda_i$ a linear form on $V$ satisfying $\lambda_i(u) = 0$ for all $u \in U$ and $i = 1, \dots, k$. This also readily takes care of the $\subseteq$ inclusion.
However, I am not quite sure how to approach $\supseteq$. I feel like I want to bring the basis into it, but I don't see how to achieve this.
If $U = V$, take $\lambda \equiv 0$ on $V$; so assume w.l.o.g. that $U \subsetneq V$. Let $B = \{b_1, \dots, b_m\}$ be a basis of $U$ and extend it by $C = \{c_1, \dots, c_n\}$ to a basis of $V$. Then
$$ U = \mathrm{span}(b_1, \dots, b_m), \qquad U_C := \mathrm{span}(c_1, \dots, c_n), $$
and we can write
$$ V = U \oplus U_C $$
Now let $C^{*}$ be the dual set of $C$, i.e. the set of functionals $\lambda_j$, $j = 1, \dots, n$, defined as follows:
$$\lambda_j : V \rightarrow F, \quad \lambda_j (\sum_{i=1}^{m} x_i \cdot b_i + \sum_{k=1}^{n} y_k \cdot c_k) = y_j $$
We see that $v \in V$ lies in $U$ if and only if $y_j = 0$ for all $j = 1, \dots, n$. That is the same as saying
$$ U = \bigcap_{j=1}^{n} \mathrm{ker}(\lambda_j)$$
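The construction can be sanity-checked numerically. A small sketch (my own illustration with a hypothetical basis choice, assuming $F = \mathbb{R}$ and $V = \mathbb{R}^3$): the dual basis functionals are the rows of the inverse of the change-of-basis matrix, and the rows corresponding to $C$ are exactly the $\lambda_j$.

```python
import numpy as np

# Hypothetical instance (not from the book): V = R^3,
# U = span{b1, b2}, extended by c1 to a basis of V.
b1, b2 = np.array([1., 1., 0.]), np.array([0., 1., 1.])
c1 = np.array([0., 0., 1.])

P = np.column_stack([b1, b2, c1])  # columns are the basis B ∪ C of V
D = np.linalg.inv(P)               # rows of D are the dual basis functionals

lam = D[2]  # the functional dual to c1 -- the single lambda_j here (n = 1)

# lam vanishes on the basis of U and takes value 1 on c1,
# so ker(lam) = span{b1, b2} = U.
assert np.allclose([lam @ b1, lam @ b2, lam @ c1], [0., 0., 1.])
```

In this picture $\lambda_j(v)$ simply reads off the $c_j$-coordinate of $v$ in the extended basis, which is exactly the definition of $\lambda_j$ above.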