In my textbook (Szekeres's A Course in Modern Mathematical Physics) the author writes
Perhaps the best way to visualize a linear functional is as a set of parallel planes of vectors determined by $\omega(v) = const$.
I am struggling to understand why this would be said. I think perhaps it has something to do with the following theorem which I now quote from my notes, but I am not sure:
[Kernel of linear functional has codimension 1, and subspace of codimension 1 is kernel of a unique linear functional] If $\varphi: V \to \mathbb{K}$ is any nontrivial linear functional then its kernel $\ker \varphi$ has codimension 1. Conversely, any subspace $W \subset V$ of codimension 1 determines a linear functional $\varphi$ on $V$, unique up to a scalar factor, such that $W = \ker \varphi$.
I have also provided the (long) proof I wrote up of this theorem at the end, as I think it motivates the figure which the author gives:
Edit: On thinking a little more, I have come up with the following. Does it make sense?
By the proof below, since any nontrivial linear functional $\omega$ has a kernel of codimension 1, there is a corresponding decomposition of any $v \in V$ as $v = au + w$ with $a \in \mathbb{K}$ and $w \in \ker \omega$. Let $\mathcal{W}$ be a basis of $\ker \omega$. Then $\mathcal{W} \cup \{u\}$ is a basis of $V$. We can think of decomposing $V$ into "planes," where each plane contains all those vectors $v$ which share the same $a$ in their decomposition. That is, if $v = au + w$ and $v' = au + w'$, then $\omega(v) = \omega(v') = a\omega(u)$.
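As a quick numeric sanity check of this picture (my own illustrative example in $\mathbb{R}^3$, not from the text), take $\omega$ to be projection onto the first coordinate, so $\ker \omega = \operatorname{span}\{e_2, e_3\}$ and $u = e_1$ completes a basis:

```python
import numpy as np

# Illustrative example in R^3: omega(v) = v[0] (projection onto the first coordinate).
# Then ker(omega) = span{e2, e3} has codimension 1, and u = e1 completes a basis.
def omega(v):
    return v[0]

u = np.array([1.0, 0.0, 0.0])
w1 = np.array([0.0, 2.0, -1.0])   # lies in ker(omega)
w2 = np.array([0.0, -5.0, 3.0])   # lies in ker(omega)

a = 4.0
v = a * u + w1    # two vectors with the same coefficient a ...
v2 = a * u + w2

# ... lie in the same "plane": omega takes the same value on both.
print(omega(v), omega(v2), a * omega(u))  # all equal 4.0
```

Varying $w$ over $\ker \omega$ while holding $a$ fixed sweeps out one of the parallel planes; varying $a$ moves from plane to plane.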
Proof:
For the forward direction, we have $$\operatorname{codim}(\ker \varphi) := \dim(V/\ker \varphi) \stackrel{(1)}{=} \dim(\operatorname{im}\, \varphi),$$ where (1) follows from the isomorphism between the image of a homomorphism and the quotient of the domain by the kernel of that homomorphism (the first isomorphism theorem). Now if $\varphi$ is nontrivial then there exists some $v \in V$ such that $\varphi(v) \neq 0$, so that $\dim(\operatorname{im}\, \varphi) > 0$. Since $\operatorname{im}\, \varphi \subset \mathbb{K}$ and since $\dim(\mathbb{K}) = 1$, it follows that $\dim(\operatorname{im}\, \varphi) = 1$.
For the converse, let $u$ be any vector not in $W$; such a vector exists, for otherwise $W = V$, and $V/V$ has dimension 0 since it contains only the trivial element (namely, $V$); that is, $W = V$ would not have codimension 1 as supposed. Now $u + W \neq W$, so the subset of $V/W$ containing only $u+W$ is linearly independent. Since this subset has length 1 and $\dim(V/W) = 1$ by hypothesis, $u+W$ forms a basis for $V/W$. In particular, for every $v \in V$ there exists $a \in \mathbb{K}$ such that $v + W = a(u+W) = au + W$. Thus for every $v \in V$ there exist $a, w_1, w_2$ such that $v + w_1 = au + w_2$, so for every $v \in V$ there exist $a, w$ such that $v = au + w$. This decomposition is unique, for if $v = a'u+w'$ too then $0 = v - v = (a-a')u + (w-w')$, which forces $a=a'$ and $w=w'$ (since $u \notin W$).

Now define a linear functional by $\varphi(w) = 0$ for $w \in W$ and $\varphi(u) = c \neq 0$ for some $c \in \mathbb{K}$, extended linearly. Then $\ker \varphi = W$: if $w \in W$ then $w = 0u + w$ is its unique decomposition and $\varphi(w) = 0$, so $w \in \ker \varphi$; conversely, if $v = au + w \in \ker \varphi$ then $0 = \varphi(v) = \varphi(au + w) = a\varphi(u)$, which forces $a = 0$ since $\varphi(u) \neq 0$, and thus $v = w \in W$.

Finally, for any $\varphi'$ with $\ker \varphi' = W$ we have, for any $v = au + w \in V$, $$\varphi'(v) = \varphi'(au+w) = a\varphi'(u) = a \frac{c}{c}\varphi'(u) = ac\left( \frac{\varphi'(u)}{c}\right) = \left( \frac{\varphi'(u)}{c}\right) \varphi(v).$$ Since this holds for all $v$, we have $\varphi' = \left( \frac{\varphi'(u)}{c}\right) \varphi$.
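The uniqueness-up-to-scalar conclusion can be spot-checked numerically (a sketch of my own in $\mathbb{R}^3$, not from the notes): take $W = \{v : v_1 = 0\}$, $u = e_1$, and two functionals with kernel $W$, and verify that one is the predicted scalar multiple of the other.

```python
import numpy as np

# W = {v in R^3 : v[0] = 0} has codimension 1; u = e1 lies outside W.
c = 2.0
u = np.array([1.0, 0.0, 0.0])

def phi(v):        # vanishes on W, phi(u) = c, extended linearly
    return c * v[0]

def phi_prime(v):  # another functional with the same kernel W
    return -7.0 * v[0]

rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(3)
    # The proof concludes phi' = (phi'(u)/c) * phi; check it on random vectors.
    assert np.isclose(phi_prime(v), (phi_prime(u) / c) * phi(v))
```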

Thinking about this more geometrically might help. In Euclidean space (really, in any Hilbert space, by the Riesz representation theorem), a linear functional $\omega$ is given by the dot product with some fixed vector $a$: $\omega(v) = a\cdot v$. The level surfaces of $\omega : \mathbb{R}^n \to \mathbb{R}$ are precisely (hyper)planes perpendicular to the vector $a$. In particular, the kernel of $\omega$ is the set of vectors perpendicular to $a$.
To see intuitively why the level surfaces are parallel planes: if $a\cdot v = c$, then $a\cdot\left(v-c\frac{a}{|a|^2}\right) = 0$, i.e. $v-c\frac{a}{|a|^2}$ lies in a plane perpendicular to $a$. We can view this vector as living in the affine plane $|c|/|a|$ units away from the origin, since $v-c\frac{a}{|a|^2}$ is the vector connecting the tip of $c\frac{a}{|a|^2}$ (which of course has length $|c|/|a|$) to the tip of $v$ when the tails of the latter vectors are placed at the origin.
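Here is a concrete check of that computation (my own numbers, chosen so $|a| = 3$): a point on the level surface $a \cdot v = c$ is obtained by adding anything perpendicular to $a$ to the foot point $c\,a/|a|^2$, and that plane sits at distance $c/|a|$ from the origin.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])        # |a| = 3; the functional is omega(v) = a . v
c = 6.0
n2 = a @ a                            # |a|^2 = 9

# Any point of the level surface: the foot point c*a/|a|^2 plus something perp to a.
perp = np.array([2.0, -1.0, 0.0])     # a . perp = 0
v = c * a / n2 + perp
assert np.isclose(a @ v, c)           # v lies on the level surface omega(v) = c

# v - c*a/|a|^2 is perpendicular to a, so v lies in the affine plane
# through c*a/|a|^2 orthogonal to a ...
assert np.isclose(a @ (v - c * a / n2), 0.0)

# ... and that plane sits c/|a| units from the origin (here 6/3 = 2).
assert np.isclose(np.linalg.norm(c * a / n2), c / np.linalg.norm(a))
```

Sliding $c$ through $\mathbb{R}$ translates this plane along $a$, giving exactly the stack of parallel planes Szekeres describes.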