Let $X$ be a vector space and, for some $a \in X$, consider the function $f_a : X \rightarrow \mathbb{R}$ defined by $f_a (x) = a \cdot x$.
(i) Prove that $f_a (x) = a \cdot x$ is a linear function. This part is super easy, as I just show that $f_a(x+y) = f_a(x) + f_a(y)$ and $f_a(rx) = rf_a(x)$.
Parts ii-iv are the parts I do not understand. Is there somewhere I can go to learn about this stuff (I guess maybe I'm just intimidated by working with functions as elements of a set)?
(ii) Let $X^{*} = \{f : X \rightarrow \mathbb{R} : f$ is linear$\}$ be the set of all linear functions from $X$ into $\mathbb{R}$. Prove that for all $f \in X^*$ there exists an $a \in X$ such that $f(x) = a \cdot x$.
(iii) Define function addition by $(f + g)(x) = f(x)+ g(x)$ and function scaling by $(\alpha f)(x) = \alpha f(x)$ over the set $X^{*}$. Prove or disprove the following statement: $X^{*}$ with the defined operations is a linear vector space.
(iv) What is the dimension of $X^{*}$? (I see that this would just be the number of elements in a basis of $X^{*}$, I'm just not sure how to construct such a basis.)
Let the dimension of $X$ be $n$. To clarify your doubt, I will answer the question assuming that $X=\mathbb{R}^{n}$. Let $b_{1}=(1,0,\ldots,0)^{T},b_{2}=(0,1,0,\ldots,0)^{T},\ldots,b_{n}=(0,\ldots,0,1)^{T}$ be the standard basis for $X$. Now, consider the functions $f_{1},\ldots,f_{n}:X\rightarrow \mathbb{R}$ defined as follows: for any $x=(x_{1},\ldots,x_{n})^{T}\in X$,
\begin{eqnarray} f_{1}(x)=x_{1},~~~~~~~\ldots,~~~~~~~f_{n}(x)=x_{n}. \end{eqnarray}
Clearly, $f_{i}\in X^{*}$, $i=1,\ldots,n$.
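If a concrete check helps, here is a quick numerical sketch in plain Python (with $n=4$; the names `b` and `f` are just illustrative) of the coordinate functionals, the dual-basis property $f_{i}(b_{j})=\delta_{ij}$, and their linearity:

```python
n = 4
# Standard basis b_1, ..., b_n of R^n, represented as lists.
b = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

# f_i(x) = x_i: the i-th coordinate functional (0-indexed here).
def f(i, x):
    return x[i]

# Dual-basis property: f_i(b_j) equals 1 if i == j, and 0 otherwise.
for i in range(n):
    for j in range(n):
        assert f(i, b[j]) == (1.0 if i == j else 0.0)

# Each f_i is linear: additivity and homogeneity.
x = [1.0, 2.0, 3.0, 4.0]
y = [5.0, 6.0, 7.0, 8.0]
r = 2.5
for i in range(n):
    assert f(i, [u + v for u, v in zip(x, y)]) == f(i, x) + f(i, y)
    assert f(i, [r * u for u in x]) == r * f(i, x)
```

Note that 0-based indexing is used in the code, whereas the exposition uses 1-based indices.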
The claim is that $f_{1},\ldots,f_{n}$ form a basis for $X^{*}$. To prove this, we need to prove two things: span and linear independence.
(a) Span: Consider an arbitrary $f\in X^{*}$. Suppose that
\begin{align} f(b_{1})&=y_{1},\\ &\ \ \vdots\\ f(b_{n})&=y_{n}. \end{align}
Also, suppose that $A$ denotes the $n\times n$ matrix whose rows are $b_{1},\ldots,b_{n}$ (for the standard basis, $A$ is just the identity matrix), and let $y=(y_{1},\ldots,y_{n})^{T}$. Then, the system
\begin{eqnarray} Ax=y \end{eqnarray} has a unique solution $x^{*}=y$ (since $A$ has full rank, by virtue of the fact that $b_{1},\ldots,b_{n}$ are linearly independent). Now, writing any $x\in X$ as $x=x_{1}b_{1}+\ldots+x_{n}b_{n}$ and using the linearity of $f$, we get $f(x)=x_{1}f(b_{1})+\ldots+x_{n}f(b_{n})=x_{1}y_{1}+\ldots+x_{n}y_{n}$. Hence, we may express $f$ as
\begin{eqnarray} f={y_{1}}f_{1}+\ldots+{y_{n}}f_{n}, \end{eqnarray} or \begin{align} f(x)&={y_{1}}f_{1}(x)+\ldots+{y_{n}}f_{n}(x)\\&={y_{1}}x_{1}+\ldots+{y_{n}}x_{n}\\&=y^{T}x. \end{align}
Thus, $f_{1},\ldots,f_{n}$ span $X^{*}$. Moreover, the above exposition shows that there exists an $a\in X$ such that $f(x)=a^{T}x$ for all $x\in X$, namely $a=y$; this settles part (ii).
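This argument is constructive: to find the representing vector $a$ for any linear functional, just evaluate it on the basis vectors. A small Python sketch (the particular functional `f` below is a made-up example, treated as a black box):

```python
import random

n = 4
# Standard basis of R^4 as lists.
b = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

# A "black box" linear functional on R^4 (hypothetical example).
def f(x):
    return 2.0 * x[0] - 1.0 * x[1] + 0.5 * x[2] + 3.0 * x[3]

# Recover the representing vector a componentwise: a_i = f(b_i).
a = [f(b_i) for b_i in b]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Check that f(x) = a . x on a few random vectors.
random.seed(0)
for _ in range(5):
    x = [random.uniform(-10.0, 10.0) for _ in range(n)]
    assert abs(f(x) - dot(a, x)) < 1e-9
```

Here `a` comes out as exactly the coefficient vector of `f`, mirroring the identity $a=y$ above.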
(b) Linear independence: Suppose that for some constants $c_{1},\ldots,c_{n}$, we have $c_{1}f_{1}+\ldots+c_{n}f_{n}=0$, i.e., for all $x=(x_{1},\ldots,x_{n})^{T}\in X$, we have
\begin{eqnarray} c_{1}f_{1}(x)+\ldots+c_{n}f_{n}(x)=0. \end{eqnarray}
In particular, taking $x=b_{1}$ gives $c_{1}f_{1}(b_{1})+\ldots+c_{n}f_{n}(b_{1})=0$; since $f_{1}(b_{1})=1$ and $f_{i}(b_{1})=0$ for $i\neq 1$, we conclude $c_{1}=0$. In a similar way, evaluating at $b_{2},\ldots,b_{n}$ shows that $c_{2},\ldots,c_{n}$ are all zero as well. This establishes that $c_{1}f_{1}+\ldots+c_{n}f_{n}=0$ iff $c_{1}=c_{2}=\ldots=c_{n}=0$, which in turn establishes linear independence.
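The mechanism behind this step can also be seen numerically: plugging the basis vector $b_{j}$ into a combination $c_{1}f_{1}+\ldots+c_{n}f_{n}$ isolates exactly $c_{j}$. A quick sketch (the coefficients `c` are arbitrary illustrative values):

```python
n = 4
# Standard basis of R^4 as lists.
b = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

# Some coefficients c_1, ..., c_n (arbitrary illustrative values).
c = [3.0, -1.0, 0.0, 7.5]

# g = c_1 f_1 + ... + c_n f_n, where f_i(x) = x_i.
def g(x):
    return sum(ci * xi for ci, xi in zip(c, x))

# Evaluating g at b_j picks out exactly c_j, since f_i(b_j) = 0 for i != j.
for j in range(n):
    assert g(b[j]) == c[j]
```

So $g$ can be the zero functional only when every coefficient is zero, which is exactly linear independence.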
Thus, the dimension of $X^{*}$ is the same as the dimension of $X$, namely $n$.
The same logic applies to any finite-dimensional $X$. Just take care to replace $1$ with the multiplicative identity and $0$ with the additive identity of the field underlying $X$.