Suppose that we want to choose a function $f: \mathbb{R} \to \mathbb{R}, x \mapsto f(x)$ from the class of functions $\mathcal{F} = \{\, g \mid g(x) = wx,\ w \in \mathbb{R} \,\}$ such that it minimizes the functional \begin{equation} J(f) = \dfrac{1}{n} \sum\limits_{i = 1}^n |f(x_i) - y_i| \end{equation} where $x_i, y_i \in \mathbb{R}$ are given.
Is this an example of the "calculus of variations"?
I was told in the past that any optimization that involves minimising over a function space is "calculus of variations". But looking at some examples, the usual calculus-of-variations problems seem to involve integrals, whereas this problem does not.
Can someone confirm or deny?
Of course your ${\cal F}$ is a function space, but these functions are terribly restricted. In fact we have a single free real variable $w$, which has to be chosen such that the quantity $$q(w):={1\over n}\sum_{i=1}^n \bigl|w\, x_i-y_i\bigr|$$ for given sample points $(x_i,y_i)$ $\,(1\leq i\leq n)$ is minimized. This is a standard one-dimensional problem, and it has a minimizer $w_*\in{\mathbb R}$. No "free" candidate functions are involved here.
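To see how little function-space machinery is needed, here is a minimal sketch (the data points are made up for illustration): $q$ is convex and piecewise linear in $w$, so a minimizer occurs at one of the "kink" points $w = y_i/x_i$ (for $x_i \neq 0$), and we can simply scan those finitely many candidates.

```python
def q(w, xs, ys):
    """The objective q(w) = (1/n) * sum_i |w*x_i - y_i|."""
    return sum(abs(w * x - y) for x, y in zip(xs, ys)) / len(xs)

def minimize_q(xs, ys):
    """q is convex piecewise linear in w, so some minimizer sits at a
    kink w = y_i / x_i; just evaluate q at each kink and take the best."""
    candidates = [y / x for x, y in zip(xs, ys) if x != 0]
    candidates.append(0.0)  # fallback in case every x_i is zero
    return min(candidates, key=lambda w: q(w, xs, ys))

# Hypothetical sample points, not from the question:
xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.0]
w_star = minimize_q(xs, ys)
```

This is ordinary finite-dimensional optimization: the "search space" is $\mathbb{R}$, not an infinite-dimensional space of functions.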
For a real variational problem the feasible set of candidate functions $g$ would be an infinite-dimensional set of "arbitrary" functions defined on an interval $[a,b]$, satisfying some conditions like $g(a)=g(b)=0$, or similar. Only then would we obtain, e.g., a differential equation that a potential optimal function $g$ would have to satisfy.
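For contrast, a typical problem of the calculus of variations (this is the standard textbook form, not taken from the question) looks like
$$\text{minimize}\quad J(g)=\int_a^b F\bigl(x,\,g(x),\,g'(x)\bigr)\,dx \quad\text{over all sufficiently smooth } g \text{ with } g(a)=g(b)=0,$$
and any smooth minimizer must satisfy the Euler–Lagrange equation
$$\frac{\partial F}{\partial g}-\frac{d}{dx}\frac{\partial F}{\partial g'}=0,$$
a differential equation for the unknown function $g$, rather than an equation for a single real parameter.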