I have the following quadratic objective function (almost a variance), where $f_i$, $i=1,\ldots,n$, are $n$ functions and $\overline f$ is the mean of the $f_i$ over $i=1,\ldots,n$:
$$\min_x \sum_{i=1}^n\left(f_i(x_i)- \overline f(x)\right)^2$$
At first I thought it was not a convex objective. So I calculated the Hessian matrices for some examples, and the results show that the determinants of the matrices are zero; however, the eigenvalues are nonnegative (i.e. $\ge 0$), so the Hessians are positive semidefinite. Now I think it may be a convex function, but I don't know how to prove the convexity.
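For concreteness, this is a minimal sketch of the kind of numerical check I mean (the scalar $x_i$, the linear choice of $f_i$ and the particular $c_i$, $d_i$ are just an example, and the finite-difference Hessian is only approximate):

```python
import numpy as np

def objective(x, fs):
    # g(x) = sum_i (f_i(x_i) - (1/n) sum_j f_j(x_j))^2
    vals = np.array([f(xi) for f, xi in zip(fs, x)])
    return np.sum((vals - vals.mean()) ** 2)

def numerical_hessian(g, x, h=1e-5):
    # central finite-difference approximation of the Hessian of g at x
    n = len(x)
    I = np.eye(n)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            H[i, j] = (g(x + h*I[i] + h*I[j]) - g(x + h*I[i] - h*I[j])
                       - g(x - h*I[i] + h*I[j]) + g(x - h*I[i] - h*I[j])) / (4*h*h)
    return H

# example with linear f_i(x_i) = c_i * x_i + d_i (arbitrary coefficients)
c, d = np.array([1.0, 2.0, 3.0]), np.array([0.5, -1.0, 0.3])
fs = [lambda xi, ci=ci, di=di: ci * xi + di for ci, di in zip(c, d)]

H = numerical_hessian(lambda x: objective(x, fs), np.random.randn(3))
print(np.linalg.det(H))        # ~ 0, singular Hessian
print(np.linalg.eigvalsh(H))   # eigenvalues >= 0 (up to numerical noise)
```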
It seems to me that your problem is not convex unless you assume something about your functions $f_i$ (given that the minimisation is with respect to $x_i$). If, however, $f_i$ are linear functions, i.e.,
$$ f_i(x_i) = c_i'x_i + d_i, $$
then,
$$ \bar{f}(x) = \frac{1}{n}\sum_{i=1}^{n}\left(c_i'x_i + d_i\right) $$
and the cost function becomes a sum of squares of linear functions,
$$ \sum_{i=1}^n \left(c_i'x_i + d_i - \frac{1}{n}\sum_{j=1}^{n}\left(c_j'x_j + d_j\right) \right)^2, $$
which is convex: each summand is the square of a linear function, which is convex (see the note at the end), and a sum of convex functions is convex.
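For what it's worth, here is a small sketch (scalar $x_i$, arbitrary $c_i$, $d_i$ of my choosing) showing that a DCP modelling tool such as CVXPY also recognises this cost as convex:

```python
import cvxpy as cp
import numpy as np

n = 4
c = np.random.randn(n)           # arbitrary coefficients for f_i(x_i) = c_i x_i + d_i
d = np.random.randn(n)

x = cp.Variable(n)
f = cp.multiply(c, x) + d        # vector of the f_i(x_i)
cost = cp.sum_squares(f - cp.sum(f) / n)

prob = cp.Problem(cp.Minimize(cost))
print(prob.is_dcp())             # True: the cost is recognised as convex
prob.solve()
print(prob.value)                # ~ 0: all f_i(x_i) can be made equal (c_i != 0 a.s.)
```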
However, in general, this cost function may not be convex. Take for example $f_i(x_i) = \sin(x_i)$.
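To make the counterexample concrete, here is a quick midpoint check with $n=2$ (the points $a$, $b$ below are chosen by hand):

```python
import numpy as np

def g(x):
    # objective with f_i = sin:  sum_i (sin(x_i) - (1/n) sum_j sin(x_j))^2
    s = np.sin(x)
    return np.sum((s - s.mean()) ** 2)

# for a convex g we would need g((a+b)/2) <= (g(a) + g(b)) / 2
a = np.array([0.0, 0.0])
b = np.array([np.pi, 0.0])
print(g((a + b) / 2))            # 0.5
print((g(a) + g(b)) / 2)         # ~ 0.0  ->  midpoint inequality violated, g not convex
```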
There is a little trick one may consider using here. If the $f_i$ are invertible and $f_i^{-1}$ is known or can be computed, then we can replace $f_i(x_i)$ by $y_i$, and the problem becomes
$$ \min_{y_1,\ldots, y_n} \sum_{i=1}^{n}\left(y_i - \frac{1}{n}\sum_{j=1}^{n}y_j\right)^2. $$
Then, once we have determined the optimisers $y_i^\star$, we can retrieve $x_i^\star=f_i^{-1}(y_i^\star)$.
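As an illustration of the trick, here is a sketch with a hypothetical invertible choice $f_i(x_i) = e^{x_i}$, so that $f_i^{-1} = \log$ (the use of `scipy.optimize.minimize` is my own choice; in general one would also have to keep each $y_i$ inside the range of $f_i$):

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical invertible choice: f_i(x_i) = exp(x_i), so f_i^{-1}(y_i) = log(y_i)
n = 3

def cost_y(y):
    # transformed problem:  sum_i (y_i - (1/n) sum_j y_j)^2
    return np.sum((y - y.mean()) ** 2)

y0 = np.exp(np.random.randn(n))   # y_i = f_i(x_i) for some starting x
res = minimize(cost_y, y0)        # solve the convex problem in y
y_star = res.x                    # here y* is (numerically) a constant positive vector
x_star = np.log(y_star)           # map back: x_i* = f_i^{-1}(y_i*)
print(x_star, res.fun)            # recovered x*, optimal cost ~ 0
```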
Note. The square of a linear function $\phi(x) = c'x + d$, where $c\in\mathbb{R}^n$ is a column vector and $x\in\mathbb{R}^n$, that is, the function $\psi(x) = \phi(x)^2 = (c'x + d)^2$, is convex. Indeed,
$$ \psi(x) = (c'x+d)^2 = (c'x)^2 + 2d\,c'x + d^2 = x'(cc')x + 2d\,c'x + d^2, $$
and notice here that $cc'\in\mathbb{R}^{n\times n}$ is a symmetric positive semidefinite matrix; therefore, $\psi$ is convex.
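A short numerical confirmation of this last point (the random $c$ is arbitrary):

```python
import numpy as np

c = np.random.randn(5, 1)       # column vector
H = 2 * (c @ c.T)               # Hessian of psi(x) = (c'x + d)^2
print(np.linalg.eigvalsh(H))    # one eigenvalue 2*||c||^2, the rest ~ 0: PSD
```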