I have the following regression problem: I have data points $(x_i, y_i)$, and $f_{\theta}$ is the regression function parametrized by $\theta$. For example, in linear regression, $f_\theta$ is just a linear function, so $f_\theta(x_i) = \theta^T x_i$. The optimal $\theta$ is given by:
$$ \theta^* = \operatorname*{arg\,min}_{\theta \in \Theta} \sum_{i=1}^{n} |y_i - f_{\theta}(x_i)|^p $$
I am wondering under what conditions there is always a unique solution to this arg min. I know that if each term $|y_i - f_{\theta}(x_i)|^p$ is strictly convex in $\theta$, then the solution is unique (when a minimizer exists), since a sum of strictly convex functions is strictly convex. But then, what are the conditions on $f_\theta$ such that each term is strictly convex?
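To make my question concrete, here is the one case where I can see the argument going through (I'm assuming $x_i, \theta \in \mathbb{R}^d$ here). In the linear case, each map $\theta \mapsto y_i - x_i^T \theta$ is affine, and composing the convex function $t \mapsto |t|^p$ (for $p \ge 1$) with an affine map preserves convexity, so

$$ \theta \;\mapsto\; \sum_{i=1}^{n} |y_i - x_i^T \theta|^p $$

is at least convex. It seems that strict convexity additionally requires $p > 1$ and the $x_i$ to span $\mathbb{R}^d$; otherwise the objective is constant along any direction orthogonal to all the $x_i$, so the minimizer cannot be unique. What I don't see is how to generalize this beyond affine $f_\theta$.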
Attempt: I know that the function $t \mapsto |t|^p$ is strictly convex for $p>1$. However, I'm not sure how helpful that is, since $|x^2-3|^2$, for example, is not strictly convex.
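To spell out why my example fails: writing $g(x) = |x^2-3|^2 = (x^2-3)^2$, a direct computation gives

$$ g'(x) = 4x^3 - 12x, \qquad g''(x) = 12x^2 - 12, $$

so $g''(x) < 0$ on $(-1, 1)$. Hence $g$ is not even convex there, despite the outer function $|\cdot|^2$ being strictly convex. So strict convexity of the outer power alone clearly cannot be sufficient once the inner function is nonlinear.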
Thanks in advance!