Suppose I have a set of functions: $f_n(x) = x^2 + b_n x$, with $b_n$ a random variable.
We define $f(x) = E_n[f_n(x)]$, which here gives $f(x) = x^2 + E[b_n] x$.
The minimizers of $f_n$ and $f$ are given by $\theta_n = \arg \min_x f_n(x)$ and $\theta = \arg \min_x f(x)$.
In this case we have: $\theta_n = -\frac{b_n}{2}$, $\theta = -\frac{E[b_n]}{2}$,
So we have $E[\theta_n] = \theta$.
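The quadratic case above can be checked numerically. The sketch below (with an arbitrary uniform distribution for $b_n$ as an assumption) verifies that the sample mean of the per-function minimizers $\theta_n = -b_n/2$ matches the minimizer of the averaged function, which follows from linearity of expectation:

```python
import numpy as np

rng = np.random.default_rng(1)
b = rng.uniform(-1.0, 2.0, size=50_000)  # samples of b_n (illustrative choice)

# For f_n(x) = x^2 + b_n x, the minimizer is theta_n = -b_n / 2.
theta_n = -b / 2

# f(x) = x^2 + E[b_n] x is minimized at theta = -E[b_n] / 2.
theta = -np.mean(b) / 2

# By linearity of expectation, E[theta_n] equals theta exactly here.
print(np.mean(theta_n), theta)
```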
My question: does this hold for other forms of the functions $f_n$? What if $f_n$ has a more complicated expression, so that we cannot even write an analytical expression for the minimizer $\theta_n$?
This does not work in general.
Note that multiplying $f_n$ by a function $g(b_n)$ does not change the argmin of $f_n$, but will in general change the argmin of $f$.
Consider for example $f_n(x) = b_n^2(x - b_n^2)^2 = b_n^2 x^2 - 2b_n^4 x + b_n^6$. Then the expectation of the argmin is $E[b_n^2]$, while the argmin of the expectation is $\frac{E[b_n^4]}{E[b_n^2]}$. Unless $b_n^2$ is constant, the Cauchy-Schwarz inequality gives $E[b_n^2]^2 < E[b_n^4]$, hence $\frac{E[b_n^4]}{E[b_n^2]} > E[b_n^2]$.
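This counterexample can also be verified numerically. A minimal sketch, assuming $b_n$ is standard normal (so $E[b_n^2] = 1$ and $E[b_n^4] = 3$):

```python
import numpy as np

rng = np.random.default_rng(0)
b = rng.normal(size=100_000)  # samples of b_n (standard normal assumed)

# f_n(x) = b_n^2 (x - b_n^2)^2 is minimized at x = b_n^2.
E_argmin = np.mean(b**2)  # E[argmin f_n] = E[b_n^2], about 1 here

# f(x) = E[b_n^2] x^2 - 2 E[b_n^4] x + E[b_n^6]
# is minimized at x = E[b_n^4] / E[b_n^2], about 3 here.
argmin_E = np.mean(b**4) / np.mean(b**2)

# The two disagree: argmin_E > E_argmin since b_n^2 is not constant.
print(E_argmin, argmin_E)
```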