I am trying to calculate the model-free implied volatility $\sigma_{\mathrm{MF}}$ for a relative performance index using the following method:
$$ \sigma_{\mathrm{MF}}^2=2\sum_{i} \left[\frac{C(T,K_{i})}{K_{i}^2} - \frac{\max(0,F-K_{i})}{K_{i}^2}\right]\Delta K_{i}, $$ where $$ F=I\exp\left(\left(\frac{\sigma_{\mathrm{M}}^2-\sigma_{\mathrm{S}}^2+\sigma_{\mathrm{MF}}^2}{2}\right)T\right) $$
The only unknown here is $\sigma_{\mathrm{MF}}$. How can I implement this in MATLAB? I am confused about how to use nonlinear optimization functions when the unknown $\sigma_{\mathrm{MF}}$ appears both in the sum and inside the forward $F$, which itself feeds back into the sum through the intrinsic-value term.
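For what it's worth, the circularity disappears if you wrap the entire computation in a single scalar function of $x = \sigma_{\mathrm{MF}}^2$: compute $F(x)$, evaluate the strip sum, and return the difference between the sum and $x$. A one-dimensional root-finder then solves the fixed point, with no hand-written iteration loop needed. Below is a minimal sketch of that idea in Python (MATLAB's `fzero` plays the same role as `brentq` here). All inputs are hypothetical placeholders: the index level `I`, vols `sigma_M`/`sigma_S`, and the strike grid are made up, and flat-vol Black prices stand in for the market call quotes $C(T, K_i)$.

```python
import math
from scipy.optimize import brentq

# --- hypothetical inputs (placeholders, not from the question) ---
I = 100.0                       # current index level
T = 0.5                         # maturity in years
sigma_M, sigma_S = 0.20, 0.25   # the known vols entering the forward formula

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(F, K, vol, T):
    """Undiscounted Black call price, used only to fake market quotes."""
    d1 = (math.log(F / K) + 0.5 * vol**2 * T) / (vol * math.sqrt(T))
    d2 = d1 - vol * math.sqrt(T)
    return F * norm_cdf(d1) - K * norm_cdf(d2)

strikes = [60.0 + 2.0 * i for i in range(41)]   # K_i from 60 to 140
dK = strikes[1] - strikes[0]                    # uniform grid spacing

def strip(sigma_mf_sq):
    """Evaluate the right-hand side 2*sum[...] for a trial sigma_MF^2."""
    # the forward depends on the unknown sigma_MF itself
    F = I * math.exp(0.5 * (sigma_M**2 - sigma_S**2 + sigma_mf_sq) * T)
    total = 0.0
    for K in strikes:
        C = bs_call(F, K, 0.30, T)              # market call price placeholder
        total += (C - max(0.0, F - K)) / K**2 * dK
    return 2.0 * total

# fixed point: find x such that strip(x) - x = 0
g = lambda x: strip(x) - x
sigma_mf_sq = brentq(g, 1e-8, 1.0)
sigma_mf = math.sqrt(sigma_mf_sq)
```

The design point is that `strip` hides the whole loop behind a scalar-in, scalar-out interface, so any standard root-finder (MATLAB `fzero`, or `fsolve` for a vector version) can be applied directly; alternatively, simple fixed-point iteration `x_{n+1} = strip(x_n)` typically converges here because the strip's dependence on $F$ is weak.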