The context is ordinary multivariate regression with $k > 1$ regressors, i.e. $Y = X\beta + \epsilon$, where $Y \in \mathbb{R}^{n \times 1}$ is the vector of the predicted variable, $X \in \mathbb{R}^{n \times (k+1)}$ is the matrix of regressor variables (with a column of ones as the first column), and $\beta \in \mathbb{R}^{(k+1) \times 1}$ is the vector of coefficients, including the intercept.
Say, I have already estimated $\beta$ as $\hat{\beta} = (X'X)^{-1} X'Y.$
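For concreteness, here is how $\hat{\beta}$ could be computed on simulated data (the data and dimensions below are purely illustrative):

```r
set.seed(1)
n <- 100; k <- 2
X <- cbind(1, matrix(rnorm(n * k), n, k))  # regressors, ones in first column
Y <- X %*% c(1, 2, -1) + rnorm(n)          # simulated response

# OLS estimate: beta_hat = (X'X)^{-1} X'Y
beta_hat <- solve(t(X) %*% X, t(X) %*% Y)
```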
I have to solve the following program:
minimize $f(\beta) = L\beta$ ( $L$ is a fixed $1\times (k+1)$ vector ) such that $[(\beta-\hat{\beta})' \cdot X'X \cdot (\beta-\hat{\beta})] \,/\, [(Y - X\hat{\beta})' (Y - X\hat{\beta})] \le c$ for a given value $c$.
Note that this is an optimization program with a linear objective in $\beta$ and a quadratic constraint.
I don't understand how to solve this optimization problem. The online resources I have found all involve manually computing gradients of the objective and constraint functions, which I would like to avoid (at least doing it by hand).
Can you please help solve this optimization problem in R? The inputs would be $X$, $Y$, $\hat{\beta}$, $L$, and $c$.
Please let me know if any further information is required - the set-up is pretty general.
The `nloptr` package lets you handle such problems. You can specify an algorithm there; many of the algorithms require gradients, but the COBYLA algorithm (Constrained Optimization BY Linear Approximations) does not. It approximates the solution until it is reasonably close to the optimum (you can specify the tolerance) or until it reaches the maximum number of iterations you allow. As you did not provide a lot of information about the data, I could not give any more detailed information concerning the constraints and the objective.
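A minimal, self-contained sketch with COBYLA could look as follows. Since no data were given, `X`, `Y`, `L`, and `c_val` below are illustrative placeholders; replace them with your own inputs:

```r
library(nloptr)

# Illustrative inputs -- substitute your own X, Y, beta_hat, L, c
set.seed(1)
n <- 100; k <- 2
X <- cbind(1, matrix(rnorm(n * k), n, k))
Y <- X %*% c(1, 2, -1) + rnorm(n)
beta_hat <- as.numeric(solve(t(X) %*% X, t(X) %*% Y))
L     <- c(0, 1, 1)   # fixed 1 x (k+1) vector
c_val <- 0.05         # constraint bound c

XtX <- t(X) %*% X
rss <- sum((Y - X %*% beta_hat)^2)   # (Y - X beta_hat)'(Y - X beta_hat)

# Linear objective: L beta
eval_f <- function(beta) sum(L * beta)

# Quadratic inequality constraint, written in nloptr's form g(beta) <= 0
eval_g <- function(beta) {
  d <- beta - beta_hat
  as.numeric(t(d) %*% XtX %*% d) / rss - c_val
}

res <- nloptr(
  x0          = beta_hat,   # the unconstrained optimum is a natural start
  eval_f      = eval_f,
  eval_g_ineq = eval_g,
  opts        = list(algorithm = "NLOPT_LN_COBYLA",
                     xtol_rel = 1e-10, maxeval = 1e5)
)
res$solution    # constrained minimizer
res$objective   # minimal value of L beta
```

As a sanity check: because the objective is linear and the feasible set is an ellipsoid centered at $\hat{\beta}$, this particular problem also admits the closed-form solution $\beta^* = \hat{\beta} - \sqrt{c\,\mathrm{RSS} / (L (X'X)^{-1} L')}\,(X'X)^{-1} L'$, which the numerical answer should reproduce.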