I have an inequality-constrained non-linear least-squares optimization problem which seeks to find the vectors ${\bf\alpha}$ and ${\bf\beta}$ that minimize the following cost function:
$$F({\bf\alpha,\bf\beta})=\sum^{N-1}_{n=0} \left ( \frac{x_n - {\bf\beta}^T {\bf t}_n}{{\bf \alpha}^T{\bf t}_n} - y_n \right)^2$$
where $x_n$, $y_n$, and ${\bf t}_n$ are all known, and the parameters are subject to the following constraints:
$${\bf \beta}^T{\bf t}_n\in[b_0,b_1]\quad {\bf \alpha}^T{\bf t}_n\in[a_0,a_1]\quad \forall n\in\{0,1,\cdots,N-1\}.$$
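For concreteness, here is how the problem can be set up numerically. This is a minimal NumPy/SciPy sketch with synthetic data; the dimensions, bound values, and the true parameter vectors below are my own illustration, not part of the problem:

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

rng = np.random.default_rng(0)
N, d = 200, 3                            # N observations, d unknowns per vector
T = rng.uniform(0.5, 1.5, size=(N, d))   # rows are the known vectors t_n
alpha_true = np.array([1.0, 0.5, 0.3])   # illustrative ground truth
beta_true = np.array([2.0, 1.0, 0.5])
x = rng.uniform(1.0, 3.0, size=N)
y = (x - T @ beta_true) / (T @ alpha_true)  # noiseless data from the model

a0, a1 = 0.1, 5.0    # bounds on alpha^T t_n (illustrative)
b0, b1 = 0.1, 10.0   # bounds on beta^T t_n  (illustrative)

def cost(p):
    alpha, beta = p[:d], p[d:]
    r = (x - T @ beta) / (T @ alpha) - y
    return np.sum(r ** 2)

# The 2N constraints are linear in the stacked parameter vector (alpha, beta):
#   a0 <= T @ alpha <= a1,   b0 <= T @ beta <= b1
A = np.block([[T, np.zeros((N, d))],
              [np.zeros((N, d)), T]])
lb = np.r_[np.full(N, a0), np.full(N, b0)]
ub = np.r_[np.full(N, a1), np.full(N, b1)]

p0 = np.ones(2 * d)                      # feasible start for this data
res = minimize(cost, p0, method="trust-constr",
               constraints=[LinearConstraint(A, lb, ub)])
print(res.fun)
```

Note that the constraints are linear in the stacked vector $({\bf\alpha},{\bf\beta})$, so the feasible set is a polyhedron; the difficulty is entirely in the non-convex objective.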
In general, this problem is non-convex, even when $\bf\alpha$ and $\bf\beta$ are scalars. However, it turns out that in the scalar case there appear to be no local minima other than the global one (observed by plotting the cost surface in 3-D against the two scalar parameters).
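The scalar-case observation can be reproduced with a brute-force grid check. This is a numerical sanity check on assumed synthetic data (my own choices of $t_n$, $x_n$, noise level, and grid), not a proof:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
t = rng.uniform(0.5, 1.5, size=N)   # known scalars t_n
x = rng.uniform(1.0, 3.0, size=N)
# noisy data from the scalar model with alpha = 0.8, beta = 1.0
y = (x - 1.0 * t) / (0.8 * t) + 0.05 * rng.standard_normal(N)

alphas = np.linspace(0.2, 2.0, 200)
betas = np.linspace(0.2, 2.0, 200)
A, B = np.meshgrid(alphas, betas, indexing="ij")
# F[i, j] = sum_n ((x_n - B[i,j] t_n) / (A[i,j] t_n) - y_n)^2, via broadcasting
F = (((x - B[..., None] * t) / (A[..., None] * t) - y) ** 2).sum(axis=-1)

# Count interior grid points strictly below all 8 of their neighbours
C = F[1:-1, 1:-1]
neighbours = [F[:-2, 1:-1], F[2:, 1:-1], F[1:-1, :-2], F[1:-1, 2:],
              F[:-2, :-2], F[:-2, 2:], F[2:, :-2], F[2:, 2:]]
n_min = int(np.all([C < nb for nb in neighbours], axis=0).sum())
print("grid-local minima found:", n_min)
```

On data like this the check reports a single grid-local minimum, consistent with the plot-based observation, though a discrete grid can of course miss fine structure.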
I would like to know under what conditions this will also hold in the more general scenario of vectors $\bf\alpha$ and $\bf\beta$. Assume that the number of unknowns (i.e., the combined length of $\bf\alpha$ and $\bf\beta$) is much smaller than the number of observations $N$.