I have the I/O model
$$Y=HU$$
where $H$ is known. I want to design minimum input vector $U$ that gives a particular output $Y$.
Let $y\in\mathbb{R}^n$, $H\in\mathbb{R}^{n\times m}$, and $u\in\mathbb{R}^m$, where $n \ll m$. I am also assuming that $H$ has full row rank.
The goal is to find $u$ such that $y=Hu$ and $\|u\|_2^2$ is minimized. This is a constrained convex optimization problem, which can be turned into the following unconstrained optimization problem with a Lagrange multiplier $\lambda$:
$$\min_u J(u,\lambda)$$ where $J(u,\lambda):=u^Tu+\lambda^T(y-Hu)$.
Computing the derivative with respect to $u$ yields
$\dfrac{\partial J}{\partial u}=2u^T-\lambda^TH$.
One can verify that this is indeed a strict minimum, since the Hessian $2I$ is positive definite. Setting the derivative to zero and solving, we get
$u^*=H^T\lambda^*/2$ where the superscript $*$ indicates optimality. Substituting this in the constraint $y=Hu^*$, we obtain $y=HH^T\lambda^*/2$, which yields
$$\lambda^*=2(HH^T)^{-1}y$$
and where $(HH^T)^{-1}$ exists because $H$ is full row rank by assumption. Using this expression, we obtain that
$$u^*=H^T(HH^T)^{-1}y.$$
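As a quick numerical sanity check of this closed form, here is a small NumPy sketch (the matrix sizes and random data are my own illustrative choices) that computes $u^*=H^T(HH^T)^{-1}y$ and verifies that it satisfies the constraint and agrees with the pseudoinverse solution:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 10                      # n << m: a "fat" matrix
H = rng.standard_normal((n, m))   # full row rank with probability 1
y = rng.standard_normal(n)

# Minimum-norm solution u* = H^T (H H^T)^{-1} y.
# Solve the n x n system H H^T z = y rather than forming the inverse.
u_star = H.T @ np.linalg.solve(H @ H.T, y)

# It satisfies the constraint y = H u* exactly (up to round-off)...
assert np.allclose(H @ u_star, y)

# ...and coincides with the Moore-Penrose pseudoinverse solution.
assert np.allclose(u_star, np.linalg.pinv(H) @ y)
```

Using `np.linalg.solve` on the small $n\times n$ Gram matrix $HH^T$ is both cheaper and numerically safer than explicitly inverting it.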
Note that this solution coincides with the ordinary least-squares solution, where you minimize $(y-Hu)^T(y-Hu)$, when $H$ has full row rank.
If the matrix $H$ does not have full row rank, then it is possible that no $u$ satisfies $y=Hu$, and you will need to consider a slightly modified problem where you minimize the quantity
$$(y-Hu)^T(y-Hu)+\lambda u^Tu$$
where $\lambda>0$ is a regularization parameter. This leads to the so-called ridge regression.
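Setting the gradient of the ridge objective to zero gives the closed form $u=(H^TH+\lambda I)^{-1}H^Ty$, which exists for any $\lambda>0$ even when $H$ is rank-deficient. A minimal sketch (the rank-deficient $H$ and the value of $\lambda$ are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 10
H = rng.standard_normal((n, m))
H[2] = H[0] + H[1]               # force H to be rank-deficient (rank 2)
y = rng.standard_normal(n)

lam = 1e-3                       # regularization parameter lambda > 0
# Ridge solution minimizing (y - Hu)^T (y - Hu) + lam * u^T u
u_ridge = np.linalg.solve(H.T @ H + lam * np.eye(m), H.T @ y)

# First-order optimality: H^T (H u - y) + lam * u = 0
assert np.allclose(H.T @ (H @ u_ridge - y) + lam * u_ridge, 0)
```

As $\lambda\to 0^+$, the ridge solution converges to the minimum-norm least-squares solution $H^+y$, recovering the unregularized case above.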