Consider the following scalar discrete-time control system $$x(k + 1) = 2x(k) + u(k)$$ where $x \in \mathbb R$, $u \in \mathbb R$ and $x(0) = -2$. Let $N > 1$ be some integer.
$(a)$ Consider the cost function $$J = \sum_{k=0}^{N} u(k)^2.$$ Find the optimal control law and the minimal cost.
$(b)$ Consider the cost function $$J = \sum_{k=0}^{N} x(k)^2.$$ Find the optimal control law and the minimal cost.
$(c)$ Consider the optimal control problem of $(b)$ with the additional constraint $|u(k)| \leq 0.5$. Find the optimal control law.
I got stuck at the part where we need to find the control law and the minimal cost. I tried the Riccati equation, but I don't think my result is correct.
I will give you some hints:
$(a)$ Your cost does not depend on the state $x$. You only have $u$ in it, which is your control input (you have total control over what happens with $u(k)$). Do you see how the cost can be minimized if we do not care about the state $x$?
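A quick numerical sketch of this hint (plain Python; the horizon $N = 5$ and the helper `simulate` are just for illustration): since $J$ is a sum of squares of the inputs alone, $u(k) = 0$ for all $k$ gives $J = 0$, which no other input sequence can beat, no matter what the state does.

```python
def simulate(u_seq, x0=-2.0):
    """Roll out x(k+1) = 2 x(k) + u(k) and return the state trajectory."""
    xs = [x0]
    for u in u_seq:
        xs.append(2 * xs[-1] + u)
    return xs

N = 5  # illustrative horizon, any N > 1 works
u_zero = [0.0] * (N + 1)
J = sum(u**2 for u in u_zero)  # cost of part (a)
print(J)                  # 0.0 -- cannot be beaten, since J >= 0 always
print(simulate(u_zero))   # the state diverges, but this cost does not see it
```

Note the trade-off this exposes: the zero input is optimal for this cost even though the open-loop system (pole at $2$) is unstable.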
$(b)$ The cost now depends on the state $x$. As long as the state is not zero, we accumulate additional cost. Can you think of a control input $u(k=0)$ such that $x(k=1)=0$? This means we drive the system directly to the origin. This type of control is called deadbeat control; it is not possible for linear time-invariant continuous-time systems but is possible for linear time-invariant discrete-time systems. Use $x(k+1) = 2x(k) + u(k)$ and the initial condition. Remember that this problem does not place any constraints on the inputs $u(k)$. If the system is at the origin after one step, what will $u(k)$ be for $k\geq 1$?
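The deadbeat idea can be sketched numerically (hedged example; $N = 5$ and the helper names are my own choices): solving $x(1) = 2x(0) + u(0) = 0$ gives $u(0) = -2x(0) = 4$, and $u(k) = 0$ afterwards keeps the state at the origin, so only the unavoidable term $x(0)^2 = 4$ remains in the cost.

```python
def simulate(u_seq, x0=-2.0):
    """Roll out x(k+1) = 2 x(k) + u(k) and return the state trajectory."""
    xs = [x0]
    for u in u_seq:
        xs.append(2 * xs[-1] + u)
    return xs

N = 5
u_deadbeat = [4.0] + [0.0] * N       # u(0) = -2 * x(0) = 4, then zero
xs = simulate(u_deadbeat)
J = sum(x**2 for x in xs[:N + 1])    # cost of part (b): sum_{k=0}^{N} x(k)^2
print(xs)  # [-2.0, 0.0, 0.0, ...] -- at the origin after one step
print(J)   # 4.0 -- only x(0)^2 is unavoidable
```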
For $(c)$ try to understand what the system does for $u(k)=0.5$ and $u(k)=-0.5$. We only consider the state $x(k)$ in our cost function. How could you minimize this cost given the constraint $|u(k)|\leq 0.5$? Also note that the cost function is only a finite sum.
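Following the hint, one can simulate the two extreme inputs (a sketch under my own assumptions: constant inputs, horizon $N = 5$). With $x(0) = -2 < 0$, the term $2x(k)$ drags the state further negative, so the bounded input does best by pushing back with $u(k) = +0.5$ at every step; this is also what the greedy saturated choice $u(k) = \operatorname{clip}(-2x(k), -0.5, 0.5)$ gives while $x(k) < 0$.

```python
def rollout(u_const, x0=-2.0, N=5):
    """Roll out x(k+1) = 2 x(k) + u(k) with a constant input u_const."""
    xs = [x0]
    for _ in range(N + 1):
        xs.append(2 * xs[-1] + u_const)
    return xs

def cost(xs, N=5):
    """Finite-horizon state cost: sum_{k=0}^{N} x(k)^2."""
    return sum(x**2 for x in xs[:N + 1])

xs_plus = rollout(+0.5)
xs_minus = rollout(-0.5)
print(cost(xs_plus))   # smaller cost: u = +0.5 slows the divergence
print(cost(xs_minus))  # larger cost:  u = -0.5 accelerates it
```

The state still diverges either way (the pole at $2$ overpowers the bounded input), but since the cost is a finite sum, keeping $|x(k)|$ as small as possible at each step is what matters.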