About LQR control for scalar discrete-time control system


This question refers to an old post:

Consider the following scalar discrete-time control system $$x(k + 1) = 2x(k) + u(k)$$ where $x \in \mathbb R$, $u \in \mathbb R$ and $x(0) = -2$. Let $N > 1$ be some integer.

  1. Consider the following cost function $$J = \sum_{k=0}^{N} u(k)^2$$ Find the optimal control law and the minimal cost.
  2. Consider the following cost function $$J = \sum_{k=0}^{N} x(k)^2$$ Find the optimal control law and the minimal cost.
  3. Consider the optimal control problem of part 2 with the constraint $|u(k)| \leq 0.5$. Find the optimal control law.
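To get a feel for the setup, here is a minimal sketch (my own illustration, not part of the original post) that simulates $x(k+1) = 2x(k) + u(k)$ from $x(0) = -2$ for a given input sequence and evaluates both cost functions:

```python
def simulate(u, x0=-2.0):
    """Return the state trajectory x(0), ..., x(N+1) for an input sequence u(0), ..., u(N)."""
    x = [x0]
    for uk in u:
        x.append(2 * x[-1] + uk)  # scalar dynamics x(k+1) = 2 x(k) + u(k)
    return x

N = 5
u = [0.0] * (N + 1)  # zero input: candidate minimizer of the input-only cost
x = simulate(u)

J_u = sum(uk ** 2 for uk in u)            # cost of problem 1
J_x = sum(xk ** 2 for xk in x[: N + 1])   # cost of problem 2

print(J_u)  # 0.0 -- the input cost is already at its minimum
print(x)    # but the open-loop state diverges: x(k) = -2 * 2^k
```

Note how the unstable pole at $2$ makes the state grow geometrically under zero input, which is exactly why problems 1 and 2 have very different optimal laws.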

I'm new to optimal control theory and am now confused by the first hint given in the first answer to the original post.

The hint says, "Do you see how the cost can be minimized if we do not care about the state x?" Does this mean that a zero input $u(k)$ minimizes the cost? If so, what about the infinite-horizon case $J=\sum_{k=1}^{+\infty}\|u(k)\|^2$: is the minimizer zero again?
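My own sanity check (not from the original answer): since $J = \sum_k u(k)^2$ is a sum of squares, it is nonnegative, equals zero for $u(k) \equiv 0$, and becomes strictly positive as soon as any $u(k)$ is nonzero. This holds for any horizon, finite or infinite:

```python
def cost(u):
    """Input-only cost J = sum of u(k)^2 over the given horizon."""
    return sum(uk ** 2 for uk in u)

J_zero = cost([0.0] * 10)          # zero input over any horizon
J_nonzero = cost([0.0, 0.5, 0.0])  # one nonzero sample already raises the cost

print(J_zero)     # 0.0
print(J_nonzero)  # 0.25 > 0
```

So when the cost penalizes only the input and ignores the state, $u(k) \equiv 0$ is optimal regardless of what the (here, unstable) state does.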