Optimizing a vector equation


I'm looking for the optimal solution of the following problem.

Let $x$ and $b$ be two vectors of real numbers with the same dimension.

Let $\alpha$ be a scalar value.

We are looking for the optimal $\alpha$, the one that minimizes the squared Euclidean distance between $x$ and $\alpha b$:

$$ D = ||x - \alpha b||^2 $$

$$ D = x^T x - 2 \alpha x^T b + \alpha^2 b^T b $$

I took the derivative with respect to $\alpha$, set it to $0$, and did the algebra:

$$\begin{aligned} \frac{dD}{d\alpha} &= 2\alpha\, b^Tb - 2x^Tb = 0 \\ x^Tb &= \alpha\, b^Tb \\ \alpha &= \frac{x^Tb}{\|b\|^2} \end{aligned}$$
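A quick numerical sanity check of this closed form, as a sketch in plain Python (the vectors `x` and `b` are arbitrary example values, not part of the question):

```python
def dot(u, v):
    # Dot product of two equal-length vectors.
    return sum(ui * vi for ui, vi in zip(u, v))

def sq_dist(x, b, alpha):
    # Squared Euclidean distance ||x - alpha*b||^2.
    return sum((xi - alpha * bi) ** 2 for xi, bi in zip(x, b))

x = [1.0, 2.0, 3.0]
b = [0.5, -1.0, 2.0]

alpha_opt = dot(x, b) / dot(b, b)  # x^T b / ||b||^2

# The optimum should beat small perturbations in either direction.
assert sq_dist(x, b, alpha_opt) <= sq_dist(x, b, alpha_opt + 1e-3)
assert sq_dist(x, b, alpha_opt) <= sq_dist(x, b, alpha_opt - 1e-3)
```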

Did I manage to get it right, or did I make a mistake somewhere? I am no mathematician and I'm not quite sure I know all the rules and caveats of vector algebra.

BEST ANSWER

Your computations are almost correct. At the end it should read:

$\frac {x^Tb} {||b||^2} = \alpha.$

ANOTHER ANSWER

In fact, there is no real risk in this derivation. The key point is

$$\|x-\alpha b\|^2=(x-\alpha b)^T(x-\alpha b)=x^Tx-2\alpha x^Tb+\alpha^2b^Tb$$

which is true by distributivity of the dot product over addition, and commutativity.
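One can check that expansion numerically; a minimal sketch in Python (the example vectors and $\alpha$ are arbitrary):

```python
def dot(u, v):
    # Dot product of two equal-length vectors.
    return sum(ui * vi for ui, vi in zip(u, v))

x = [1.0, -2.0, 0.5]
b = [3.0, 1.0, -1.0]
alpha = 0.7

# Left side: ||x - alpha*b||^2 computed directly.
r = [xi - alpha * bi for xi, bi in zip(x, b)]
lhs = dot(r, r)

# Right side: x^T x - 2*alpha*x^T b + alpha^2 * b^T b.
rhs = dot(x, x) - 2 * alpha * dot(x, b) + alpha ** 2 * dot(b, b)

assert abs(lhs - rhs) < 1e-12
```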

From this, you have an ordinary minimization problem for a scalar quadratic,

$$A\alpha^2-2B\alpha+C, \qquad A=b^Tb,\quad B=x^Tb,\quad C=x^Tx,$$

which is known to have its extremum at

$$\alpha=\frac BA.$$

Since $A=\|b\|^2>0$ (assuming $b\neq 0$), this extremum is indeed a minimum.
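To illustrate, a brute-force scan of such a scalar quadratic confirms the vertex at $B/A$; the coefficients below are arbitrary example values with $A > 0$, so the extremum is a minimum:

```python
A, B, C = 5.25, 4.5, 14.0  # arbitrary, with A > 0

def f(a):
    # The scalar quadratic A*a^2 - 2*B*a + C.
    return A * a ** 2 - 2 * B * a + C

vertex = B / A
# Scan a grid around the vertex; no grid point should do better.
grid = [vertex + (k - 500) * 1e-3 for k in range(1001)]
best = min(grid, key=f)
assert abs(best - vertex) < 1e-3
```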