I have the following expressions from the Linear Least Squares part of Steven M. Kay's book, Fundamentals of Statistical Signal Processing.
$\hat{\theta} = \frac{\sum_{n=0}^{N-1}x[n]h[n]}{\sum_{n=0}^{N-1}h^2[n]}$ ---(1)
$\sum_{n=0}^{N-1}h[n](x[n]-\hat{\theta} h[n])$ ---(2)
It is written in the book that when we substitute $\hat{\theta}$ from equation (1) into equation (2), the summation in equation (2) reduces to zero.
However, I do not see how this happens. Any help in this regard will be highly appreciated.
The key ideas are that you can "split" the summation over the two terms and pull $\hat{\theta}$ out of the sum, since it does not depend on $n$.
If you take your second equation, you can rewrite it as: $$ \sum_{n=0}^{N-1}h[n]x[n] - \hat{\theta}\sum_{n=0}^{N-1}h^2[n]. $$ Substituting with (1), you have: $$ \sum_{n=0}^{N-1}h[n]x[n] - \frac{\sum_{n=0}^{N-1}h[n]x[n]}{\sum_{n=0}^{N-1}h^2[n]}\sum_{n=0}^{N-1}h^2[n] = \sum_{n=0}^{N-1}h[n]x[n] - \sum_{n=0}^{N-1}h[n]x[n] = 0. $$
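As a sanity check, here is a small numerical sketch (my own, not from the book) using arbitrary random data: computing $\hat{\theta}$ via equation (1) makes the sum in equation (2) vanish up to floating-point round-off.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
x = rng.normal(size=N)   # observed data x[n]
h = rng.normal(size=N)   # known signal h[n]

# Equation (1): least squares estimate of theta
theta_hat = np.dot(h, x) / np.dot(h, h)

# Equation (2): sum of h[n] times the residuals
residual_sum = np.dot(h, x - theta_hat * h)

print(residual_sum)  # essentially zero (about 1e-16)
```

This also illustrates the geometric interpretation: the residual $x[n]-\hat{\theta}h[n]$ is orthogonal to the signal $h[n]$, which is exactly what equation (2) being zero says.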