Improved Gradient Algorithm


Show that asymptotic stability of the adaptation error is assured for $0 < \gamma < 2$ and $\alpha > 0$ in the normalized gradient algorithm

$$ \hat{\theta}(t + 1) = \hat{\theta}(t) + \frac{\gamma \phi(t)\varepsilon^{0}(t + 1)}{\alpha + \phi^T(t)\phi(t)} $$
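As a numerical sanity check (not a proof), the update can be simulated on a simple linear regression model $y(t+1) = \theta^T \phi(t)$, where $\varepsilon^0(t+1) = y(t+1) - \hat{\theta}^T(t)\phi(t)$ is the a priori prediction error. The true parameter vector, the regressor distribution, and the values $\gamma = 1$, $\alpha = 0.1$ below are hypothetical choices for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameter vector (for illustration only)
theta = np.array([1.0, -2.0, 0.5])

gamma, alpha = 1.0, 0.1   # chosen inside the stability range 0 < gamma < 2, alpha > 0
theta_hat = np.zeros(3)   # initial estimate theta_hat(0)

errors = []
for t in range(200):
    phi = rng.standard_normal(3)          # regressor phi(t) (persistently exciting)
    y_next = theta @ phi                  # plant output y(t+1) = theta^T phi(t)
    eps0 = y_next - theta_hat @ phi       # a priori error eps^0(t+1)
    # Normalized gradient update from the question
    theta_hat = theta_hat + gamma * phi * eps0 / (alpha + phi @ phi)
    errors.append(np.linalg.norm(theta - theta_hat))

print(errors[0], errors[-1])  # parameter error decays toward zero
```

With $\gamma$ in $(0, 2)$ the parameter-error norm is non-increasing along each regressor direction, and with a sufficiently rich sequence of regressors it converges; picking $\gamma$ outside that range makes the simulated error grow.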

Any ideas? Could someone point me to some references?

Thank you.