Poisson regression gradient descent constant step size


For $x_1,\dots,x_n \in \mathbb{R}$ and independent $y_i \sim \text{Poi}(\exp(\beta x_i))$, the negative log-likelihood of $\beta$ (up to an additive constant not depending on $\beta$) is:

$\ell(\beta) = \sum^n_{i = 1}\left(\exp(\beta x_i) - y_i x_i \beta\right)$

We would like to minimize this function using gradient descent, i.e. generate a sequence $\beta_{k+1} = \beta_k - \gamma\,\ell'(\beta_k)$ such that $\lim_{k \to \infty}\beta_k = \hat{\beta}$, the minimizer of $\ell(\beta)$.
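For concreteness, here is a minimal numerical sketch of this iteration. The data, the step size $\gamma = 10^{-3}$, and the iteration count are my own illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed for illustration): true beta = 0.5
x = rng.normal(size=50)
beta_true = 0.5
y = rng.poisson(np.exp(beta_true * x))

def grad(beta):
    # l'(beta) = sum_i x_i * (exp(beta * x_i) - y_i)
    return np.sum(x * (np.exp(beta * x) - y))

def gradient_descent(beta0, gamma, steps):
    # Fixed-step iteration: beta_{k+1} = beta_k - gamma * l'(beta_k)
    beta = beta0
    for _ in range(steps):
        beta -= gamma * grad(beta)
    return beta

beta_hat = gradient_descent(beta0=0.0, gamma=1e-3, steps=5000)
```

With a small enough fixed step, the iterates settle at a point where the gradient vanishes, i.e. the minimizer of $\ell$.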

I need to show that there is a fixed constant step size $\gamma$ and some $\alpha \in (0,1)$ such that $|\beta_k - \hat{\beta}| \leq \alpha^k|\beta_0 - \hat{\beta}|$ for all $k \geq 1$.

So far, I know that $\ell$ is convex, but its gradient $\ell'$ is not globally Lipschitz, so the standard fixed-step gradient descent analysis doesn't directly apply, and I'm not sure where to start.
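Not an answer, but a numerical sanity check I ran (on synthetic data of my own choosing): tracking the per-step error ratios $|\beta_{k+1} - \hat{\beta}| / |\beta_k - \hat{\beta}|$, they do appear to stay bounded below $1$, consistent with the claimed linear rate:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=30)
y = rng.poisson(np.exp(0.5 * x))

def grad(beta):
    # l'(beta) = sum_i x_i * (exp(beta * x_i) - y_i)
    return np.sum(x * (np.exp(beta * x) - y))

gamma = 1e-3

# Run GD long enough to get a high-accuracy proxy for beta_hat
beta = 0.0
for _ in range(20000):
    beta -= gamma * grad(beta)
beta_hat = beta

# Restart from beta_0 = 0 and record the errors |beta_k - beta_hat|
beta = 0.0
errors = []
for _ in range(200):
    errors.append(abs(beta - beta_hat))
    beta -= gamma * grad(beta)

# Per-step contraction ratios (skip errors near machine precision)
ratios = [errors[k + 1] / errors[k]
          for k in range(len(errors) - 1) if errors[k] > 1e-12]
print(max(ratios))
```

Empirically the maximal ratio plays the role of $\alpha$, which suggests the iterates stay in a bounded interval where $\ell''$ is bounded, so a local Lipschitz bound might suffice — but I don't see how to prove that the iterates remain bounded.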