Deriving the regularization term in Bayesian lasso regression


The title is probably not ideal; I thought hard about how to phrase this correctly. I'd be grateful if someone could tell me whether it's wrong and how to correct it.

I am practicing for my exams and have come across this question from last year's exam:

Consider the following Bayesian regression model for fitting a quadratic function,

$$y = (\theta_0 + \theta_1 x + \theta_2 x^2) + \epsilon$$ $$\epsilon \sim \mathrm{Normal}(0, \sigma^2)$$ $$\theta_i \sim \mathrm{Laplace}(0, s), \quad i \in \{0, 1, 2\}$$

For the given model, MAP estimation may be written as a regularized least-squares optimization problem in the following form

$$\underset{\theta}{\operatorname{argmin}} \sum_{i=1}^n \big(y_i - h_\theta(x_i)\big)^2 + \lambda C(\theta)$$

where $h_\theta(x_i) = \theta_0 + \theta_1 x_i + \theta_2 x_i^2$.

What are the correct expressions for $\lambda$ and $C$?

The given answer is: $$\lambda = \frac{2\sigma^2}{s}$$ $$C(\theta) = \sum_{i=0}^{2}|\theta_i|$$ (the solution sheet writes the sum's upper limit as $n$, but with three parameters it should run over $i \in \{0,1,2\}$).
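To make the claimed equivalence concrete, here is a small numerical sanity check (arbitrary made-up data; it assumes the Laplace density $p(\theta_i) = \tfrac{1}{2s}\exp(-|\theta_i|/s)$, so the negative log-prior contributes $\tfrac{1}{s}\sum_i |\theta_i|$ up to a constant). It verifies that the regularized least-squares objective with $\lambda = 2\sigma^2/s$ equals $2\sigma^2$ times the negative log-posterior, so both have the same minimizer:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, s = 1.5, 0.7

# Arbitrary synthetic data from a quadratic model
x = rng.normal(size=20)
y = 1.0 - 2.0 * x + 0.5 * x**2 + rng.normal(scale=sigma, size=20)

def h(theta, x):
    # Quadratic hypothesis h_theta(x) = theta_0 + theta_1 x + theta_2 x^2
    return theta[0] + theta[1] * x + theta[2] * x**2

def neg_log_posterior(theta):
    # -log p(theta | y) up to theta-independent constants:
    # Gaussian likelihood term + Laplace prior term
    return (np.sum((y - h(theta, x))**2) / (2 * sigma**2)
            + np.sum(np.abs(theta)) / s)

def regularized_ls(theta):
    # Sum of squared errors plus lambda * C(theta), with lambda = 2*sigma^2/s
    lam = 2 * sigma**2 / s
    return np.sum((y - h(theta, x))**2) + lam * np.sum(np.abs(theta))

# The two objectives should differ only by the constant factor 2*sigma^2,
# so their rescaled difference is zero at any theta
t1, t2 = rng.normal(size=3), rng.normal(size=3)
r1 = regularized_ls(t1) - 2 * sigma**2 * neg_log_posterior(t1)
r2 = regularized_ls(t2) - 2 * sigma**2 * neg_log_posterior(t2)
print(np.isclose(r1, 0.0), np.isclose(r2, 0.0))  # → True True
```

Since multiplying an objective by a positive constant does not move its argmin, this is exactly the sense in which MAP estimation "is" the regularized least-squares problem.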

I tried this for two days and attempted many different approaches, but I couldn't get the result (apparently it's easy, as stated in the solution). The closest I got was by treating $y$ as a normal variable with standard deviation $\sigma$ and mean $h_\theta(x)$, then taking the derivative and setting the original equation equal to it.

I can't think of anything else, and that attempt was probably completely misguided anyway. My head is quite foggy now. Could someone please give me a hint on how to get to the solution?

Thank you very much!