How to choose $\alpha$ and $\beta$ in backtracking line search method in convex optimization?


I'm using a quasi-Newton method with the BFGS update in a convex optimization algorithm. While debugging, I noticed that the magnitudes of $\alpha$ and $\beta$ have a great effect on the speed of convergence. Since the best choice of $\alpha$ and $\beta$ varies across optimization problems, I have to pick them anew for each problem. Can anyone give me some hints on choosing good values of $\alpha$ and $\beta$? What do they depend on? The following code segment shows what I mean by $\alpha$ and $\beta$:

%% Backtracking line search
t = 1;                                            % start from a full step
while f(x + t*delta) > f(x) + alpha*t*(gradf'*delta)
    t = beta*t;                                   % shrink the step by beta
end

x = x + t*delta;                                  % take the accepted step
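For concreteness, here is a minimal self-contained sketch of the same backtracking scheme in Python/NumPy. The quadratic objective, the starting point, and the parameter values $\alpha = 0.3$, $\beta = 0.8$ are my own illustrative choices (they lie in the commonly cited ranges $\alpha \in (0.01, 0.3)$, $\beta \in (0.1, 0.8)$), not part of the question above:

```python
import numpy as np

def backtracking(f, gradf, x, delta, alpha=0.3, beta=0.8):
    """Backtracking line search enforcing the Armijo sufficient-decrease
    condition f(x + t*delta) <= f(x) + alpha*t*gradf(x)'*delta."""
    t = 1.0  # start from a full step
    while f(x + t * delta) > f(x) + alpha * t * (gradf(x) @ delta):
        t *= beta  # shrink the step by beta
    return t

# Illustrative example: a badly scaled convex quadratic f(x) = x'Qx,
# searched along the negative gradient (a descent direction).
Q = np.diag([1.0, 10.0])
f = lambda x: x @ Q @ x
gradf = lambda x: 2.0 * Q @ x

x = np.array([1.0, 1.0])
delta = -gradf(x)                       # descent direction
t = backtracking(f, gradf, x, delta)
x_new = x + t * delta                   # accepted step: f(x_new) < f(x)
```

Larger $\beta$ (close to 1) shrinks the step slowly, so the search is finer but needs more function evaluations per iteration; smaller $\beta$ backtracks aggressively and is cheaper per iteration but may accept overly short steps.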