Conjugate gradient algorithm for a quadratic function


My task is to implement the conjugate gradient method with exact directional minimization on the quadratic function $f(x) = \frac{1}{2} x^T A x - b^T x$. I am writing a function in MATLAB. So far I have got:

function [x] = CG(A, b, x_0)
% Conjugate gradient for f(x) = 0.5*x'*A*x - b'*x with exact line search.
tol = 1e-10;
x = x_0;
g = b - A*x;                     % residual, i.e. the negative gradient b - A*x
if norm(g) < tol
    return
end
d = -g;                          % special selection of the first direction
B_denum = g' * g;

for i = 1:length(b)
    Ad = A*d;
    alpha = (g'*d)/(d'*Ad);      % exact minimizer of f along direction d
    x = x + alpha*d;
    g = g - alpha*Ad;            % update residual without recomputing A*x
    if norm(g) < tol
        return
    end
    B_num = g' * g;
    d = -g + (B_num/B_denum)*d;  % Fletcher-Reeves-style direction update
    B_denum = B_num;
end
end

Here A is an N×N sparse matrix, b = [1, 0, 1, 0, ...], and x_0 is the starting point, equal to the zero vector. My question is: am I doing everything correctly? It is a task for university, and my professor said that something is wrong, but he did not say what...
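One way I tried to sanity-check the routine is to port it and compare the result against a direct solve. This is a sketch in Python/NumPy that mirrors the MATLAB code above line for line (the test matrix and seed are just an arbitrary SPD example, not from the assignment):

```python
import numpy as np

def cg(A, b, x0, tol=1e-10):
    """Port of the MATLAB CG routine above, same sign conventions."""
    x = x0.copy()
    g = b - A @ x                # residual, i.e. the negative gradient b - A*x
    if np.linalg.norm(g) < tol:
        return x
    d = -g                       # special selection of the first direction
    denom = g @ g
    for _ in range(len(b)):
        Ad = A @ d
        alpha = (g @ d) / (d @ Ad)    # exact line-search step along d
        x = x + alpha * d
        g = g - alpha * Ad            # residual update without recomputing A*x
        if np.linalg.norm(g) < tol:
            return x
        num = g @ g
        d = -g + (num / denom) * d    # Fletcher-Reeves-style direction update
        denom = num
    return x

# Arbitrary small SPD test problem with b = [1, 0, 1, 0, ...]
n = 6
M = np.random.default_rng(0).standard_normal((n, n))
A = M @ M.T + n * np.eye(n)           # symmetric positive definite by construction
b = np.array([1.0, 0.0] * (n // 2))
x = cg(A, b, np.zeros(n))
print(np.allclose(A @ x, b))          # compare against solving A x = b directly
```

On problems like this the iterates agreed with `np.linalg.solve(A, b)` for me, which at least suggests the sign convention (residual in `g`, first direction `d = -g`) is self-consistent.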

Do I calculate $\alpha$ (the step length) correctly? Is $\alpha$ the same for quadratic and linear functions? I am also not sure about the evaluation of $g$.
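For reference, here is my reasoning for the step length. Minimizing $f(x + \alpha d)$ over $\alpha$ for the quadratic gives

$$\frac{d}{d\alpha} f(x + \alpha d) = d^T A (x + \alpha d) - d^T b = 0 \quad\Longrightarrow\quad \alpha = \frac{d^T(b - Ax)}{d^T A d} = \frac{g^T d}{d^T A d},$$

where the last equality uses my definition $g = b - Ax$. So, if I am not mistaken, this closed-form $\alpha$ is specific to the quadratic case; for a general (non-quadratic) function the exact line search has no such formula.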