Understand and Implement adjoint operator of gradient


I have the equations shown in the image below, with initial values u_i = v_i, y_i = 0, and tau = 1. [Equation images missing; see Eqs. 4.19 and 4.20 of the referenced paper.]

I am implementing Eqs. 4.19 and 4.20 in MATLAB, and I have a question:

  1. The adjoint of the gradient operator is the negative divergence operator, i.e. grad* = -div. Is that right?

To show what I understand, here is my MATLAB code. Could you verify whether it is correct?

Code:

%% Initial value
N=2; %i=1..2
%% Initial ui s.t ui>0 and \sum ui=1
u=rand(128,128,N);
a=sum(u,3);
for k = 1 : N
    u(:,:,k)=u(:,:,k)./a;
    v(:,:,k)=u(:,:,k);
end
tau=1;
theta=0.1;
y1 = zeros(size(u)); % one dual component per class
y2 = zeros(size(u));
for i=1:N % loop over the N classes
   [dxu dyu] = grad(u(:,:,i));
   %% Equation 4.19
   Nu1 = y1(:,:,i)+tau*dxu;
   Nu2 = y2(:,:,i)+tau*dyu;
   Nu  = sqrt(Nu1.^2+Nu2.^2);
   min_Nu = min(1/tau, Nu);
   Nu = max(Nu, eps); % guard against 0/0 at pixels where Nu == 0
   y1(:,:,i) = min_Nu.*Nu1./Nu;
   y2(:,:,i) = min_Nu.*Nu2./Nu;
   %% Equation 4.20
   v(:,:,i)=u(:,:,i)-theta*div(y1(:,:,i),y2(:,:,i));
end
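A note on the update in Eq. 4.19: min(1/tau, |p|) * p/|p| with p = y + tau*grad(u) is the pointwise projection of p onto the ball of radius 1/tau, and it can be rewritten as p / max(1, tau*|p|), which avoids the 0/0 at pixels where |p| = 0. A minimal NumPy sketch of that idea (the function name `project_ball` is my own, not from the paper):

```python
import numpy as np

def project_ball(p1, p2, tau):
    """Pointwise projection of the vector field (p1, p2) onto the
    ball of radius 1/tau. Equivalent to min(1/tau, |p|) * p / |p|,
    but written as p / max(1, tau*|p|) so there is no 0/0 at |p| = 0."""
    scale = np.maximum(1.0, tau * np.hypot(p1, p2))
    return p1 / scale, p2 / scale
```

For example, with tau = 1 a vector (3, 4) of norm 5 is rescaled to (0.6, 0.8), while the zero vector is left unchanged instead of producing NaN.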

UPDATE: Here are my discrete grad and div functions:

function [dxu dyu] = grad(u)
% [dxu dyu] = grad(u) computes the gradient of the function u
% with forward differences assuming
% Neumann boundary conditions    
[m n] = size(u);    
dxu = zeros(m,n);
dyu = zeros(m,n);
dxu(:,1:(n-1)) =  u(:,2:n) - u(:,1:(n-1));
dyu(1:(m-1),:) =  u(2:m,:) - u(1:(m-1),:);

function D = div(v1,v2)
% D = div(v1,v2) computes the divergence of the vector-
% field (v1, v2)^T with backward differences assuming
% Neumann boundary conditions
[m n] = size(v1);    
dxv1 = zeros(m,n);
dyv2 = zeros(m,n);    
dxv1(:,2:n) = v1(:,2:n) - v1(:,1:(n-1));    
dyv2(2:m,:) = v2(2:m,:) - v2(1:(m-1),:);
dxv1(:,1) = v1(:,1);% - 0
dyv2(1,:) = v2(1,:);% - 0    
%compute the divergence
D = dxv1 + dyv2;
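Regarding question 1: with these stencils, summation by parts gives <grad u, y> = <u, -div y> exactly, provided the last column of y1 and the last row of y2 are zero. The algorithm guarantees this, because forward differencing always leaves those entries of the gradient zero, so the dual variables stay zero there too. A NumPy port of the two functions (my own translation of the MATLAB above), used as a numerical adjointness check:

```python
import numpy as np

def grad(u):
    # forward differences, Neumann BC: last column/row of the
    # derivatives stays zero
    dxu = np.zeros_like(u)
    dyu = np.zeros_like(u)
    dxu[:, :-1] = u[:, 1:] - u[:, :-1]
    dyu[:-1, :] = u[1:, :] - u[:-1, :]
    return dxu, dyu

def div(v1, v2):
    # backward differences; first column/row keep v1, v2 themselves
    dxv1 = np.zeros_like(v1)
    dyv2 = np.zeros_like(v2)
    dxv1[:, 1:] = v1[:, 1:] - v1[:, :-1]
    dyv2[1:, :] = v2[1:, :] - v2[:-1, :]
    dxv1[:, 0] = v1[:, 0]
    dyv2[0, :] = v2[0, :]
    return dxv1 + dyv2

# adjointness check: <grad u, y> == <u, -div y> when the dual field
# vanishes on the last column (y1) and last row (y2)
rng = np.random.default_rng(0)
u = rng.standard_normal((16, 16))
y1 = rng.standard_normal((16, 16)); y1[:, -1] = 0.0
y2 = rng.standard_normal((16, 16)); y2[-1, :] = 0.0
dxu, dyu = grad(u)
lhs = np.sum(dxu * y1 + dyu * y2)   # <grad u, y>
rhs = -np.sum(u * div(y1, y2))      # <u, -div y>
```

If the boundary entries of y1, y2 are not zero, the two inner products differ by the boundary term sum(u[:, -1] * y1[:, -1]) + sum(u[-1, :] * y2[-1, :]), which is why the Neumann convention matters here.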

Reference: Sections 4.2–4.3 of the paper "Fuzzy Mumford–Shah Model".