As the question states, we are given the sum-function:
$$f(x) = \sum_{ij}\left(\sqrt{(x_{ij} - y_{ij})^2+1}+\frac{1}{2}\sqrt{(x_{ij}-x_{i+1,j})^2+(x_{ij}-x_{i,j+1})^2+1}\right)$$
where $x_{ij}$ denotes the optimized image and $y_{ij}$ the noisy image (assuming the images are represented as pixel arrays), so the function describes an optimization problem.
How do I find the gradient of this function?
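For concreteness, here is a sketch of how this objective can be evaluated with NumPy, assuming `x` and `y` are 2-D float arrays and the forward differences are taken only where the neighbours $x_{i+1,j}$ and $x_{i,j+1}$ exist:

```python
import numpy as np

def objective(x, y):
    # Data-fidelity term: sqrt((x_ij - y_ij)^2 + 1), summed over all pixels.
    data = np.sqrt((x - y) ** 2 + 1).sum()
    # Smoothness term, using forward differences where both neighbours exist.
    dx = x[:-1, :-1] - x[1:, :-1]   # x_ij - x_{i+1,j}
    dy = x[:-1, :-1] - x[:-1, 1:]   # x_ij - x_{i,j+1}
    smooth = 0.5 * np.sqrt(dx ** 2 + dy ** 2 + 1).sum()
    return data + smooth
```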
I already gave it a try, and my result is:
$$\nabla f = \sum_{ij}\left(\frac{x_{ij} - y_{ij}}{\sqrt{(x_{ij} - y_{ij})^2+1}} + \frac{2(x_{ij}-x_{i+1,j}) + 2(x_{ij}-x_{i,j+1})}{\sqrt{(x_{ij}-x_{i+1,j})^2+(x_{ij}-x_{i,j+1})^2+1}}\right)$$
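One generic way to check a candidate gradient, whatever its form, is to compare it entry by entry against a central-difference approximation; a sketch, assuming `f` takes a 2-D array and returns a scalar:

```python
import numpy as np

def numeric_grad(f, x, eps=1e-6):
    # Central-difference approximation of df/dx_ij, one pixel at a time.
    g = np.zeros_like(x, dtype=float)
    for idx in np.ndindex(x.shape):
        xp = x.copy(); xp[idx] += eps
        xm = x.copy(); xm[idx] -= eps
        g[idx] = (f(xp) - f(xm)) / (2 * eps)
    return g
```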
Is it right? And given the following gradient-descent code, how could I apply it to this problem? What would my cur_x be (assuming $x = y$ at the beginning)?
Code:
cur_x = 6                  # the algorithm starts at x = 6
gamma = 0.01               # step size multiplier
precision = 0.00001        # stop once the step size falls below this
previous_step_size = 1
max_iters = 10000          # maximum number of iterations
iters = 0                  # iteration counter
df = lambda x: ...         # gradient of f goes here

while previous_step_size > precision and iters < max_iters:
    prev_x = cur_x
    cur_x -= gamma * df(prev_x)
    previous_step_size = abs(cur_x - prev_x)
    iters += 1

print("The local minimum occurs at", cur_x)
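Wrapped as a function, the same loop can be sanity-checked on a simple quadratic; for the image problem, `cur_x` would presumably be initialized to the noisy image (e.g. `cur_x = y.copy()`) and `df` would return an array of the same shape as `x`. A sketch:

```python
import numpy as np

def gradient_descent(df, cur_x, gamma=0.01, precision=1e-5, max_iters=10000):
    # Same loop as above: step against the gradient until the steps become tiny.
    previous_step_size = 1.0
    iters = 0
    while previous_step_size > precision and iters < max_iters:
        prev_x = cur_x
        cur_x = cur_x - gamma * df(prev_x)
        previous_step_size = np.max(np.abs(cur_x - prev_x))
        iters += 1
    return cur_x

# Sanity check: f(x) = (x - 3)^2 has gradient 2(x - 3) and its minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), cur_x=6.0)
```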