Help with least squares in paper on hough transform


I'm trying to understand this paper.

I'm having trouble with a least squares problem.

In the paper, equation (4) is

$ E^2 = \sum_{r\in R}{\sum_{c \in C}{[ \alpha r + \beta c + \gamma - I(r,c)]^2 }} $

This represents a least-squares fit of a plane to some data (in this case, the image intensities $I(r,c)$ over the region $R \times C$).

$\alpha$ and $\beta$ are the slope coefficients of the plane, and $\gamma$ is the constant (offset) term.

The aim is to choose $\alpha$, $\beta$ and $\gamma$ so as to minimize the squared error $E^2$.

The paper lists the solutions as

$\alpha = \frac{\sum_{r,c}{rI(r,c)}}{\sum_{r,c}{r^2}}$

and

$\beta = \frac{\sum_{r,c}{cI(r,c)}}{\sum_{r,c}{c^2}}$

I think I have missed something in the derivation.

I have tried differentiating $E^2$ with respect to each coefficient and setting the results to zero, i.e.:

$\frac{\partial E^2}{\partial \alpha} = 0$ and $\frac{\partial E^2}{\partial \beta} = 0$

but I get

$\alpha = \frac{\sum_{r,c}{[rI(r,c) - \beta rc - \gamma r]}}{\sum_{r,c}{r^2}}$

and

$\beta = \frac{\sum_{r,c}{[cI(r,c) - \alpha rc - \gamma c]}}{\sum_{r,c}{c^2}}$
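Spelling out the step behind my expression for $\alpha$ (a restatement of the differentiation above, not taken from the paper): setting the derivative to zero gives

$\frac{\partial E^2}{\partial \alpha} = 2\sum_{r,c}{r\,[\alpha r + \beta c + \gamma - I(r,c)]} = 0 \quad\Rightarrow\quad \alpha \sum_{r,c}{r^2} = \sum_{r,c}{[rI(r,c) - \beta rc - \gamma r]}$

and symmetrically for $\beta$. So the paper's simpler forms can only follow if the $\beta rc$ and $\gamma r$ (resp. $\alpha rc$ and $\gamma c$) sums vanish.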

As for the variances and covariance listed in (7), (8) and (9), I haven't the foggiest clue where they come from.

Any help in understanding these would be much appreciated.

Best answer:

I found another paper by the same author that gives more explanation.

The trick is that the author chooses a convenient coordinate system, centred so that the coordinate values sum to zero (e.g. $-3 + -2 + -1 + 0 + 1 + 2 + 3 = 0$),

so

$ \sum_{r \in R}{r} = \sum_{c \in C}{c} = 0 $

The same holds for the $\sum_{r,c}{rc}$ term, since it factors as $\sum_{r,c}{rc} = \left(\sum_{r \in R}{r}\right)\left(\sum_{c \in C}{c}\right) = 0$.

So that is what happens to the extra terms in my solution: the $\beta rc$ and $\gamma r$ (and likewise $\alpha rc$ and $\gamma c$) sums are all zero, leaving exactly the expressions given in the paper.
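As a quick numerical sanity check (a sketch of my own, not from the paper), we can build a centred grid, generate a synthetic plane with known coefficients, and confirm that the paper's closed forms recover them; the grid size and test coefficients here are made up:

```python
import numpy as np

# Centred coordinates: the sums of r, c, and r*c over the grid are all zero.
r = np.arange(-3, 4)  # -3, -2, -1, 0, 1, 2, 3
c = np.arange(-3, 4)
R, C = np.meshgrid(r, c, indexing="ij")

# Synthetic image: a plane with known (made-up) coefficients.
alpha_true, beta_true, gamma_true = 0.7, -1.2, 5.0
I = alpha_true * R + beta_true * C + gamma_true

# Closed-form solutions from the paper; valid because the centring makes
# the cross terms (sums of r, c, and rc) vanish.
alpha = np.sum(R * I) / np.sum(R**2)
beta = np.sum(C * I) / np.sum(C**2)
gamma = np.mean(I)  # the third normal equation reduces to the mean of I

print(alpha, beta, gamma)  # recovers approximately 0.7, -1.2, 5.0
```

If the grid were not centred, the simple ratios above would pick up the neglected $\beta rc$ and $\gamma r$ terms and the recovered coefficients would be biased.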