Least squares fitting issue


I work with MRI image processing, and while reading an article in this field I came across the following formula (equation A9 in http://www.ajnr.org/content/27/4/859.long ): $$ \widetilde{R_2}(t) = K_1\,\overline{R_2}(t) + K_2 \int_0^t \!\overline{R_2}(t')\, \mathrm{d}t'$$

The article then states: "The right-hand side of the equation has 2 unknowns, $K_1$ and $K_2$, multiplying the (measured) brain-averaged log-signal change and its time integral. $K_1$ and $K_2$ can be determined by simple linear least-squares fitting, and then a corrected $\widetilde{R_2}(t)$ can be computed."

The problem is that I can't understand how the linear least-squares method can be used here. Can anyone please advise me on how to use linear least squares to determine $K_1$ and $K_2$?

2 Answers

Best answer:

Changing notation, suppose we want to find numbers $x_1$ and $x_2$ such that: \begin{align*} a_{11} x_1 + a_{12} x_2 &= b_1 \\ a_{21} x_1 + a_{22} x_2 &= b_2 \\ \vdots & \vdots \\ a_{N1} x_1 + a_{N2} x_2 &= b_N. \end{align*} The numbers $a_{ij}$ and $b_i$ are assumed to be known.

Using matrix notation, we can express our problem as finding \begin{equation*}x = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \end{equation*} such that \begin{equation*} Ax = b, \end{equation*} where \begin{equation*} A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ \vdots & \vdots \\ a_{N1} & a_{N2} \end{bmatrix}, \quad b = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_N \end{bmatrix}. \end{equation*} Because we have more equations than unknowns, our system is "overdetermined" and likely has no solution. In this situation, we often try to find a "least squares solution", which means finding the vector $x$ which minimizes $\|Ax - b \|^2$. The idea is that if we can't make the residual $Ax - b$ equal to $0$, we can at least try to make the residual as small as possible.

Solving least squares problems like this is a standard topic in linear algebra. One way to do it is to solve the "normal equations" \begin{equation*} A^T A x = A^T b. \end{equation*}
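As a concrete illustration, here is a minimal NumPy sketch of this setup. The signal and the "true" coefficients below are synthetic stand-ins (the exponential-times-sine curve is not from the article); the point is only to show how the two columns of $A$ correspond to the measured signal and its time integral, and that `np.linalg.lstsq` and the normal equations give the same answer.

```python
import numpy as np

# Synthetic example: sample a "measured" signal at N time points.
# (This particular signal is made up purely for illustration.)
t = np.linspace(0.0, 10.0, 50)
signal = np.exp(-0.3 * t) * np.sin(t)  # plays the role of the measured curve

# Cumulative time integral of the signal via the trapezoidal rule.
integral = np.concatenate(
    ([0.0], np.cumsum(0.5 * (signal[1:] + signal[:-1]) * np.diff(t)))
)

# N x 2 design matrix A: one column per unknown (K1 and K2).
A = np.column_stack([signal, integral])

# Fake "target" data b generated from known coefficients plus noise,
# so we can check that the fit recovers them.
K1_true, K2_true = 1.5, -0.4
rng = np.random.default_rng(0)
b = K1_true * signal + K2_true * integral + 0.01 * rng.standard_normal(t.size)

# Linear least squares: minimizes ||A x - b||^2 over x = (K1, K2).
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# Equivalent route: solve the normal equations A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

print(x)  # approximately [1.5, -0.4]
print(np.allclose(x, x_normal))
```

In practice `np.linalg.lstsq` (or a QR-based solver) is preferred over forming $A^T A$ explicitly, since the normal equations square the condition number of the problem.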

Second answer:

A quadratic regression is the process of finding the equation of the parabola that best fits a set of data. As a result, we get an equation of the form:

$$y=ax^2+bx+c$$ where $a\ne 0$ .

The standard way to find this equation is the least-squares method: we find the values of $a$, $b$, and $c$ that minimize the sum of the squared vertical distances between each point $(x_i, y_i)$ and the quadratic curve $y = ax^2 + bx + c$.
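Note that although the model is quadratic in $x$, it is linear in the unknowns $a$, $b$, $c$, so the same linear least-squares machinery applies. A minimal sketch, using made-up noisy samples from a known parabola so the recovered coefficients can be checked:

```python
import numpy as np

# Points sampled (with noise) from a known parabola y = 2x^2 - x + 0.5.
rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 40)
y = 2.0 * x**2 - 1.0 * x + 0.5 + 0.05 * rng.standard_normal(x.size)

# Design matrix with one column per unknown coefficient: [x^2, x, 1].
A = np.column_stack([x**2, x, np.ones_like(x)])

# Linear least squares over (a, b, c).
(a, b, c), *_ = np.linalg.lstsq(A, y, rcond=None)

print(a, b, c)  # approximately 2.0, -1.0, 0.5
```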