Is this system of equations usable in linear regression?


I have a set of equations that I want to transform so that I may be able to use linear regression to predict the constant D:

Equation 1: $$ y = 1 -\left(\frac{6Sh^{2}}{\beta_{n}^{2}\left(\beta_{n}^{2} + Sh^{2} - Sh\right)}\right)e^{ -\frac{\beta_{n}^{2}}{R^{2}}Dt} $$ where $\beta_n$ is the first root of the equation: $$ \beta_n \cot(\beta_n) = 1 - Sh $$

And $$ Sh = hR/D. $$ Originally, equation 1 sums the second term over all roots $\beta_n$, but numerically the contributions of the second and higher roots are negligible, leaving the simplified equation above.

Known quantities: $h$ (constant), $R$ (constant), $t$ (the independent variable, time); $y$ is the dependent variable, taking values in $(0,1)$.

I have experience transforming equations into linear form for prediction using linear regression, but I am unsure whether it is even possible here, since the definition of $\beta_n$ relies on the constant $D$, which is itself the quantity to be estimated via regression.

Any insight would be greatly appreciated, thank you.


There are 3 answers below.

Answer 1 (score 1)

Please tell me if I have misunderstood or made any mistake in this preliminary answer. This problem is very interesting to me.

If I properly understand, you have $n$ data points $(t_i,y_i)$, $h$ and $R$ being constants, and you need to adjust the value of the parameter $D$.

The problem does not look simple at all because of the parameter $\beta$, which is an implicit function of $D$.

I suppose that you will agree that the problem is simple if $\beta$ is known. What I would suggest is the following: give $D$ a guessed value $D_0$, from which you compute $\beta_0$; from this $\beta_0$, compute $D_1$ by regression, and repeat until the value of $D$ converges.

Assuming $0 \leq \beta \leq \frac \pi 2$, solving the equation $\beta \cot(\beta)=1-Sh$ is not a big problem since $$\beta \cot(\beta)\sim g(\beta)=\frac{24 \pi ^2+\pi\left(\pi ^4-96 \right) \beta +\left(96-2 \pi ^4\right) \beta ^2}{24 \pi ^2+\pi\left(\pi ^4-96 \right) \beta +\left(96+8 \pi ^2-2 \pi ^4\right) \beta ^2}\tag 1$$

To give an idea of the quality of the approximation, $$\int_0^{\frac \pi 2} \Big[\beta \cot(\beta)- g(\beta)\Big]^2\, d\beta=9.98\times 10^{-7}$$ with a maximum absolute error of $0.001342$ around $\beta=1.01857$,

which gives, as a very close approximation $$\beta_0=\frac{48 \pi Sh}{\left(96-\pi ^4\right) Sh+\pi \sqrt{\left(\pi ^6-768\right) Sh^2+768 Sh}}$$

To polish the root, I suppose that a single iteration of Newton's method would be sufficient: $$\beta_1=\beta_0-\frac {\beta_0 \cot (\beta_0 )+Sh-1 } {\cot (\beta_0 )-\beta_0 \csc ^2(\beta_0 ) }$$
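As a quick numeric check of the starting value and the Newton polish, here is a sketch in Python; the value of $Sh$ is an arbitrary assumption (chosen in $(0,1)$ so that $\beta \in (0,\pi/2)$), and the reference root comes from plain bisection:

```python
import math

# Assumed value of Sh in (0, 1) so that beta lies in (0, pi/2).
Sh = 0.5

# Closed-form starting value beta_0 from the rational approximation.
num = 48 * math.pi * Sh
den = (96 - math.pi**4) * Sh + math.pi * math.sqrt(
    (math.pi**6 - 768) * Sh**2 + 768 * Sh)
beta0 = num / den

# One Newton step on f(beta) = beta*cot(beta) + Sh - 1.
cot = 1.0 / math.tan(beta0)
f = beta0 * cot + Sh - 1.0
df = cot - beta0 * (1.0 + cot**2)   # d/db [b*cot(b)] = cot(b) - b*csc^2(b)
beta1 = beta0 - f / df

# Reference root by bisection of beta*cot(beta) - (1 - Sh) on (0, pi/2).
g = lambda b: b / math.tan(b) - (1.0 - Sh)
lo, hi = 1e-9, math.pi / 2 - 1e-9
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) <= 0:
        hi = mid
    else:
        lo = mid
root = 0.5 * (lo + hi)
print(beta0, beta1, root)
```

The single Newton step already brings $\beta_1$ within roughly $10^{-5}$ of the bisection root here.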

Using this $\beta_1$, compute $D_1$, since at this point the model is just $$y=1- A e^{-B t},$$ which is easy to linearize. However, I do not recommend doing so (except as a preliminary step to get estimates of $(A,B)$), since what is measured is $y_i$ and not $\log(1-y_i)$.
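A minimal sketch of the whole iteration in Python; the constants $h$, $R$, the true $D$ and the time grid are all assumptions used only to generate noise-free synthetic data, and the regression step is ordinary least squares on $\log(1-y)$ versus $t$:

```python
import math

# Assumed constants and a known D used only to fabricate test data.
h, R, D_true = 0.8, 1.0, 0.05

def beta_of(D):
    """First root of beta*cot(beta) = 1 - Sh on (0, pi), with Sh = h*R/D,
    found by bisection."""
    Sh = h * R / D
    g = lambda b: b / math.tan(b) - (1.0 - Sh)
    lo, hi = 1e-9, math.pi - 1e-9
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def model(D, t):
    Sh = h * R / D
    b = beta_of(D)
    A = 6 * Sh**2 / (b**2 * (b**2 + Sh**2 - Sh))
    return 1.0 - A * math.exp(-(b**2 / R**2) * D * t)

ts = [1.0, 2.0, 4.0, 8.0, 16.0]
ys = [model(D_true, t) for t in ts]

# Linearize: log(1 - y) = log A - B*t, so ordinary least squares on
# (t, log(1-y)) gives B.  With exact data this B never changes, so the
# fixed-point iteration only updates beta (hence D = B*R^2/beta^2).
zs = [math.log(1.0 - y) for y in ys]
n = len(ts)
tbar, zbar = sum(ts) / n, sum(zs) / n
B = -sum((t - tbar) * (z - zbar) for t, z in zip(ts, zs)) / \
    sum((t - tbar) ** 2 for t in ts)

D = 0.2  # guessed starting value D_0
for _ in range(50):
    D = B * R**2 / beta_of(D) ** 2

print(round(D, 4))  # converges back to D_true
```

With noisy data one would refit $B$ at each pass and finish with a nonlinear fit on $y$ itself, as the answer recommends.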

Answer 2 (score 3)

You may handle this problem by working on the unknown $\beta_n$ rather than $D$. We know that

$$D=\frac{hR}{1-\beta_n \cot(\beta_n)}$$ and $$Sh=1-\beta_n \cot(\beta_n).$$

So we rewrite

$$y = 1 -\left(\frac{6\left(1-\beta_n \cot(\beta_n)\right)^{2}}{\beta_{n}^{2}\left(\beta_{n}^{2} + \left(1-\beta_n \cot(\beta_n)\right)^{2} - \left(1-\beta_n \cot(\beta_n)\right)\right)}\right)e^{ -\frac{h\beta_{n}^{2}}{R\left(1-\beta_n \cot(\beta_n)\right)}t}.$$

Though this expression is a little monstrous and impossible to linearize, it sidesteps the implicit relation. You can now set up a least-squares minimization problem in the single unknown $\beta_n$.
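A sketch of that univariate least-squares fit; the values of $h$, $R$, the true $\beta$ and the time grid are assumptions used to build synthetic data, and a coarse grid search stands in for a proper univariate minimizer:

```python
import math

# Assumed constants and synthetic, noise-free data.
h, R = 0.8, 1.0
beta_true = 2.5

def y_of(beta, t):
    """Model written purely in terms of beta, with Sh = 1 - beta*cot(beta)."""
    Sh = 1.0 - beta / math.tan(beta)
    A = 6 * Sh**2 / (beta**2 * (beta**2 + Sh**2 - Sh))
    return 1.0 - A * math.exp(-h * beta**2 / (R * Sh) * t)

ts = [1.0, 2.0, 4.0, 8.0]
ys = [y_of(beta_true, t) for t in ts]

def rss(beta):
    """Residual sum of squares as a function of the single unknown beta."""
    return sum((y_of(beta, t) - y) ** 2 for t, y in zip(ts, ys))

# Coarse grid search over (0, pi); in practice use a 1-D minimizer.
grid = [0.05 + i * (math.pi - 0.1) / 2000 for i in range(2001)]
beta_hat = min(grid, key=rss)
D_hat = h * R / (1.0 - beta_hat / math.tan(beta_hat))   # D = hR/Sh
print(beta_hat, D_hat)
```

Once $\beta_n$ is recovered, $D$ follows directly from $D = hR/(1-\beta_n\cot\beta_n)$.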

Answer 3 (score 0)

Based on the $n$ data points $(t_i,y_i)$, for a given criterion there is a unique fitting function $$y=1-A\, e^{-B\, t}$$ whose parameters are easily obtained. So, at the solution, we have $$A=\frac{6Sh^{2}}{\beta^{2}\left(\beta^{2} + Sh^{2} - Sh\right)}\qquad \text{and} \qquad B=\frac{\beta^{2}}{R^{2}}D \implies \beta=R \sqrt{\frac B D}.$$ So we just need to solve for $D$ the equation $$R \sqrt{\frac B D} \cot\Bigg[R \sqrt{\frac B D} \Bigg]=1-\frac {h R}D,$$ that is to say, writing $x=R\sqrt{\frac BD}$ so that $D=\frac {B\,R^2}{x^2}$, $$\color{red}{x\cot(x)=1-k x^2}\qquad \text{with} \qquad k=\frac{h}{B\,R}.$$

We can approximate the solution by solving for $x$ the simpler equation $$k=\frac{(3-\pi^2)x+\pi^3 }{3\pi^2(\pi-x)}\implies \color{blue}{x_0=\frac{(3k-1)\pi^3 } {(3k-1)\pi^2+3 }}$$ As an indication of the quality of the approximation, $$\int_0^\pi \Bigg[\frac{1-x \cot (x)}{x^2}-\frac{(3-\pi^2)x+\pi^3 }{3\pi^2(\pi-x)} \Bigg]^2\,dx=0.00218$$

The table below gives the values of $x_0$, $x_1$ (the first iterate of Newton's method applied to the "red" equation) and the exact solution for a few values of $k$: $$\left( \begin{array}{cccc} k & x_0 & x_1 & \text{solution} \\ 0.5 & 1.95382 & 2.12250 & 2.08158 \\ 1.0 & 2.72712 & 2.74462 & 2.74371 \\ 1.5 & 2.89056 & 2.89677 & 2.89659 \\ 2.0 & 2.96155 & 2.96470 & 2.96464 \\ 2.5 & 3.00124 & 3.00314 & 3.00311 \\ 3.0 & 3.02660 & 3.02786 & 3.02784 \end{array} \right)$$
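These values are easy to reproduce; a small Python sketch (the function names are mine):

```python
import math

def x0_approx(k):
    """Closed-form starting value x0 (the 'blue' formula)."""
    return (3 * k - 1) * math.pi**3 / ((3 * k - 1) * math.pi**2 + 3)

def newton_step(x, k):
    """One Newton iterate on f(x) = x*cot(x) - 1 + k*x^2 (the 'red' equation)."""
    cot = 1.0 / math.tan(x)
    f = x * cot - 1.0 + k * x * x
    df = cot - x * (1.0 + cot * cot) + 2.0 * k * x
    return x - f / df

for k in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
    x0 = x0_approx(k)
    x1 = newton_step(x0, k)
    print(k, round(x0, 5), round(x1, 5))
```

For $k\gtrsim 1$ the single Newton step already agrees with the exact solution to about five decimal places.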

Edit

Admitting small errors in the data, select two points $(t_1,y_1)$ and $(t_2,y_2)$. Then $$k=\frac{\log \left(\frac{1-y_2}{1-y_1}\right)}{t_1-t_2}=\frac{\beta^2}{R^2}D$$ (note that this $k$ is an estimate of $B$, not the $k$ used above).

So, we have two equations $$\beta=R \sqrt{\frac k D}$$ $$\beta\cot(\beta)=1-\frac {h R}D$$ which lead to $$\beta=\cot ^{-1}\left(\frac{D-h R}{R\sqrt{D k}}\right)$$
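A small numeric sanity check of these relations; all parameter values are arbitrary assumptions, and note that carrying the algebra through gives $\cot\beta=(D-hR)/(R\sqrt{Dk})$, with $R$ in the denominator:

```python
import math

# Assumed model parameters for a synthetic check.
A, B, R, D = 0.9, 0.4, 1.0, 0.1
y = lambda t: 1.0 - A * math.exp(-B * t)

# Two-point estimate: k = log((1-y2)/(1-y1)) / (t1 - t2) recovers B.
t1, t2 = 1.0, 3.0
k = math.log((1 - y(t2)) / (1 - y(t1))) / (t1 - t2)

# beta = R*sqrt(k/D); back out h from beta*cot(beta) = 1 - hR/D,
# then verify that cot(beta) = (D - hR)/(R*sqrt(D*k)).
beta = R * math.sqrt(k / D)
h = D * (1.0 - beta / math.tan(beta)) / R
lhs = 1.0 / math.tan(beta)
rhs = (D - h * R) / (R * math.sqrt(D * k))
print(k, beta, abs(lhs - rhs))
```

With exact data the two-point $k$ equals $B$ and the two expressions for $\cot\beta$ agree to machine precision.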