Linearizing two variable function


I have another linearization question similar to the one here. This time, I have two variables in my equation, and I am in search of an "$A+B\rho$" or possibly "$A+B\rho+C\theta$" approximation. Here is my equation:

$$W = \frac{\theta}{2(1-\rho)}$$

where $\theta\in \mathbb{R}^+$ and $\rho\in[0,1)$, i.e., $0\leq \rho <1$.

I tried to come up with "$A+B\rho$", although I feel like the correct form of the linearization should be "$A+B\rho+C\theta$". I followed Leibovici's linear regression method with Taylor series.

I minimized the norm:

$$F = \int_a^b \left(A + B \rho - \frac{\theta}{2(1-\rho)}\right)^{2}\,d\rho$$

After integration, I came up with the following two equations:

$$\frac{\partial F}{\partial A} = - 2 A a + 2 A b - B a^{2} + B b^{2} - \theta \log\left(1 - a\right) + \theta \log\left(1 - b\right)$$
$$\frac{\partial F}{\partial B} = - A a^{2} + A b^{2} - \frac{2 B a^{3}}{3} + \frac{2 B b^{3}}{3} - a \theta + b \theta - \theta \log\left(1 - a\right) + \theta \log\left(1 - b\right)$$

Setting $a=0.0$ and $b=0.1$, I came up with the following approximation (which is still nonlinear because of the $\theta\rho$ product):

$W\approx 0.499055\theta + 0.554939\theta\rho$
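As a sanity check, these two coefficients can be compared against the exact expression over the fitting interval. A minimal numerical sketch (since both terms scale linearly with $\theta$, checking at $\theta=1$ suffices):

```python
import numpy as np

def W_exact(theta, rho):
    """Exact value W = theta / (2 (1 - rho))."""
    return theta / (2.0 * (1.0 - rho))

def W_fit(theta, rho):
    """Least-squares fit A(theta) + B(theta) * rho from above."""
    return 0.499055 * theta + 0.554939 * theta * rho

rho = np.linspace(0.0, 0.1, 101)  # the fitting interval [a, b] = [0, 0.1]
err = np.abs(W_exact(1.0, rho) - W_fit(1.0, rho))
print(f"max abs error on [0, 0.1]: {err.max():.6f}")
```

The maximum absolute error over $[0, 0.1]$ stays on the order of $10^{-3}$ (per unit of $\theta$), so the fit is tight on this narrow interval.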

I do not know if this makes life easier or not, but we have the following relationship between $\rho$ and $\theta$:

$$\rho = \sum_{j\in J} \frac{\lambda_j}{\mu_j}$$ and $$\theta = \sum_{j\in J} \frac{\lambda_j}{\mu_j^2}$$
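For concreteness, here is a minimal sketch (with made-up rates $\lambda_j$, $\mu_j$, purely illustrative) of how $\rho$, $\theta$, and then $W$ are computed from the same underlying rates, which shows that $\rho$ and $\theta$ are coupled rather than independent:

```python
# Hypothetical job classes: arrival rates lambda_j and service rates mu_j
lam = [0.02, 0.03, 0.01]
mu = [1.0, 1.5, 2.0]

rho = sum(l / m for l, m in zip(lam, mu))        # sum of lambda_j / mu_j, must stay < 1
theta = sum(l / m**2 for l, m in zip(lam, mu))   # sum of lambda_j / mu_j^2
W = theta / (2.0 * (1.0 - rho))
print(f"rho={rho:.4f}  theta={theta:.6f}  W={W:.6f}")
```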

Additionally, I am not really in search of a Newton/Newton-Raphson linearization, as I believe a tangent line built from a single-point approximation would not represent the curve satisfactorily in this case. Considering $\theta\in \mathbb{R}^+$, I do not think Newton-derived methods would help me.

Any recommendation is appreciated.

On BEST ANSWER

Since now you face a surface, we can consider that we need to minimize $$G = \int_a^b \int_c^d\left(A + B \rho+C \theta - \frac{\theta}{2(1-\rho)}\right)^{2}\,d\rho\,d\theta$$ with respect to $A,B,C$.

I shall not reproduce here the analytical expression of either $G$ or the partial derivatives $\frac{\partial G}{\partial A}$, $\frac{\partial G}{\partial B}$, $\frac{\partial G}{\partial C}$ (they are really messy), but the solutions are quite simple (I did not finish the simplifications).

$$4(a-b)^3 A=-3 (a+b) (c+d) \bigl((a+b-2) \log (1-a)-(a+b-2) \log (1-b)-2 a+2 b\bigr)$$
$$2(a-b)^3 B=3 (c+d) \bigl((a+b-2) \log (1-a)-(a+b-2) \log (1-b)-2 a+2 b\bigr)$$
$$2(a-b) C=\log (1-b)-\log (1-a)$$

Using $a=0$, $b=\frac 1{10}$, $c=\frac 9{10}$, $d=\frac {11}{10}$, this would lead to $$A=30-285 \log \left(\frac{10}{9}\right)\qquad B=300 \left(19 \log \left(\frac{10}{9}\right)-2\right)\qquad C=5 \log \left(\frac{10}{9}\right)$$ that is to say $$A\approx -0.027747 \qquad B \approx 0.554939 \qquad C \approx 0.526803$$
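These closed forms are easy to evaluate numerically. A short sketch (written with the real-valued differences $\log(1-a)-\log(1-b)$, which agree with the $\log(a-1)$ form because the imaginary parts cancel in the differences) reproduces the stated values:

```python
import math

a, b, c, d = 0.0, 0.1, 0.9, 1.1

# Common sub-expression shared by the A and B formulas:
# (a+b-2)(log(1-a) - log(1-b)) - 2a + 2b
L = (a + b - 2.0) * (math.log(1.0 - a) - math.log(1.0 - b)) - 2.0 * a + 2.0 * b

A = -3.0 * (a + b) * (c + d) * L / (4.0 * (a - b) ** 3)
B = 3.0 * (c + d) * L / (2.0 * (a - b) ** 3)
C = (math.log(1.0 - b) - math.log(1.0 - a)) / (2.0 * (a - b))
print(f"A={A:.6f}  B={B:.6f}  C={C:.6f}")
```

This prints values matching $A\approx -0.027747$, $B\approx 0.554939$, $C\approx 0.526803$.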

In order to check the validity of the results, I generated a data set of $\frac{\theta}{2(1-\rho)}$ with steps $\Delta \rho=\Delta \theta=\frac{1}{100}$ between the selected bounds.

A linear regression gave the following results $$\begin{array}{clclclclc} \text{} & \text{Estimate} & \text{Standard Error} & \text{Confidence Interval} \\ A & -0.027756 & 0.00130 & \{-0.030326,-0.025186\} \\ B & +0.555113 & 0.00248 & \{+0.550223,+0.560002\} \\ C & +0.526900 & 0.00130 & \{+0.524346,+0.529454\} \\ \end{array}$$ which seems to confirm the analytical values.
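The regression check can be reproduced with an ordinary least-squares fit on the same grid. A sketch, assuming the grid includes both endpoints (the exact sampling used above may differ slightly, so estimates can deviate in the last digits):

```python
import numpy as np

# Grid over the fitting box with steps of 1/100 in both variables
rho = np.arange(0.0, 0.1 + 1e-12, 0.01)
theta = np.arange(0.9, 1.1 + 1e-12, 0.01)
R, T = np.meshgrid(rho, theta)
w = T / (2.0 * (1.0 - R))

# Design matrix for the model W ~ A + B*rho + C*theta
X = np.column_stack([np.ones(R.size), R.ravel(), T.ravel()])
coef, *_ = np.linalg.lstsq(X, w.ravel(), rcond=None)
A_hat, B_hat, C_hat = coef
print(f"A={A_hat:.6f}  B={B_hat:.6f}  C={C_hat:.6f}")
```

The estimates land within about $10^{-3}$ of the analytical coefficients, consistent with the table above.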