Solving a system of equations


I have the following system of $n+2$ equations in $n+2$ variables:

\begin{equation*} k_0q_0+\lambda q_0 + c_0 = 0, \\ k_1q_1+\lambda q_1 + c_1 = 0, \\ \vdots \\ k_nq_n+\lambda q_n + c_n = 0, \\ q_1+q_2+\cdots+q_n = 1. \end{equation*}

The unknowns are $q_0,q_1,\dots,q_n$ and $\lambda$; the $k_i$ and $c_i$ are given constants.

How can I solve this?

There are 3 answers below.

BEST ANSWER

Each of the first $n+1$ equations gives $\displaystyle q_i = \frac{-c_i}{k_i + \lambda}$, assuming $k_i + \lambda \neq 0$.

Substituting into the last equation gives

$\displaystyle \sum_{i=1}^{n} \frac{-c_i}{k_i + \lambda} = 1$

which can be turned into a polynomial equation in $\lambda$ by multiplying through by $\prod_{i=1}^{n}(k_i + \lambda)$; the resulting polynomial can be solved by standard numerical methods.

Once you have a root of that polynomial, substituting its value for $\lambda$ in $\displaystyle q_i = \frac{-c_i}{k_i + \lambda}$ gives the remaining variables.
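A minimal sketch of this approach in Python, assuming NumPy; `solve_system` is a hypothetical helper name, and `k`, `c` hold the constants for $i = 1,\dots,n$ (once $\lambda$ is known, $q_0$ follows from its own equation the same way):

```python
import numpy as np

def solve_system(k, c):
    """Solve (k_i + lam) q_i + c_i = 0 (i = 1..n) together with sum(q) = 1.

    Clears denominators in sum_i -c_i/(k_i + lam) = 1 to obtain a
    polynomial in lam, then back-substitutes each real root.
    """
    k = np.asarray(k, dtype=float)
    c = np.asarray(c, dtype=float)
    n = len(k)
    # prod_j (k_j + lam) as coefficients (highest degree first);
    # its roots are lam = -k_j, so np.poly(-k) builds it directly.
    full = np.poly(-k)
    # lhs: sum_i -c_i * prod_{j != i} (k_j + lam), degree n-1
    lhs = np.zeros(n)
    for i in range(n):
        partial = np.poly(np.delete(-k, i))   # product over j != i
        lhs = np.polyadd(lhs, -c[i] * partial)
    # The equation sum_i -c_i/(k_i + lam) = 1 becomes lhs - full = 0.
    p = np.polysub(lhs, full)
    solutions = []
    for lam in np.roots(p):
        lam = complex(lam)
        if abs(lam.imag) > 1e-9:
            continue                          # keep real solutions only
        lam = lam.real
        if np.any(np.abs(k + lam) < 1e-12):
            continue                          # would divide by zero
        q = -c / (k + lam)
        solutions.append((lam, q))
    return solutions
```

Clearing denominators keeps the problem polynomial, so `np.roots` finds every candidate $\lambda$ at once; the filter skips roots where some $k_i + \lambda$ vanishes.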

ANSWER

It's not linear, first of all: $\lambda$ and the $q_i$ are both unknowns, and they are multiplied together. Luckily, your set of equations is polynomial (algebraic), so Gröbner basis methods may have a shot here.
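As an illustration (not part of the original answer), here is how SymPy's Gröbner basis machinery handles a small instance with $n = 2$ and arbitrary sample constants:

```python
import sympy as sp

# Illustrative instance with n = 2; the constants k_i, c_i below are
# arbitrary sample values, not from the question.
q1, q2, lam = sp.symbols('q1 q2 lam')
k1, k2, c1, c2 = 1, 2, -1, -2
eqs = [k1 * q1 + lam * q1 + c1,
       k2 * q2 + lam * q2 + c2,
       q1 + q2 - 1]

# A lexicographic Groebner basis with q1 > q2 > lam eliminates the q_i:
# the elimination property guarantees the basis contains a univariate
# polynomial in lam (the system has finitely many solutions).
G = sp.groebner(eqs, q1, q2, lam, order='lex')
univ = [g for g in G.exprs if g.free_symbols == {lam}]
print(univ)

# For a system this small, solve() also handles it directly.
sols = sp.solve(eqs, [q1, q2, lam], dict=True)
print(sols)
```

The univariate basis element plays the same role as the polynomial in the accepted answer: its roots are the admissible values of $\lambda$.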

ANSWER

Introduce an auxiliary variable $q_{n+1}$ and define $c_{n+1} = -1$. Observe that we can write the above system in the following $(n+2) \times (n + 2)$-matrix form: $K_{\lambda} \mathbf{q} = - \mathbf{c}$, where the upper $(n+1) \times (n+1)$-submatrix of $K_{\lambda}$ is $\operatorname{diag}(k_{i} + \lambda)$, the bottom row is ($0$ $1$ $1$ $\cdots$ $1$ $1$), and the rightmost column is ($0$ $0$ $\cdots$ $0$ $1$)$^{\top}$. The last row then reads $q_1 + \cdots + q_n + q_{n+1} = 1$, so the constraint $\sum_{i = 1}^{n} q_{i} = 1$ holds precisely when $q_{n+1} = 0$.

Since $\lambda$ is unknown, treat it as a parameter and presuppose that $K_{\lambda}$ is not singular (i.e., $\det K_{\lambda} \neq 0$); then solve the enlarged system by matrix inversion: $\mathbf{q} = - K_{\lambda}^{-1} \mathbf{c}$. The result is a vector $\mathbf{q}$ whose entries are rational functions of $\lambda$ and the given constants. The allowed values of $\lambda$ are the roots of $q_{n+1}(\lambda) = 0$; clearing denominators turns this into a polynomial, which can be solved numerically (or analytically).
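A sketch of this construction in SymPy for $n = 2$, with arbitrary sample constants; the sign convention $c_{n+1} = -1$ is chosen here so that the last row of $K_{\lambda} \mathbf{q} = -\mathbf{c}$ reads $q_1 + \cdots + q_n + q_{n+1} = 1$:

```python
import sympy as sp

# Enlarged-matrix approach for n = 2; k = (k0, k1, k2) and c = (c0, c1, c2)
# are arbitrary sample constants, not values from the question.
lam = sp.symbols('lam')
n = 2
k = [3, 1, 2]
c = [5, -1, -2]

K = sp.zeros(n + 2, n + 2)
for i in range(n + 1):
    K[i, i] = k[i] + lam          # diag(k_i + lam) block
for j in range(1, n + 1):
    K[n + 1, j] = 1               # bottom row picks out q_1 .. q_n
K[n + 1, n + 1] = 1               # rightmost column (0, ..., 0, 1)^T

cvec = sp.Matrix(c + [-1])        # enlarged constant vector, c_{n+1} = -1
qvec = sp.simplify(-K.inv() * cvec)   # q as rational functions of lam

# Admissible lam are the roots of q_{n+1}(lam) = 0; clearing the
# denominator reduces this to a polynomial equation.
lams = sp.solve(sp.numer(sp.together(qvec[n + 1])), lam)
print(lams)
```

Substituting any root back into `qvec` yields the corresponding $q_0, \dots, q_n$, with the auxiliary entry $q_{n+1}$ vanishing as required.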