Gaussian elimination to solve half of vector values while the other values are varied


I am trying to solve a linear system using Gaussian elimination in which half of the variables are free to vary, and I need to solve for the other half so that the system still holds.

For example, I have the system $W x = y$, where the matrix $W$ and the vector $y$ are both known. Half of the entries of the vector $x$ are varied, meaning they can take arbitrary values; what I need is to set the other half of the entries of $x$ so that the equality holds.

Example:

I have the system as following:

[image: the linear system $W x = y$ with unknowns $x_1$, $x_2$, $k_1$, $k_2$]

Suppose that the values of $x_1$ and $x_2$ are varied and can take any values, for example $1$ and $-1$, respectively. The above equation then becomes:

[image: the system with $x_1 = 1$ and $x_2 = -1$ substituted]

I need to solve for $k_1$ and $k_2$ so that the above equation holds. Normally, as far as I know, we can proceed as follows:

[image: the Gaussian elimination steps]

Which gives:

[image: the reduced system]

Finally, that can be written as:

[image: the final reduced form, containing an inconsistent row]

So I cannot solve for $k_1$ and $k_2$ in this case, and therefore cannot satisfy the system in the first equation.

My question: is it possible to solve for the variables $k_1$ and $k_2$ to make the equality hold? If not, is there another algorithm we can use instead of Gaussian elimination to solve it?
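One systematic way to answer the "is it possible" part: with $x_1$, $x_2$ fixed and moved to the right-hand side, the remaining system for $k_1$, $k_2$ has an exact solution if and only if appending the right-hand side to the coefficient matrix does not increase its rank (the Rouché–Capelli criterion). A minimal sketch in Python/NumPy, using a hypothetical $4\times 2$ block since the question's matrices appear only as images:

```python
import numpy as np

def has_exact_solution(W, r, tol=1e-10):
    """Rouche-Capelli test: W k = r has an exact solution iff
    appending r as an extra column does not raise the rank."""
    A = np.column_stack([W, r])
    return np.linalg.matrix_rank(A, tol=tol) == np.linalg.matrix_rank(W, tol=tol)

# Hypothetical 4x2 block W (the question's matrix is shown only as an
# image); r stands for the right-hand side after fixing x1, x2.
W = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
r_ok  = W @ np.array([2.0, 3.0])               # lies in the column space
r_bad = r_ok + np.array([0.0, 0.0, 0.0, 1.0])  # pushed out of it

print(has_exact_solution(W, r_ok))    # True  -> some k1, k2 work
print(has_exact_solution(W, r_bad))   # False -> no k1, k2 work
```

When the test fails, no choice of $k_1$, $k_2$ makes the equality exact, and one has to fall back on a least-squares fit.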

Another example

$W = \begin{bmatrix} 0.5+0.5i & 0.5-0.5i \\ 0.5-0.5i & 0.5+0.5i \end{bmatrix}$, multiplied by the vector $[0.7;\ k]$; the resulting vector should be $y = [1;\ -1]$. I think a value of $k$ that makes the multiplication correct should exist; is that right? And how can we find it mathematically?
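A quick numerical check of this example (a NumPy sketch, not part of the original post): solving each of the two rows separately for $k$ gives two different values, roughly $1+0.3i$ from the first row and $-1+1.7i$ from the second, so no exact $k$ exists here and the best one can do is a least-squares compromise:

```python
import numpy as np

# The matrix, fixed entry, and target from the example above.
W = np.array([[0.5 + 0.5j, 0.5 - 0.5j],
              [0.5 - 0.5j, 0.5 + 0.5j]])
y = np.array([1.0, -1.0])
x_fixed = 0.7

# Move the known column to the right: W[:,1] * k = y - W[:,0]*0.7,
# i.e. two equations in the single unknown k.
r = y - W[:, 0] * x_fixed

k_row1 = r[0] / W[0, 1]   # ~ 1 + 0.3i
k_row2 = r[1] / W[1, 1]   # ~ -1 + 1.7i  (different -> inconsistent)

# Least-squares compromise over both rows.
k_ls, *_ = np.linalg.lstsq(W[:, 1:], r, rcond=None)
print(k_row1, k_row2, k_ls[0])
```

So the multiplication cannot be made exactly correct for any single $k$; `lstsq` returns the $k$ minimising the residual over both rows.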


Best answer

$\def\m#1{ \left[\begin{array}{r}#1\end{array}\right] }$The full system is $$Aw=b$$ Without doing any math, notice that the columns of $A$ are all orthogonal to one another (i.e. it has full rank), while $b$ is equal to the ${\tt2}^{nd}$ column and therefore the unique solution vector for the full system is the ${\tt2}^{nd}$ basis vector, i.e. $$w = \m{0\\1\\0\\0}$$ Writing the system in partitioned form $$ \big[V\quad W\big]\,\m{x\\k} = b \qquad\implies\quad Wk = (b-Vx) $$ does not change the solution. But choosing $x\ne\m{0\\1}$ makes the system inconsistent.

The best one can do in such circumstances is a least-squares solution $$k = W^+(b-Vx) + (I-W^+W)y$$ where $W^+$ is the Moore-Penrose inverse and $y$ is an arbitrary vector.
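As an illustration of this formula (a sketch with a randomly generated full-column-rank $W$, since the answer's actual matrices are not reproduced here): `np.linalg.pinv` computes $W^+$, and when $W$ has full column rank, $W^+W = I$, so the $(I-W^+W)y$ term vanishes and the least-squares $k$ is unique:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical partitioned system [V  W][x; k] = b; a random 4x2 W
# has full column rank (with probability 1), so W^+ W = I.
V = rng.standard_normal((4, 2))
W = rng.standard_normal((4, 2))
b = rng.standard_normal(4)
x = np.array([1.0, -1.0])

W_pinv = np.linalg.pinv(W)      # Moore-Penrose inverse W^+
r = b - V @ x
k = W_pinv @ r                  # least-squares choice of k

# Full column rank => W^+ W = I, so the (I - W^+ W) y term drops out.
assert np.allclose(W_pinv @ W, np.eye(2))

# k minimises ||W k - r||: perturbing it cannot shrink the residual.
res = np.linalg.norm(W @ k - r)
for _ in range(100):
    dk = 0.01 * rng.standard_normal(2)
    assert np.linalg.norm(W @ (k + dk) - r) >= res - 1e-12
```

The residual $\|Wk - r\|$ is generally nonzero here, which is exactly the "inconsistent system" situation described above.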

For the given matrix, $W^+=\frac14 W^T$.