Make a square matrix singular through differentiation


I have the following $4 \times 4$ matrix

$$ M = \sum_{i=1}^n P_i^T \left(I_{3\times 3} - \frac{x_ix_i^T}{x_i^Tx_i}\right)P_i $$

where $P_i$ and $x_i$ are $3 \times 4$ and $3 \times 1$, respectively. The $P_i$'s are camera perspective projection matrices. My goal is to optimize the $x_i$'s locally to make matrix $M$ singular.

The first approach I thought of was to express $M$'s determinant and then compute its derivatives as a function of the $x_i$'s, but this doesn't seem to be analytically doable.
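For concreteness, here is a small numerical sketch (in Python/NumPy rather than Maple; the random $P_i$ and $x_i$ are placeholders, not real camera data) that builds $M$ and evaluates the determinant one would have to differentiate:

```python
import numpy as np

# Illustrative only: random P_i and x_i stand in for real camera data.
rng = np.random.default_rng(0)
n = 3
P = [rng.standard_normal((3, 4)) for _ in range(n)]
x = [rng.standard_normal(3) for _ in range(n)]

def build_M(xs):
    """M = sum_i P_i^T (I - x_i x_i^T / (x_i^T x_i)) P_i, a 4x4 symmetric matrix."""
    M = np.zeros((4, 4))
    for Pi, xi in zip(P, xs):
        pi = np.eye(3) - np.outer(xi, xi) / (xi @ xi)  # projector onto x_i^perp
        M += Pi.T @ pi @ Pi
    return M

M = build_M(x)
print(np.linalg.det(M))  # generically nonzero, so M is nonsingular
```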

Best Answer

Let $\pi_i = I_{3\times 3} - \frac{x_ix_i^T}{x_i^Tx_i}$; it is the orthogonal projection onto ${x_i}^{\perp}$, so $\pi_i$ has rank $2$, and $M_i=P_i^T\pi_iP_i$ has rank $2$ as well (because $P_i$ is surjective and $P_i^T$ is one-to-one). Finally, $M$ is the sum of $2$, $3$ or $4$ such matrices $M_i$ and, therefore, has rank $4$ (i.e., is nonsingular) in general position. Note that we may assume the $x_i$ have norm $1$.
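These rank claims are easy to check numerically (a Python/NumPy sketch with random stand-in data, not the author's code): each projector onto $x_i^\perp$ has rank $2$, each $M_i$ has rank $2$, and the sum has full rank $4$ in general position.

```python
import numpy as np

# Numerical check of the rank claims with random data (illustrative only).
rng = np.random.default_rng(1)
P = [rng.standard_normal((3, 4)) for _ in range(3)]
x = [v / np.linalg.norm(v) for v in rng.standard_normal((3, 3))]  # unit-norm x_i

Ms = []
for Pi, xi in zip(P, x):
    pi = np.eye(3) - np.outer(xi, xi)        # orthogonal projector onto x_i^perp
    assert np.linalg.matrix_rank(pi) == 2    # a projector onto a plane has rank 2
    Mi = Pi.T @ pi @ Pi
    assert np.linalg.matrix_rank(Mi) == 2    # P_i surjective, P_i^T injective
    Ms.append(Mi)

M = sum(Ms)
print(np.linalg.matrix_rank(M))  # 4 in general position
```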

With the $x_i$ normalized, write $f((x_i)_i)=\det\!\left(\sum_i P_i^T(I-x_ix_i^T)P_i\right)=\det(M)$.

The question you ask can be restated as follows: let $(h_i)_{i\leq n}$ be unknown $3\times 1$ vectors.

Problem. Find the minimum of $\sum_i\|h_i\|^2$ subject to the conditions

$\|x_i+h_i\|=1$ for $i\leq n$, and $f((x_i+h_i)_i)=0$.

The idea is to construct an iterative algorithm, replacing the function $f$ with its linearization $f((x_i)_i)+Df_{(x_i)_i}((h_i)_i)$. To do that, we use Maple's NLPSolve; of course, we must replace the constraint $f((x_i+h_i)_i)=0$ with $f((x_i+h_i)_i)=k\,f((x_i)_i)$ where $k\in(0,1)$; one must write a little procedure that chooses a "good" $k$ (begin, for example, with $k=0.1$ and, while NLPSolve returns a result, increase $k$). In this way we get closer and closer to a good solution; fewer than $10$ iterations suffice in general.
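The same iteration can be sketched in Python with SciPy's `minimize` (SLSQP) in place of Maple's NLPSolve. This is a hypothetical re-implementation under stated assumptions, not the author's code: random $P_i$ stand in for camera matrices, the gradient of $f$ is taken by finite differences, and each step minimizes $\sum_i\|h_i\|^2$ subject to the unit-norm constraints and the linearized determinant being reduced by a factor $k$.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical re-implementation of the linearized iteration (not the author's code).
rng = np.random.default_rng(2)
P = [rng.standard_normal((3, 4)) for _ in range(3)]               # stand-ins for the P_i
y = [v / np.linalg.norm(v) for v in rng.standard_normal((3, 3))]  # unit-norm x_i

def f(xs_flat):
    xs = xs_flat.reshape(3, 3)
    M = sum(Pi.T @ (np.eye(3) - np.outer(xi, xi)) @ Pi for Pi, xi in zip(P, xs))
    return np.linalg.det(M)

f_init = abs(f(np.concatenate(y)))
k = 0.5                                   # target reduction factor per step
for _ in range(15):
    y0 = np.concatenate(y)
    f0 = f(y0)
    # finite-difference gradient: linearization f0 + g.h of f around y0
    g = np.array([(f(y0 + 1e-6 * e) - f0) / 1e-6 for e in np.eye(9)])
    cons = [{'type': 'eq', 'fun': lambda h, f0=f0, g=g: f0 + g @ h - k * f0}]
    for i in range(3):                    # keep each x_i + h_i on the unit sphere
        cons.append({'type': 'eq', 'fun': lambda h, i=i, y0=y0:
                     np.linalg.norm(y0[3*i:3*i+3] + h[3*i:3*i+3]) - 1})
    res = minimize(lambda h: h @ h, np.zeros(9), constraints=cons)
    y = [y0[3*i:3*i+3] + res.x[3*i:3*i+3] for i in range(3)]

f_final = abs(f(np.concatenate(y)))
print(f_final / f_init)  # the determinant shrinks toward 0
```

As in the Maple version, a fixed $k$ is used here for simplicity; the adaptive choice of $k$ described above would wrap the solve in a retry loop.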

What follows (if you are interested) is the procedure I wrote in Maple for $n=3$; I didn't write the "do..od" loop explicitly.

restart;
with(LinearAlgebra);
P1 := RandomMatrix(3, 4); P2 := RandomMatrix(3, 4); P3 := RandomMatrix(3, 4);

 W := NULL;

# values at time 0 of y1, y2, y3 (they play the role of the x_i)

A := Vector(3, symbol = a); B := Vector(3, symbol = b); C := Vector(3, symbol = c); 

id := IdentityMatrix(3);

 y1 := RandomVector(3); y1 := evalf(y1/Norm(y1, 2)); y2 := RandomVector(3); y2 := evalf(y2/Norm(y2, 2)); y3 := RandomVector(3); y3 := evalf(y3/Norm(y3, 2));

# the beginning of the "do..od" loop

A := Vector(3, symbol = a); B := Vector(3, symbol = b); C := Vector(3, symbol = c); x1 := y1+A; x2 := y2+B; x3 := y3+C;

L := [a[1] = 0, a[2] = 0, a[3] = 0, b[1] = 0, b[2] = 0, b[3] = 0, c[1] = 0, c[2] = 0, c[3] = 0];

res1 := Transpose(P1) . (id-x1 . Transpose(x1)) . P1; res2 := Transpose(P2) . (id-x2 . Transpose(x2)) . P2; res3 := Transpose(P3) . (id-x3 . Transpose(x3)) . P3;

res := evalf(Determinant(res1+res2+res3));

Digits := 20;

 nu := evalf(10^(-13)*(eval(res, L)+(eval(diff(res, a[1]), L))*a[1]+(eval(diff(res, a[2]), L))*a[2]+(eval(diff(res, a[3]), L))*a[3]+(eval(diff(res, b[1]), L))*b[1]+(eval(diff(res, b[2]), L))*b[2]+(eval(diff(res, b[3]), L))*b[3]+(eval(diff(res, c[1]), L))*c[1]+(eval(diff(res, c[2]), L))*c[2]+(eval(diff(res, c[3]), L))*c[3]));

s := eval(nu, L); W := W, s;

with(Optimization);

rr := NLPSolve(a[1]^2+a[2]^2+a[3]^2+b[1]^2+b[2]^2+b[3]^2+c[1]^2+c[2]^2+c[3]^2, {nu = s-0.16e-3, Transpose(x1) . x1 = 1, Transpose(x2) . x2 = 1, Transpose(x3) . x3 = 1}, initialpoint = {a[1] = 0, a[2] = 0, a[3] = 0, b[1] = 0, b[2] = 0, b[3] = 0, c[1] = 0, c[2] = 0, c[3] = 0}, iterationlimit = 10000);

y1 := y1+eval(A, rr[2]); y2 := y2+eval(B, rr[2]); y3 := y3+eval(C, rr[2]);

Norm(y1, 2); Norm(y2, 2); Norm(y3, 2);

# the end of the "do..od" loop