rank of a matrix with two columns s.t. their dot product is zero


I have a function $\sigma(u,v)=(f(u,v),g(u,v),h(u,v))$ such that $\sigma_u \times \sigma_v \neq (0,0,0)$ (cross product).

Also, there is the $3\times 2$ matrix: $$ \begin{bmatrix} f_u & f_v \\ g_u & g_v \\ h_u & h_v \\ \end{bmatrix}$$

Also, $\sigma_u(0,v)$ and $\sigma_v(0,v)$ are perpendicular.

The matrix has rank $2$ (when $u=v=0$). Why? Could you please help?

($f_u$ means the derivative of $f$ with respect to $u$.)




If the matrix had rank $1$, its second column would be a scalar multiple of its first: $$f_v = a f_u; \,\,g_v = a g_u; \,\,h_v = a h_u.$$ Since $\sigma_u(0,v)$ and $\sigma_v(0,v)$ are perpendicular, taking the dot product gives $a(f_u^2+g_u^2+h_u^2) = 0$ at $(0,v)$. So either $a=0$, i.e., $f_v=g_v=h_v=0$ at $(0,v)$, or $f_u^2+g_u^2+h_u^2 = 0$ at $(0,v)$, i.e., $f_u=g_u=h_u=0$ at $(0,v)$. Neither of which, I assume, makes sense in the context of your problem, since either way one column vanishes and $\sigma_u \times \sigma_v = (0,0,0)$.
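The dichotomy above is easy to check numerically. The sketch below (using NumPy, with an arbitrarily chosen nonzero $\sigma_u$ and scalar $a$ for illustration) shows that under the rank-$1$ hypothesis $\sigma_v = a\,\sigma_u$, the dot product equals $a\,\|\sigma_u\|^2$, so it can only vanish if $a = 0$ or $\sigma_u = 0$:

```python
import numpy as np

sigma_u = np.array([1.0, 2.0, 2.0])   # arbitrary nonzero first column; ||sigma_u||^2 = 9
a = 0.5                               # rank-1 hypothesis: second column = a * first
sigma_v = a * sigma_u

# sigma_u . sigma_v = a * ||sigma_u||^2, which vanishes only if a = 0
# or sigma_u = 0 -- exactly the dichotomy in the answer above.
print(sigma_u @ sigma_v)              # 4.5 = 0.5 * 9, nonzero: columns are NOT perpendicular

# And the matrix built from these dependent columns indeed has rank 1:
M = np.column_stack([sigma_u, sigma_v])
print(np.linalg.matrix_rank(M))       # 1
```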


The matrix has rank $2$ at $(0, v)$ if $\sigma_v \ne 0 \ne \sigma_u$ there. For we have $\sigma_u \bot \sigma_v$ there, or, algebraically, $\sigma_u \cdot \sigma_v = 0$. If these two vectors were linearly dependent, then we would have

$a\sigma_u + b\sigma_v = 0 \tag{1}$

with $a \ne 0 \ne b$. Taking the dot product with $\sigma_u$ yields

$0 = a\, \sigma_u \cdot \sigma_u + b\, \sigma_v \cdot \sigma_u = a\, \sigma_u \cdot \sigma_u; \tag{2}$

since $\sigma_u \ne 0$ this forces $a = 0$; a similar argument shows $b = 0$. But this conclusion contradicts the hypothesis placed on $a, b$. Thus $\sigma_u, \sigma_v$ are linearly independent, and the matrix has rank $2$ at $(0, v)$. Away from $(0, v)$ what happens is anybody's guess, though by the continuity of $\sigma_u$ and $\sigma_v$ we can affirm linear independence in some neighborhood of the set of points $(0, v)$.
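A quick numerical sanity check of this answer's claim (a sketch using NumPy; the particular vectors are randomly generated, not from the original problem): build a nonzero $\sigma_u$, make $\sigma_v$ orthogonal to it by a Gram-Schmidt projection step, and confirm the resulting $3\times 2$ matrix has rank $2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pick a nonzero sigma_u, then build sigma_v perpendicular to it by
# projecting a random vector off sigma_u (one Gram-Schmidt step).
sigma_u = rng.standard_normal(3)
w = rng.standard_normal(3)
sigma_v = w - (w @ sigma_u) / (sigma_u @ sigma_u) * sigma_u

M = np.column_stack([sigma_u, sigma_v])   # the 3x2 matrix from the question
assert abs(sigma_u @ sigma_v) < 1e-12     # columns are perpendicular
print(np.linalg.matrix_rank(M))           # 2: nonzero perpendicular columns are independent
```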

Hope this helps. Cheers,

and as always,

Fiat Lux!!!