How to extract $(x+y+z)$ or $xyz$ from the determinant


Prove $$\color{blue}{ \Delta=\begin{vmatrix} (y+z)^2&xy&zx\\ xy&(x+z)^2&yz\\ xz&yz&(x+y)^2 \end{vmatrix}=2xyz(x+y+z)^3} $$ using elementary operations and the properties of determinants, without expanding.

My Attempt $$ \Delta\stackrel{C_1\rightarrow C_1+C_2+C_3}{=}\begin{vmatrix} xy+y^2+yz+zx+yz+z^2&xy&zx\\ x^2+xy+xz+xz+yz+z^2&(x+z)^2&yz\\ x^2+xy+xz+xy+y^2+yz&yz&(x+y)^2\\ \end{vmatrix}\\ =\begin{vmatrix} y(x+y+z)+z(x+y+z)&xy&zx\\ x(x+y+z)+z(x+y+z)&(x+z)^2&yz\\ x(x+y+z)+y(x+y+z)&yz&(x+y)^2\\ \end{vmatrix}\\ =(x+y+z)\begin{vmatrix} y+z&xy&zx\\ x+z&(x+z)^2&yz\\ x+y&yz&(x+y)^2\\ \end{vmatrix}\\ \stackrel{R_1\rightarrow R_1+R_2+R_3}{=} \\(x+y+z)\begin{vmatrix} 2(x+y+z)&x^2+xy+xz+xz+yz+z^2&x^2+xy+zx+xy+y^2+yz\\ x+z&(x+z)^2&yz\\ x+y&yz&(x+y)^2\\ \end{vmatrix}\\ =(x+y+z)^2\begin{vmatrix} 2&x+z&x+y\\ x+z&(x+z)^2&yz\\ x+y&yz&(x+y)^2\\ \end{vmatrix}\\ =2(x+y+z)^2\begin{vmatrix} 1&\frac{x+z}{2}&\frac{x+y}{2}\\ x+z&(x+z)^2&yz\\ x+y&yz&(x+y)^2 \end{vmatrix} \stackrel{R_2\rightarrow R_2-(x+z)R_1|R_3\rightarrow R_3-(x+y)R_1}{=} 2(x+y+z)^2\begin{vmatrix} 1&\frac{x+z}{2}&\frac{x+y}{2}\\ 0&\frac{(x+z)^2}{2}&\frac{yz-x^2-xy-xz}{2}\\ 0&\frac{yz-x^2-xy-xz}{2}&\frac{(x+y)^2}{2} \end{vmatrix} $$ How do I extract $x+y+z$ or $xyz$ from the determinant to find the solution?

Note: In a similar problem How to solve this determinant there seem to be solutions based on the factor theorem and polynomials. I am specifically looking to extract the factors $(x+y+z)$ or $xyz$ from the given determinant using only the basic properties of determinants.
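Before hunting for the factorization, it can help to confirm the target identity numerically. Below is a minimal sketch in plain Python (the helper names `det3` and `delta` are illustrative, not part of the problem), checking $\Delta=2xyz(x+y+z)^3$ at a few integer points:

```python
# Numerical sanity check of Delta = 2xyz(x+y+z)^3 at sample integer points.
# det3 and delta are illustrative helper names, not part of the question.

def det3(m):
    """Cofactor expansion of a 3x3 matrix along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def delta(x, y, z):
    """The determinant from the problem statement."""
    return det3([
        [(y + z) ** 2, x * y, z * x],
        [x * y, (x + z) ** 2, y * z],
        [x * z, y * z, (x + y) ** 2],
    ])

for (x, y, z) in [(1, 2, 3), (2, -1, 4), (5, 7, -3)]:
    assert delta(x, y, z) == 2 * x * y * z * (x + y + z) ** 3
```

For instance, at $(x,y,z)=(1,2,3)$ both sides equal $2\cdot6\cdot6^3=2592$.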

4


Accepted Answer

$$ \Delta=\begin{vmatrix} (y+z)^2&xy&zx\\ xy&(x+z)^2&yz\\ xz&yz&(x+y)^2 \end{vmatrix}=xyz\begin{vmatrix} \frac{(y+z)^2}{x}&x&x\\ y&\frac{(x+z)^2}{y}&y\\ z&z&\frac{(x+y)^2}{z} \end{vmatrix}= xyz\begin{vmatrix} \frac{(y+z)^2}{x}-x&x&0\\ y-\frac{(x+z)^2}{y}&\frac{(x+z)^2}{y}&y-\frac{(x+z)^2}{y}\\ 0&z&\frac{(x+y)^2}{z}-z \end{vmatrix}= xyz\begin{vmatrix} \frac{(y+z)^2-x^2}{x}&x&0\\ \frac{y^2-(x+z)^2}{y}&\frac{(x+z)^2}{y}&\frac{y^2-(x+z)^2}{y}\\ 0&z&\frac{(x+y)^2-z^2}{z} \end{vmatrix}= xyz\begin{vmatrix} \frac{[(y+z)-x][(y+z)+x]}{x}&x&0\\ \frac{[y-(x+z)][y+(x+z)]}{y}&\frac{(x+z)^2}{y}&\frac{[y-(x+z)][y+(x+z)]}{y}\\ 0&z&\frac{[(x+y)-z][(x+y)+z]}{z} \end{vmatrix}= xyz(x+y+z)^2\begin{vmatrix} \frac{(y+z)-x}{x}&x&0\\ \frac{y-(x+z)}{y}&\frac{(x+z)^2}{y}&\frac{y-(x+z)}{y}\\ 0&z&\frac{(x+y)-z}{z} \end{vmatrix}= (x+y+z)^2\begin{vmatrix} (y+z)-x&x^2&0\\ y-(x+z)&(x+z)^2&y-(x+z)\\ 0&z^2&(x+y)-z \end{vmatrix}= (x+y+z)^2\begin{vmatrix} (y+z)-x&x^2&0\\ -2z&2xz&-2x\\ 0&z^2&(x+y)-z \end{vmatrix}= (x+y+z)^2\begin{vmatrix} y+z&x^2&\frac{x^2}{z}\\ 0&2xz&0\\ \frac{z^2}{x}&z^2&x+y \end{vmatrix}\\ =(x+y+z)^2\cdot2xz\cdot\Big[(x+y)(y+z)-\frac{x^2z^2}{xz}\Big]\\ =(x+y+z)^2(2xz)(xy+xz+y^2+yz-xz)\\ =(x+y+z)^2(2xz)(xy+y^2+yz)\\ =(x+y+z)^2\cdot2xz\cdot y(x+y+z)=2xyz(x+y+z)^3 $$
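The last reduction above can be double-checked numerically: expanding the final $3\times3$ determinant along its middle row gives $2xz\big[(x+y)(y+z)-xz\big]=2xyz(x+y+z)$. A small sketch with exact rational arithmetic (`reduced` is an illustrative helper name):

```python
from fractions import Fraction as F

def det3(m):
    """Cofactor expansion of a 3x3 matrix along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def reduced(x, y, z):
    """The final 3x3 matrix of the derivation; entries may be fractions."""
    return det3([
        [y + z, x * x, F(x * x, z)],
        [0, 2 * x * z, 0],
        [F(z * z, x), z * z, x + y],
    ])

for (x, y, z) in [(1, 2, 3), (2, 5, 7), (3, -1, 4)]:
    # middle-row expansion: 2xz * ((x+y)(y+z) - xz) = 2xz * y * (x+y+z)
    assert reduced(x, y, z) == 2 * x * z * y * (x + y + z)
```

Together with the prefactor $(x+y+z)^2$, this confirms $\Delta=2xyz(x+y+z)^3$.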

Answer (score 13)

The determinant is a sum of products of three entries, each entry of degree $2$, so it must be a polynomial in $x,y,z$ of total degree $6$.

If you set $x=0$,

$$ \begin{vmatrix} (y+z)^2&0&0\\ 0&z^2&yz\\ 0&yz&y^2 \end{vmatrix}=0 $$

so that $x$ is a factor. And by symmetry, $xyz$ as well.

Then with $x=-y-z$, $$ \begin{vmatrix} (y+z)^2&xy&zx\\ xy&y^2&yz\\ xz&yz&z^2 \end{vmatrix}=0 $$ (rows $2$ and $3$ are $y\,(x,y,z)$ and $z\,(x,y,z)$, hence proportional), and $x+y+z$ is a factor.

To reach the sixth degree, two more linear factors are missing. By symmetry they must both be $x+y+z$: the remaining quadratic factor is symmetric in $x,y,z$, and a product of two linear forms such as $(x+y)(y+z)$ is not invariant under all permutations of the variables, while taking all three of $x+y$, $y+z$, $z+x$ would exceed the degree.

It remains to find the overall constant factor $\lambda$, for example by setting $x=y=z=1$:

$$ \begin{vmatrix} 4&1&1\\ 1&4&1\\ 1&1&4 \end{vmatrix}=\lambda\cdot1\cdot1\cdot1\cdot(1+1+1)^3, $$

i.e. $54=27\lambda$, so $\lambda=2$ and $\Delta=2xyz(x+y+z)^3$.
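The evaluation at $x=y=z=1$ is a one-liner to verify; a minimal check of the cofactor expansion:

```python
# Evaluate the determinant at x = y = z = 1 to pin down the constant factor.
m = [[4, 1, 1], [1, 4, 1], [1, 1, 4]]
d = (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
     - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
     + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
assert d == 54
# 2xyz(x+y+z)^3 at (1,1,1) is 2 * 27 = 54, so lambda = 2.
```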

Answer (score 2)

It is easy to calculate $$\begin{vmatrix} 1&\frac{x+z}{2}&\frac{x+y}{2}\\ 0&\frac{(x+z)^2}{2}&\frac{yz-x^2-xy-xz}{2}\\ 0&\frac{yz-x^2-xy-xz}{2}&\frac{(x+y)^2}{2} \end{vmatrix}$$ by expanding along the first column, which reduces it to a $2\times2$ minor. If you still want to extract $x+y+z$, you can use $$\begin{eqnarray} \begin{vmatrix} \frac{(x+z)^2}{2}&\frac{yz-x^2-xy-xz}{2}\\ \frac{yz-x^2-xy-xz}{2}&\frac{(x+y)^2}{2} \end{vmatrix}&=&\frac{(x+z)^2}{2}\begin{vmatrix} 1&\frac{yz-x^2-xy-xz}{(x+z)^2}\\ \frac{yz-x^2-xy-xz}{2}&\frac{(x+y)^2}{2} \end{vmatrix}\\ &=&\frac{(x+z)^2}{2}\begin{vmatrix} 1&\frac{yz-x^2-xy-xz}{(x+z)^2}\\ 0&\frac{(x+y)^2}{2}-\frac{(yz-x^2-xy-xz)^2}{2(x+z)^2} \end{vmatrix}\\ &=&\frac{(x+z)^2}{2}\begin{vmatrix} 1&\frac{yz-x^2-xy-xz}{(x+z)^2}\\ 0&\frac{(x+y)^2(x+z)^2-(yz-x^2-xy-xz)^2}{2(x+z)^2} \end{vmatrix}\\ &=&\frac{(x+z)^2}{2}\begin{vmatrix} 1&\frac{yz-x^2-xy-xz}{(x+z)^2}\\ 0&\frac{2xyz(x+y+z)}{(x+z)^2} \end{vmatrix}\\ &=&xyz(x+y+z)\begin{vmatrix} 1&\frac{yz-x^2-xy-xz}{(x+z)^2}\\ 0&1 \end{vmatrix}, \end{eqnarray}$$ since $(x+y)^2(x+z)^2-(yz-x^2-xy-xz)^2=\big[(x+y)(x+z)+yz-x(x+y+z)\big]\big[(x+y)(x+z)-yz+x(x+y+z)\big]=(2yz)\big(2x(x+y+z)\big)=4xyz(x+y+z)$. Together with the factor $2(x+y+z)^2$ already extracted, $\Delta=2(x+y+z)^2\cdot xyz(x+y+z)=2xyz(x+y+z)^3$.
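The difference-of-squares step is where $xyz(x+y+z)$ appears; it can be verified independently at integer points (the helper name `key_identity` is illustrative):

```python
# Check (x+y)^2 (x+z)^2 - (yz - x^2 - xy - xz)^2 == 4xyz(x+y+z).
# Note yz - x^2 - xy - xz = yz - x*(x+y+z), which makes the
# difference of squares factor as (2yz) * (2x(x+y+z)).

def key_identity(x, y, z):
    s = x + y + z
    lhs = (x + y) ** 2 * (x + z) ** 2 - (y * z - x * s) ** 2
    return lhs == 4 * x * y * z * s

for pt in [(1, 2, 3), (2, -1, 4), (5, 7, -3)]:
    assert key_identity(*pt)
```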

Answer (score 0)

Let $s=x+y+z$, then $$ \begin{align} &\det\begin{bmatrix} (y+z)^2&xy&zx\\ xy&(z+x)^2&yz\\ zx&yz&(x+y)^2 \end{bmatrix}\\[9pt] &=\det\begin{bmatrix} s(y+z)&xy&zx\\ s(z+x)&(z+x)^2&yz\\ s(x+y)&yz&(x+y)^2 \end{bmatrix}\tag1\\[9pt] &=\det\begin{bmatrix} 2s^2&s(z+x)&s(x+y)\\ s(z+x)&(z+x)^2&yz\\ s(x+y)&yz&(x+y)^2 \end{bmatrix}\tag2\\[9pt] &=s^2\det\begin{bmatrix} 2&z+x&x+y\\ z+x&(z+x)^2&yz\\ x+y&yz&(x+y)^2 \end{bmatrix}\tag3\\[9pt] &=s^2\det\begin{bmatrix} 2&-(z+x)&-(x+y)\\ z+x&0&yz-(x+y)(z+x)\\ x+y&yz-(x+y)(z+x)&0 \end{bmatrix}\tag4\\[9pt] &=s^2\det\begin{bmatrix} 2&-(z+x)&-(x+y)\\ z+x&0&-sx\\ x+y&-sx&0 \end{bmatrix}\tag5\\[9pt] &=s^3\det\begin{bmatrix} 2s&z+x&x+y\\ z+x&0&x\\ x+y&x&0 \end{bmatrix}\tag6\\[9pt] &=s^3\det\begin{bmatrix} y+z&z+x&x+y\\ z&0&x\\ y&x&0 \end{bmatrix}\tag7\\[9pt] &=s^3\det\begin{bmatrix} 0&z&y\\ z&0&x\\ y&x&0 \end{bmatrix}\tag8\\[18pt] &=2xyz(x+y+z)^3\tag9 \end{align} $$ Explanation:
$(1)$: add columns $2$ and $3$ to column $1$
$(2)$: add rows $2$ and $3$ to row $1$
$(3)$: factor $s$ out of row $1$ and then out of column $1$
$(4)$: subtract $z+x$ times column $1$ from column $2$
$\phantom{(4)\text{:}}$ subtract $x+y$ times column $1$ from column $3$
$(5)$: $yz-(x+y)(z+x)=-sx$
$(6)$: factor $-s$ out of columns $2$ and $3$
$\phantom{(6)\text{:}}$ distribute one factor of $s$ over row $1$
$(7)$: subtract columns $2$ and $3$ from column $1$
$(8)$: subtract rows $2$ and $3$ from row $1$
$(9)$: the determinant is now simple to compute
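The final determinant in step $(9)$ expands along the first row to $-z(0-xy)+y(zx-0)=2xyz$; a quick numeric confirmation:

```python
# The matrix from step (8): [[0, z, y], [z, 0, x], [y, x, 0]].
# First-row expansion gives xyz + xyz = 2xyz.
for (x, y, z) in [(1, 2, 3), (2, -1, 5), (4, 4, 4)]:
    d = (0 * (0 * 0 - x * x)
         - z * (z * 0 - x * y)
         + y * (z * x - 0 * y))
    assert d == 2 * x * y * z
```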