Can we get a tensor from the sum of a vector's gradient and its transpose?


I don't know much about tensor calculus and here is something I'm trying to figure out.

$$T=\mu({\nabla}\vec{V}+{\nabla}\vec{V}^T)$$

T is viscous stress tensor and $\vec{V}$ is the velocity vector. How do we get a tensor of rank 2 by adding a vector (gradient of velocity) to its transpose? My best guess is that the gradient of a vector is a 2nd rank tensor (though we are told in engineering schools that the gradient is only defined for scalar fields). Am I right? Am I missing something here?

Thanks.


If $\mathbf{v}=(v_1,v_2,v_3)$ is a vector field, its gradient is the matrix:

$$ \nabla \mathbf{v}= \begin{bmatrix} \frac{\partial v_1}{\partial x_1}&\frac{\partial v_1}{\partial x_2}&\frac{\partial v_1}{\partial x_3}\\ \frac{\partial v_2}{\partial x_1}&\frac{\partial v_2}{\partial x_2}&\frac{\partial v_2}{\partial x_3}\\ \frac{\partial v_3}{\partial x_1}&\frac{\partial v_3}{\partial x_2}&\frac{\partial v_3}{\partial x_3}\\ \end{bmatrix} $$ which is the so-called Jacobian matrix.
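As a concrete check (a sketch added here, not part of the original answer), one can build this matrix numerically for a simple made-up field: approximate the Jacobian by central finite differences and compare with the analytic partial derivatives.

```python
import numpy as np

def v(x):
    # hypothetical example field v = (x1*x2, x2^2, x1*x3)
    x1, x2, x3 = x
    return np.array([x1 * x2, x2**2, x1 * x3])

def jacobian(f, x, h=1e-6):
    # central finite differences: J[i, j] = d f_i / d x_j,
    # matching the matrix above (row i = gradient of component i)
    J = np.zeros((3, 3))
    for j in range(3):
        e = np.zeros(3)
        e[j] = h
        J[:, j] = (f(x + e) - f(x - e)) / (2 * h)
    return J

x0 = np.array([1.0, 2.0, 3.0])
J = jacobian(v, x0)
J_exact = np.array([[2.0, 1.0, 0.0],   # [x2,   x1, 0 ]
                    [0.0, 4.0, 0.0],   # [0,  2*x2, 0 ]
                    [3.0, 0.0, 1.0]])  # [x3,   0,  x1]
print(np.allclose(J, J_exact, atol=1e-5))  # True
```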

Note that the elements in the $i$-th row of this matrix are the components of the gradient of the $i$-th component of the vector $\mathbf{v}$.

This matrix contains all the information about the first-order changes of the vector $\mathbf{v}$ in any direction, and it transforms tensorially under linear changes of coordinates. Note, however, that it does not transform tensorially under curvilinear changes of coordinates, so strictly speaking it is not a tensor in that general setting.
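The linear case can be verified numerically (a minimal sketch, with a made-up field and an arbitrary invertible matrix): under $x' = A x$, with components transforming as $v' = A v$, the Jacobian obeys $J' = A\, J\, A^{-1}$.

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])  # arbitrary invertible matrix (assumption)

def v(x):
    # hypothetical example field v = (x1*x2, x2^2, x1*x3)
    x1, x2, x3 = x
    return np.array([x1 * x2, x2**2, x1 * x3])

def jacobian(f, x, h=1e-6):
    # central finite differences: J[i, j] = d f_i / d x_j
    J = np.zeros((3, 3))
    for j in range(3):
        e = np.zeros(3)
        e[j] = h
        J[:, j] = (f(x + e) - f(x - e)) / (2 * h)
    return J

def v_prime(xp):
    # the same field expressed in the primed coordinates x' = A x
    return A @ v(np.linalg.solve(A, xp))

x0 = np.array([1.0, 2.0, 3.0])
J = jacobian(v, x0)
Jp = jacobian(v_prime, A @ x0)
print(np.allclose(Jp, A @ J @ np.linalg.inv(A), atol=1e-4))  # True
```

Under a curvilinear change of coordinates, $A$ would vary from point to point and an extra term involving its derivatives would appear, which is exactly what the covariant derivative is designed to absorb.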

For a simple introduction to tensors and the covariant derivative, you can see here.


To take it a bit further than Emilio and to answer your question: $\vec{\nabla}\vec{v}$ and its transpose are already rank-2 tensors. Scalars are rank 0, vectors are rank 1, and each additional index increases the rank by one.

The result of the addition of two rank 2 tensors is still a rank 2 tensor: $$\vec{\nabla}\vec{v} + \left(\vec{\nabla}\vec{v}\right)^T = \left[\begin{array}{ccc} \partial_{x}u & \partial_{y}u & \partial_{z}u\\ \partial_{x}v & \partial_{y}v & \partial_{z}v\\ \partial_{x}w & \partial_{y}w & \partial_{z}w \end{array}\right] + \left[\begin{array}{ccc} \partial_{x}u & \partial_{x}v & \partial_{x}w\\ \partial_{y}u & \partial_{y}v & \partial_{y}w\\ \partial_{z}u & \partial_{z}v & \partial_{z}w \end{array}\right] = \left[\begin{array}{ccc} 2\partial_{x}u & \partial_{y}u+\partial_{x}v & \partial_{z}u+\partial_{x}w\\ \partial_{x}v+\partial_{y}u & 2\partial_{y}v & \partial_{z}v+\partial_{y}w\\ \partial_{x}w+\partial_{z}u & \partial_{y}w+\partial_{z}v & 2\partial_{z}w \end{array}\right] $$

Clearly the number of indices is preserved, so the result is again a rank-2 tensor (and, in fact, a symmetric one).
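As a quick numerical illustration (with made-up entries standing in for the velocity-gradient components): adding a $3\times 3$ array to its transpose yields another $3\times 3$ array, still carrying two indices, and the result is symmetric.

```python
import numpy as np

grad_v = np.array([[1.0, 2.0, 3.0],   # stand-in values for the
                   [4.0, 5.0, 6.0],   # velocity-gradient components
                   [7.0, 8.0, 9.0]])
S = grad_v + grad_v.T
print(S.shape)              # (3, 3): still two indices, rank 2
print(np.allclose(S, S.T))  # True: the sum is symmetric
```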