Let $a_{ij}=a_ia_j$, $1\leq i,j\leq n$, where $a_1,\dots,a_n$ are real numbers, and let $A=(a_{ij})$ be the $n \times n$ matrix. Then:

1. It is possible to choose $a_1,\dots,a_n$ so as to make $A$ non-singular.
2. The matrix $A$ is positive definite if $(a_1,\dots,a_n)$ is a nonzero vector.
3. The matrix $A$ is positive definite for all $(a_1,\dots,a_n)$.
4. For all $(a_1,\dots,a_n)$, zero is an eigenvalue of $A$.
My attempt:
Option 1 is false: for example, if we choose $a_1=\frac{1}{\sqrt{2}}$, $a_2=\sqrt{2}$, then the $2\times 2$ matrix $A$ is singular; in fact every row of $A$ is a multiple of $(a_1,\dots,a_n)$, so $A$ is singular for every choice when $n\geq 2$.
Option 2 is also false, because if we take $a_1=1$, $a_2=2$, then the $2\times2$ matrix $A$ is not positive definite. Option 3 is then false as well. But I am not able to settle option 4. Could you please help me? Thanks in advance.
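The two $2\times 2$ counterexamples above can be spot-checked numerically. A minimal sketch in plain Python (the helper names `build_A` and `det2` are just for this illustration):

```python
import math

# Build the matrix A with entries a_i * a_j from a list of numbers.
def build_A(a):
    return [[ai * aj for aj in a] for ai in a]

# Determinant of a 2x2 matrix given as nested lists.
def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Option 1's example: a = (1/sqrt(2), sqrt(2)) gives det(A) = 0 (up to rounding).
A1 = build_A([1 / math.sqrt(2), math.sqrt(2)])
print(det2(A1))  # approximately 0 -> singular

# Option 2's example: a = (1, 2). The vector x = (2, -1) gives x^T A x = 0,
# so A is not positive definite.
A2 = build_A([1, 2])
x = [2, -1]
quad = sum(x[i] * A2[i][j] * x[j] for i in range(2) for j in range(2))
print(quad)  # 0 -> not positive definite
```

The vector $x=(2,-1)$ was chosen perpendicular to $(1,2)$, which is exactly why the quadratic form vanishes.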
With these sorts of questions, it's good to write the matrix in a way that helps us think geometrically. So let us define $$\mathbf M = \begin{bmatrix} a_1 a_1 & \dots & a_1 a_n \\ \vdots & \ddots & \vdots \\ a_n a_1 & \dots & a_n a_n \end{bmatrix}, \ \ \ \ \ \mathbf a=\begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix}. $$ I hope you can show that $$ \mathbf M = \mathbf a \mathbf a^{\rm T},$$ where $^{\rm T}$ denotes the transpose.
Therefore, if $\mathbf x \in \mathbb R^n$ is any vector, we have $$ \mathbf M \mathbf x = \mathbf a \mathbf a^{\rm T} \mathbf x = (\mathbf x \cdot \mathbf a) \mathbf a,$$ where $\cdot$ denotes the dot product. I encourage you to check this too.
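The identity $\mathbf M \mathbf x = (\mathbf x \cdot \mathbf a)\,\mathbf a$ is easy to verify numerically as well; here is a small sketch in plain Python (the sample vectors are arbitrary, and the values are chosen so the arithmetic is exact in floating point):

```python
# Verify M x = (x . a) a for the rank-one matrix M = a a^T, using plain lists.

def outer(a):
    """M = a a^T as a nested list."""
    return [[ai * aj for aj in a] for ai in a]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

a = [1.0, 2.0, 3.0]
x = [4.0, -1.0, 0.5]

M = outer(a)
lhs = matvec(M, x)                   # M x
rhs = [dot(x, a) * ai for ai in a]   # (x . a) a
print(lhs == rhs)  # True
```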
The advantage of writing the matrix in this way is that taking dot products has some geometric meaning, and many of us have better intuition about shapes than about algebra.
Using your geometric understanding of dot products, it should be possible for you to verify that:
$|\mathbf a |^2$ is an eigenvalue of $\mathbf M$ (assuming $\mathbf a \neq \mathbf 0$). The corresponding eigenspace is the one-dimensional space of vectors $\mathbf x$ that are parallel to $\mathbf a$.
$0$ is an eigenvalue of $\mathbf M$. The corresponding eigenspace is the $(n-1)$-dimensional space of vectors $\mathbf x$ that are perpendicular to $\mathbf a$.
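Both eigenvalue claims can be spot-checked numerically. A minimal sketch in plain Python, with a sample vector $\mathbf a$ chosen so the arithmetic is exact:

```python
# Check that a is an eigenvector of M = a a^T with eigenvalue |a|^2,
# and that a vector perpendicular to a is sent to zero.

def outer(a):
    return [[ai * aj for aj in a] for ai in a]

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

a = [1.0, 2.0, 2.0]
M = outer(a)
norm_sq = sum(ai * ai for ai in a)  # |a|^2 = 9

# M a should equal |a|^2 * a.
print(matvec(M, a) == [norm_sq * ai for ai in a])  # True

# x = (2, -1, 0) is perpendicular to a (their dot product is 0), so M x = 0.
x = [2.0, -1.0, 0.0]
print(matvec(M, x) == [0.0, 0.0, 0.0])  # True
```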
I particularly encourage you to think about the case where $\mathbf a$ is a unit vector: here, you should be able to convince yourself that $\mathbf M = \mathbf a \mathbf a^{\rm T}$ represents the orthogonal projection onto the line spanned by $\mathbf a$. You can think about what the eigenvalues and eigenvectors mean geometrically in this case.
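One concrete signature of a projection is that applying it twice is the same as applying it once: projecting an already-projected vector changes nothing, i.e. $\mathbf M^2 = \mathbf M$ when $\mathbf a$ is a unit vector. A quick sketch, using `fractions.Fraction` so the check is exact (the unit vector $(3/5, 4/5)$ is chosen for that reason):

```python
from fractions import Fraction

# M = a a^T for the unit vector a = (3/5, 4/5); exact arithmetic via Fraction.
a = [Fraction(3, 5), Fraction(4, 5)]
M = [[ai * aj for aj in a] for ai in a]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# A projection applied twice equals itself: M M == M.
print(matmul(M, M) == M)  # True

# Equivalently: projecting a vector, then projecting again, changes nothing.
x = [Fraction(7), Fraction(-2)]
Mx = [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]
MMx = [sum(M[i][j] * Mx[j] for j in range(2)) for i in range(2)]
print(MMx == Mx)  # True
```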