Here is the statement:
Consider the inner product space $(\mathbb{R}^n, \left \langle \cdot ,\cdot \right \rangle)$ over $\mathbb{R}$, and let $A \in M_{n\times n}(\mathbb{R})$. Let's define $$\begin{matrix} a_A\colon&\mathbb{R}^n \times \mathbb{R}^n & \rightarrow & \mathbb{R} \\ &(x,y) & \mapsto & x^{T}Ay \end{matrix}$$
The goal is to show that if $x^T A x > 0$ for all $x \in \mathbb{R}^n \setminus \{0\}$, then $a_A$ is bilinear and symmetric, and furthermore there is an $\alpha>0$ such that for all $x\in \mathbb{R}^n$ we have $$\alpha a_{I}(x,x)\equiv \alpha\left \langle x,x \right \rangle \equiv \alpha\left \| x\right \|^2\leq a_A(x,x).$$
Proof: to keep this short, I have already shown that $a_A$ is bilinear and symmetric. For the other part, here is roughly my idea:
Let's define $$\alpha := \min\left \{ a_{kk} \mid k=1,\dots,n\right \};$$ note that
$$\alpha \left \langle x,x \right \rangle \equiv \alpha a_{I}(x,x)=\alpha (x^T I x)=\alpha \sum_{k=1}^{n}x_{k}^{2}= \sum_{k=1}^{n}\alpha x_{k}^{2}\leq \sum_{k=1}^{n}a_{kk} x_{k}^{2}+ \text{(remaining cross terms)}= \sum_{i=1}^{n}\sum_{j=1}^{n}x_{i}a_{ij}x_{j}=x^T Ax=a_A(x,x)$$
My question is the following: I have not yet been able to prove that $\alpha$ is positive. I have tried, but I do not know how to proceed; I would be very grateful for a hint on how to continue.
First, you don't appear to assume that $A$ is symmetric, but you need this to prove that $a_A$ is symmetric, so I will assume that you forgot to write it.
The condition $\forall x\in\mathbb{R}^n\setminus\{0\} : x^tAx > 0$ is known as the matrix $A$ being positive definite. Positive definite (real symmetric) matrices have a Cholesky decomposition $A = LL^t$, where $L$ is a (real) lower-triangular matrix with positive diagonal entries. You can then write that $$a_A(x,x) = x^tAx = x^t(LL^t)x = \lVert L^tx\rVert_2^2 = a_I(L^tx, L^tx).$$
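As a quick numerical sanity check (a sketch using `numpy`; the matrix below is an arbitrary positive definite example, not part of the problem), one can verify that $x^tAx = \lVert L^tx\rVert_2^2$ for the Cholesky factor $L$:

```python
import numpy as np

# Build an arbitrary symmetric positive definite matrix A = B^T B + I
# (illustrative data only).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)

# numpy returns the lower-triangular Cholesky factor L with A = L L^T.
L = np.linalg.cholesky(A)

x = rng.standard_normal(4)
quad_form = x @ A @ x                      # x^T A x
norm_sq = np.linalg.norm(L.T @ x) ** 2     # ||L^T x||^2

assert np.isclose(quad_form, norm_sq)
```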
Let $y := L^tx$, so $x = L^{-t}y$ (note that $L^t$ is invertible, as its diagonal entries are positive). We then have that $$a_I(x, x) = \lVert x\rVert_2^2 = \lVert L^{-t}y\rVert_2^2 \leq \lVert L^{-t}\rVert_{op}^2\,\lVert y\rVert_2^2 = \lVert L^{-t}\rVert_{op}^2\, a_A(x,x).$$
Here, we use that the operator norm satisfies $\lVert Mv\rVert_2 \leq \lVert M\rVert_{op}\lVert v\rVert_2$ for every matrix $M$ and vector $v$.
Therefore, dividing through by $\lVert L^{-t}\rVert_{op}^{2}$, we get the desired inequality with the constant $\alpha = \lVert L^{-t}\rVert_{op}^{-2}$, which is positive.
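Again as a numerical sketch (same hypothetical positive definite matrix construction as above), the constant $\alpha = \lVert L^{-t}\rVert_{op}^{-2}$ indeed satisfies $\alpha\lVert x\rVert_2^2 \leq x^tAx$ on random test vectors:

```python
import numpy as np

# Arbitrary symmetric positive definite matrix (illustrative data only).
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)

L = np.linalg.cholesky(A)        # A = L L^T, L lower triangular
L_inv_t = np.linalg.inv(L.T)

# Operator (spectral) norm = largest singular value; ord=2 gives it.
alpha = np.linalg.norm(L_inv_t, ord=2) ** (-2)

# Check alpha * ||x||^2 <= x^T A x on many random vectors.
for _ in range(1000):
    x = rng.standard_normal(4)
    assert alpha * (x @ x) <= x @ A @ x + 1e-12
```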