I have an $n \times n$ symmetric Toeplitz matrix $A_n$ defined as follows:
$$ A_1 = \begin{bmatrix} a_0 \end{bmatrix}, \quad A_2 = \begin{bmatrix} a_0 & a_1 \\ a_1 & a_0 \end{bmatrix}, \quad \dots, \quad A_{n+1} = \begin{bmatrix} a_0 & a_1 & a_2 & \cdots & a_n \\ a_1 & a_0 & a_1 & \cdots & a_{n-1} \\ a_2 & a_1 & a_0 & \cdots & a_{n-2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ a_n & a_{n-1} & a_{n-2} & \cdots & a_0 \end{bmatrix} $$
For every $n$, $A_n$ is symmetric and positive definite. I would like to show that the following inequality holds:
$$ \det A_{n+1} \le \frac{(\det A_n)^2}{\det A_{n-1}} $$
Since each $A_n$ is positive definite, all of its leading principal minors are positive (Sylvester's criterion), so in particular every $\det A_n > 0$; likewise, all eigenvalues of $A_n$ are positive. Note that, because the determinants are positive, the inequality is equivalent to saying that the ratio $\det A_{n+1} / \det A_n$ is nonincreasing in $n$. Yet I have no idea how to use these properties to prove the inequality. Where should I start?
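As a sanity check (not a proof), here is a small numerical experiment. The sequence $a_k = 0.5^k + 0.3^k$ is my own choice of example, picked because each geometric sequence $r^{|k|}$ with $|r|<1$ generates a positive definite Toeplitz matrix, and a sum of such sequences does too:

```python
# Numerical sanity check of det A_{n+1} * det A_{n-1} <= (det A_n)^2
# for nested symmetric Toeplitz matrices built from a sample sequence a_k.
# The sequence below is an assumption chosen to guarantee positive definiteness.
import numpy as np

N = 6
a = [0.5**k + 0.3**k for k in range(N)]

def toeplitz_det(n):
    """Determinant of the n-by-n symmetric Toeplitz matrix A_n with entries a_|i-j|."""
    A = np.array([[a[abs(i - j)] for j in range(n)] for i in range(n)])
    return np.linalg.det(A)

dets = [toeplitz_det(n) for n in range(1, N + 1)]  # det A_1, ..., det A_N
for d_prev, d_mid, d_next in zip(dets, dets[1:], dets[2:]):
    # claimed log-concavity: det A_{n+1} * det A_{n-1} <= (det A_n)^2
    assert d_next * d_prev <= d_mid**2 + 1e-12
print("inequality holds for this sample sequence")
```

The inequality holds (in fact with equality when $a_k = r^k$ for a single $r$, e.g. an AR(1) autocovariance), which suggests the Toeplitz structure is essential rather than positive definiteness alone.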