I need to prove the following by induction, for every $n \geq 1$: $$D(a_1,...,a_n) = \left| \begin{array}{ccccc} a_1+x& a_2 & a_3 & \cdots & a_n \\ a_1& a_2+x & a_3 & \cdots & a_n \\ a_1& a_2 & a_3+x & \cdots & a_n \\ \vdots & \vdots & \vdots & & \vdots \\ a_1& a_2 & a_3 & \cdots & a_n + x \end{array} \right| = x^n + (a_1 + \cdots + a_n)x^{n-1}$$
I played with it a bit but couldn't find a way to prove it.
This is what I did: I assumed the formula holds for $n$ and tried to prove it for $n+1$:
$$D(a_1, \ldots , a_n, a_{n+1}) = \left| \begin{array}{ccccc} a_1+x& a_2 & a_3 & \cdots & a_{n+1} \\ a_1& a_2+x & a_3 & \cdots & a_{n+1} \\ a_1& a_2 & a_3+x & \cdots & a_{n+1} \\ \vdots & \vdots & \vdots & & \vdots \\ a_1& a_2 & a_3 & \cdots & a_{n+1} + x \end{array} \right| $$
and I performed the row operation $R_{n+1} \to R_{n+1} - R_1$ on the determinant (which doesn't change its value) and got:
$$ \left| \begin{array}{ccccc} a_1+x& a_2 & a_3 & \cdots & a_{n+1} \\ a_1& a_2+x & a_3 & \cdots & a_{n+1} \\ a_1& a_2 & a_3+x & \cdots & a_{n+1} \\ \vdots & \vdots & \vdots & & \vdots \\ -x& 0 & \cdots & 0 & x \end{array} \right| $$
And I wasn't sure how to proceed from here, or whether I'm even on the right path.
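As a sanity check (not a proof, and assuming SymPy is available), the claimed identity can be verified symbolically for small $n$:

```python
# Quick sanity check of the claim D(a_1,...,a_n) = x^n + (a_1+...+a_n)*x^(n-1).
# This is not a proof -- it only verifies the identity symbolically for small n.
import sympy as sp

x = sp.symbols('x')

def check(n):
    a = sp.symbols(f'a1:{n + 1}')  # symbols a1, ..., an
    # a_j down column j, with x added on the diagonal
    M = sp.Matrix(n, n, lambda i, j: a[j] + (x if i == j else 0))
    lhs = sp.expand(M.det())
    rhs = sp.expand(x**n + sum(a) * x**(n - 1))
    return lhs == rhs

print(all(check(n) for n in range(1, 6)))  # True if the formula matches for n = 1..5
```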
Expanding along the last row, after performing that elementary row operation (which doesn't change the determinant), you get $$ D(a_1,\dots,a_n,a_{n+1})=\\ xD(a_1,\dots,a_n)+(-1)^{(n+1)+1}(-x)\det\begin{bmatrix} a_2 & a_3 & \dots & a_n & a_{n+1} \\ a_2+x & a_3 & \dots & a_n & a_{n+1} \\ a_2 & a_3+x & \dots & a_n & a_{n+1} \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & a_{n+1} \end{bmatrix} $$ Doing $n-1$ row swaps, each of which flips the sign, this last determinant equals $(-1)^{n-1}$ times \begin{multline} \det\begin{bmatrix} a_2+x & a_3 & \dots & a_n & a_{n+1} \\ a_2 & a_3+x & \dots & a_n & a_{n+1} \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & a_{n+1} \\ a_2 & a_3 & \dots & a_n & a_{n+1} \end{bmatrix}=\\ \det\begin{bmatrix} a_2+x & a_3 & \dots & a_n & a_{n+1} \\ a_2 & a_3+x & \dots & a_n & a_{n+1} \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & a_{n+1} \\ a_2 & a_3 & \dots & a_n & a_{n+1}+x \end{bmatrix}+\\ \det\begin{bmatrix} a_2+x & a_3 & \dots & a_n & 0 \\ a_2 & a_3+x & \dots & a_n & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots\\ a_2 & a_3 & \dots & a_n+x & 0 \\ a_2 & a_3 & \dots & a_n & -x \end{bmatrix}=\\[6px] D(a_2,\dots,a_{n+1})-xD(a_2,\dots,a_n) \end{multline} Here the first equality splits the last column as $(a_{n+1},\dots,a_{n+1},a_{n+1}+x)^T+(0,\dots,0,-x)^T$ and uses linearity of the determinant in that column. Since $(-1)^{(n+1)+1}(-x)(-1)^{n-1}=x$, this gives $$ D(a_1,\dots,a_n,a_{n+1})= xD(a_1,\dots,a_n)+ x(D(a_2,\dots,a_{n+1})-xD(a_2,\dots,a_n)) $$ By the induction hypothesis, \begin{multline} xD(a_1,\dots,a_n)+ x(D(a_2,\dots,a_{n+1})-xD(a_2,\dots,a_n))=\\ x(x^n+(a_1+\dots+a_n)x^{n-1})+\\ \qquad x(x^n+(a_2+\dots+a_{n+1})x^{n-1}-x^n-(a_2+\dots+a_n)x^{n-1})=\\ x^{n+1}+(a_1+\dots+a_n+a_{n+1})x^n \end{multline}
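The recurrence at the heart of this induction step can also be checked symbolically (again with SymPy, as a sanity check rather than a proof):

```python
# Symbolic check of the recurrence used in the induction step:
#   D(a_1,...,a_{n+1}) = x*D(a_1,...,a_n) + x*(D(a_2,...,a_{n+1}) - x*D(a_2,...,a_n))
import sympy as sp

x = sp.symbols('x')

def D(args):
    """Determinant with a_j down column j and x added on the diagonal."""
    k = len(args)
    return sp.Matrix(k, k, lambda i, j: args[j] + (x if i == j else 0)).det()

for n in range(2, 5):                  # check a few small cases
    a = sp.symbols(f'a1:{n + 2}')      # symbols a1, ..., a_{n+1}
    lhs = D(a)
    rhs = x * D(a[:-1]) + x * (D(a[1:]) - x * D(a[1:-1]))
    assert sp.expand(lhs - rhs) == 0
print("recurrence verified for n = 2..4")
```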
Note that you have to check the induction basis for both $n=1$ and $n=2$, since the step for $n+1$ uses the hypothesis for determinants of size $n$ and size $n-1$; both checks are easy.
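Spelled out, the two base cases are
$$D(a_1)=\left|\,a_1+x\,\right|=x+a_1$$
and
$$D(a_1,a_2)=\left|\begin{array}{cc} a_1+x & a_2 \\ a_1 & a_2+x \end{array}\right|=(a_1+x)(a_2+x)-a_1a_2=x^2+(a_1+a_2)x,$$
matching the claimed formula for $n=1$ and $n=2$.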