Calculate the determinant of the following $n \times n$ matrices:
(a) $M = (a_{ij})$ defined by $a_{ij} = i + j$,
(b) $M = (a_{ij})$ defined by $a_{ij} = x_{j}^{i-1}$.
For (a), I am thinking of induction over $n$, applying the cofactor expansion, since for $n = 3, 4, 5, 6, 7$ (and, I suspect, beyond) $\det M = 0$. For $n = 2$ we have $\det M = -1$, and for $n = 1$, $\det M = 2$. But I don't know if this really works.
For (b), I cannot see what to apply. I tried some calculations with small $n$, but I don't think they help. I would appreciate any hints.
Case a
If $n = 1$, $\det M = 2 \neq 0$. For $n = 2$, $\det M = -1 \neq 0$. So suppose $n \ge 3$ and write $u = (1, 2, \dots, n)^t$ and $v = (1, 1, \dots, 1)^t$. The columns of the matrix $M$ are the vectors $u + v,\ u + 2v,\ \dots,\ u + nv$.
We have a linear dependency among the first three column vectors: $$(u + 3v) - 2(u + 2v) + (u + v) = 0.$$
Hence $\det M = 0$ for $n \ge 3$.
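As a quick sanity check, here is a small Python sketch (the helper names `det` and `M` are introduced here for illustration, not part of the question) that computes the determinant exactly by cofactor expansion and reproduces the values $2, -1, 0, 0, \dots$:

```python
def det(m):
    # Recursive cofactor expansion along the first row;
    # exact for integer entries (no floating-point issues).
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def M(n):
    # The matrix from part (a): a_ij = i + j with 1-based indices.
    return [[i + j for j in range(1, n + 1)] for i in range(1, n + 1)]

print([det(M(n)) for n in range(1, 7)])  # → [2, -1, 0, 0, 0, 0]
```

This matches the argument above: the determinant vanishes for every $n \ge 3$ because the columns are linearly dependent.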
Case b
This is a Vandermonde determinant: $\det M = \prod_{1 \le i < j \le n} (x_j - x_i)$. See the linked computation for a proof.
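For a concrete check of the Vandermonde formula $\det M = \prod_{i < j} (x_j - x_i)$, here is a hypothetical Python sketch (the sample points `xs` and the helper `det` are assumptions for illustration) comparing an exact cofactor-expansion determinant against the product formula:

```python
from itertools import combinations

def det(m):
    # Recursive cofactor expansion along the first row; exact for integers.
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

xs = [2, 5, 7, 11]  # arbitrary sample points x_1, ..., x_4

# The matrix from part (b): a_ij = x_j^(i-1).
V = [[x ** i for x in xs] for i in range(len(xs))]

# Product formula: prod of (x_j - x_i) over all pairs i < j.
prod = 1
for a, b in combinations(xs, 2):
    prod *= b - a

print(det(V), prod)  # → 6480 6480
```

The two values agree, as the Vandermonde formula predicts.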