$$\begin{vmatrix} 0 & 1 & 1 & \cdots & 1\\ 1 & a_1 & & & \\ 1 & & a_2 & & \\ \vdots & & & \ddots & \\ 1 & & & & a_n \end{vmatrix}$$ If I suppose $a_1,\dots,a_n\ne0$, I know how to do it: for each $i$, multiply the column containing $a_i$ by $-1/a_i$ and add it to the first column. Doing that I get the correct answer $-a_1a_2\cdots a_n(1/a_1+\cdots+1/a_n)$.
How can I handle the case where some $a_i$ is zero?
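In the non-zero case, the column operation described above can be checked in exact arithmetic; the names here are mine, chosen for illustration. After adding $-1/a_i$ times each column to the first, the first column becomes $(-\sum_i 1/a_i, 0, \dots, 0)$, and expanding along it leaves a diagonal minor:

```python
# Sketch of the stated column reduction, in exact arithmetic, for one
# arbitrary non-zero choice of a_1, ..., a_n (here 2, 3, 5).
from fractions import Fraction

a = [Fraction(2), Fraction(3), Fraction(5)]
n = len(a)

# Build the (n+1) x (n+1) bordered matrix: 0 in the corner,
# 1s on the first row and column, a_1..a_n on the diagonal.
M = [[Fraction(0)] * (n + 1) for _ in range(n + 1)]
for i in range(1, n + 1):
    M[0][i] = M[i][0] = Fraction(1)
    M[i][i] = a[i - 1]

# Column operation: col_0 += (-1/a_i) * col_i for each i >= 1.
for i in range(1, n + 1):
    c = Fraction(-1) / a[i - 1]
    for r in range(n + 1):
        M[r][0] += c * M[r][i]

# The first column is now (-sum_i 1/a_i, 0, ..., 0).
assert all(M[r][0] == 0 for r in range(1, n + 1))
assert M[0][0] == -sum(Fraction(1, 1) / x for x in a)

# Expanding along the first column: the remaining minor is diagonal,
# so det = M[0][0] * a_1 * ... * a_n.
prod = Fraction(1)
for x in a:
    prod *= x
det = M[0][0] * prod
assert det == Fraction(-31)  # -2*3*5*(1/2 + 1/3 + 1/5) = -31
```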
It suffices to note that your determinant is a polynomial in $a_1,\dots,a_n$. The only polynomial that equals $-a_1a_2\cdots a_n(1/a_1+\cdots+1/a_n)$ whenever $a_1,\dots,a_n$ are all non-zero is $$ p(a_1,\dots,a_n) = -\sum_{i=1}^n \prod_{j \neq i} a_j, $$ which is probably what you suspected: two polynomials that agree on the set where every $a_i \neq 0$ (a dense subset of $\mathbb{R}^n$) must agree everywhere, by continuity. So the determinant of your matrix equals $p(a_1,\dots,a_n)$ for all values of $a_1,\dots,a_n$, zeros included.
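A quick numerical sanity check of this conclusion (not part of the proof): compare the determinant of the bordered matrix against $p(a_1,\dots,a_n) = -\sum_i \prod_{j\neq i} a_j$, including choices where some $a_i$ are zero. The function names are mine.

```python
# Check det(bordered matrix) == -sum_i prod_{j != i} a_j, zeros included.
import numpy as np

def bordered_det(a):
    """Determinant of the (n+1)x(n+1) matrix with 0 in the corner,
    1s on the first row/column, and a_1..a_n on the diagonal."""
    n = len(a)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = np.diag(a)
    return np.linalg.det(M)

def p(a):
    """p(a_1,...,a_n) = -sum_i prod_{j != i} a_j."""
    n = len(a)
    return -sum(np.prod([a[j] for j in range(n) if j != i])
                for i in range(n))

# Non-zero case, one zero, and two zeros all match the polynomial.
for a in ([2.0, 3.0, 5.0], [2.0, 0.0, 5.0], [0.0, 0.0, 7.0]):
    assert abs(bordered_det(a) - p(a)) < 1e-9
```

For `[2, 3, 5]` both sides give $-31$; with two zeros among the $a_i$, two rows of the matrix coincide and both sides vanish.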