Original question Let $P_1,\ldots, P_n$ be distinct polynomials of a complex variable. We suppose that they have no constant term ($P_i(0)=0$). Is it true that the functions $z\mapsto e^{P_i(z)}$ are linearly independent (over $\mathbb{C}$)?
Late edit: The proof given in the selected answer proves the following
Theorem Let $(P_i)_{i\in I}$ be a family of complex univariate polynomials without constant term. If $i\mapsto P_i$ is injective, then $(e^{P_i})_{i\in I}$ is a family of (entire) functions linearly independent over $\mathbb{C}[z]$.
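As a quick sanity check of the theorem (an illustration, not part of the proof): for the distinct polynomials $P_1(z)=z$ and $P_2(z)=2z$, a constant relation $c_1 e^{z} + c_2 e^{2z} = 0$ evaluated at two points $z_1 \ne z_2$ gives a $2\times 2$ linear system whose determinant $e^{z_1+2z_2}-e^{2z_1+z_2}$ is nonzero, forcing $c_1 = c_2 = 0$. A minimal sketch in Python (the evaluation points $0$ and $1$ are arbitrary choices):

```python
import math

# Distinct polynomials without constant term: P1(z) = z, P2(z) = 2z.
# If c1*e^z + c2*e^(2z) = 0 identically, evaluating at z = 0 and z = 1
# gives the 2x2 homogeneous system M @ (c1, c2) = 0 with matrix
z1, z2 = 0.0, 1.0
M = [[math.exp(z1), math.exp(2 * z1)],
     [math.exp(z2), math.exp(2 * z2)]]

# A nonzero determinant forces c1 = c2 = 0.
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
print(det != 0)  # the determinant e^2 - e is nonzero
```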
One then has the
Corollary Under the same conditions (no constant term), if the family $(P_i)_{i\in I}$ is $\mathbb{Z}$-linearly independent, then $(e^{P_i})_{i\in I}$ is algebraically independent over $\mathbb{C}[z]$.
Proof Call $G$ the family of exponentials ($G=(e^{P_i})_{i\in I}$). For every multiindex $\alpha \in \mathbb{N}^{(I)}$, one has $$ G^\alpha=\prod_{i\in I} (e^{P_i})^{\alpha(i)}= \prod_{i\in I} (e^{\alpha(i)\,P_i})= e^{\sum_{i\in I}\alpha(i)\,P_i} $$ but the fact that $(P_i)_{i\in I}$ is $\mathbb{Z}$-linearly independent implies (and is in fact equivalent to) the map $\alpha \mapsto \sum_{i\in I}\alpha(i)\,P_i$ being injective. From the theorem, $(G^\alpha)_{\alpha \in \mathbb{N}^{(I)}}$ is $\mathbb{C}[z]$-linearly independent, which amounts to saying that $G$ is algebraically independent over $\mathbb{C}[z]$.
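The key computation in this proof, $\prod_i (e^{P_i})^{\alpha(i)} = e^{\sum_i \alpha(i)\,P_i}$, can be checked symbolically for a concrete case, say $P_1 = z$, $P_2 = z^2$ and $\alpha = (2, 3)$ (a sketch using sympy; the particular polynomials and multiindex are arbitrary illustrative choices):

```python
import sympy as sp

z = sp.symbols('z')
P1, P2 = z, z**2          # Z-independent polynomials without constant term
a1, a2 = 2, 3             # the multiindex alpha

# G^alpha = (e^{P1})^{a1} * (e^{P2})^{a2} ...
G_alpha = sp.exp(P1)**a1 * sp.exp(P2)**a2

# ... which should equal e^{a1*P1 + a2*P2}
combined = sp.exp(a1 * P1 + a2 * P2)

print(sp.simplify(G_alpha - combined) == 0)  # True
```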
As mentioned in the comments, this has been answered affirmatively in linear independence of exponentials on MathOverflow, in the more general setting of exponentials of entire functions of several complex variables.
In the case of exponentials of polynomials of a single complex variable it can be proven with more elementary methods. Here is something I came up with. The idea is to consider slightly more general linear combinations with rational coefficients, in order to make a proof by induction work.
Claim: If $P_1, \ldots, P_n$ are distinct polynomials without constant term and $R_1, \ldots, R_n$ are rational functions with $R_1 e^{P_1} + \ldots + R_n e^{P_n} = 0$, then $R_1 = \ldots = R_n = 0$; that is, $e^{P_1}, \ldots, e^{P_n}$ are linearly independent over the field $\Bbb C(z)$ of rational functions. In particular, $e^{P_1}, \ldots, e^{P_n}$ are linearly independent over $\Bbb C$.
Proof by induction:
The case $n=1$ is trivial: $R_1 e^{P_1} = 0$ clearly implies $R_1 = 0$, since $e^{P_1}$ has no zeros.
Now let $n \ge 2$ be arbitrary and assume that the claim is true for all smaller values of $n$. Let $P_1, \ldots, P_n$ be distinct polynomials without constant term, and $R_1, \ldots, R_n$ be rational functions, with $$ R_1 e^{P_1} + \ldots + R_n e^{P_n} = 0 \, . $$ If all $R_k$ are zero then we are done. Otherwise (without loss of generality) $R_n \ne 0$, and it follows that $$ \frac{R_1}{R_n}e^{P_1 - P_n} + \ldots + \frac{R_{n-1}}{R_n}e^{P_{n-1} - P_n} + 1 = 0 \, . $$ Differentiating this identity gives $$ \left( \left(\frac{R_1}{R_n}\right)' + \frac{R_1}{R_n}(P_1'-P_n')\right)e^{P_1 - P_n} + \ldots + \left( \left(\frac{R_{n-1}}{R_n}\right)' + \frac{R_{n-1}}{R_n}(P_{n-1}'-P_n')\right)e^{P_{n-1} - P_n} = 0\, . $$ Now we can apply the induction hypothesis, since the $P_k - P_n$ ($k=1,\ldots, n-1$) are distinct polynomials without constant term. It follows that $$ \left(\frac{R_k}{R_n}\right)' + \frac{R_k}{R_n}(P_k'-P_n') = 0 $$ for $k=1,\ldots, n-1$, and consequently that the functions $\frac{R_k}{R_n}e^{P_k - P_n}$ are constant (by the product rule, their derivatives are exactly the vanishing expressions above multiplied by $e^{P_k - P_n}$): $$ \frac{R_k}{R_n}e^{P_k - P_n} = C_k \in \Bbb C \quad (k = 1, \ldots, n-1) \, . $$
If $C_k \ne 0$ for some $k$ then $\frac{R_k}{R_n} = C_k e^{P_n - P_k}$ is a rational function without zeros or poles, and therefore constant. It follows that $e^{P_k - P_n}$, and consequently $P_k - P_n$, is constant; since both polynomials have no constant term this forces $P_k = P_n$, contradicting the assumption that the polynomials are distinct.
Therefore $C_k=0$ for $k=1,\ldots, n-1$, so that $R_k = 0$ for $k=1,\ldots, n-1$. The original relation then reduces to $R_n e^{P_n} = 0$, so $R_n = 0$ as well, contradicting $R_n \ne 0$. Hence all $R_k$ are zero.
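The differentiation step in the induction can itself be verified symbolically: writing $Q = R_k/R_n$ and $S = P_k - P_n$ as generic functions, the product rule gives $(Q e^{S})' = (Q' + Q S') e^{S}$, which is exactly the coefficient appearing after differentiating the identity. A sketch with sympy, using undefined symbolic functions:

```python
import sympy as sp

z = sp.symbols('z')
Q = sp.Function('Q')(z)   # stands for R_k / R_n
S = sp.Function('S')(z)   # stands for P_k - P_n

# Left side: derivative of Q * e^S
lhs = sp.diff(Q * sp.exp(S), z)

# Right side: the coefficient (Q' + Q*S') times e^S, as in the proof
rhs = (sp.diff(Q, z) + Q * sp.diff(S, z)) * sp.exp(S)

print(sp.simplify(lhs - rhs) == 0)  # True
```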
Remark: Instead of $P_k(0)=0$ for all $k$ it suffices to assume that no difference $P_k - P_j$ for $k \ne j$ is constant. The proof remains the same.
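To see why some hypothesis on the differences is needed: if $P_2 - P_1$ is a nonzero constant, say $P_1 = z$ and $P_2 = z + 1$, then $e^{P_2} = e \cdot e^{P_1}$, a nontrivial linear relation. A quick symbolic check (the choice of polynomials is just an illustration):

```python
import sympy as sp

z = sp.symbols('z')
P1, P2 = z, z + 1         # difference P2 - P1 = 1 is constant

# e^{z+1} - e * e^{z} vanishes identically: the exponentials are dependent
relation = sp.exp(P2) - sp.E * sp.exp(P1)
print(sp.expand(relation) == 0)  # True
```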