Let $I \subset \mathbb R$ be open and let $u \in \mathcal D'(I)$ be a distribution whose distributional derivative vanishes (i.e. $u'(\phi) = 0$ for all test functions $\phi$, which we may assume to be complex-valued).
We show $\forall c \in \mathbb C \ \forall \phi \in \mathcal D(I) : u(\phi) = \int c\cdot\phi \, dx$. (EDIT: Correctly, $c$ should be quantified with $\exists$. My question is why the following proof doesn't allow for arbitrary complex $c$, which explains the preceding statement.)
Proof:
Let $\Psi \in \mathcal D(I)$. Then $\Psi$ is the derivative of a test function iff $\int \Psi \, dx = 0$. In that case, writing $\Psi = \Phi'$ with $\Phi \in \mathcal D(I)$, we get $u(\Psi) = u(\Phi') = -u'(\Phi) = 0$.
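(For completeness, here is the standard argument behind this lemma, in the case that $I = (a,b)$ is an interval. Set

$$\Phi(x) := \int_a^x \Psi(t)\,dt.$$

Then $\Phi$ is smooth with $\Phi' = \Psi$, vanishes near $a$, and equals the constant $\int_I \Psi\,dx$ near $b$. Hence $\Phi$ has compact support in $I$, i.e. $\Phi \in \mathcal D(I)$, exactly when $\int_I \Psi\,dx = 0$; in that case $u(\Psi) = u(\Phi') = -u'(\Phi) = 0$ by the definition of the distributional derivative.)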
Let $h \in \mathcal D(I)$ be arbitrary with $\int h \, dx = 1$. Now for every test function $\phi \in \mathcal D(I)$ we see:
$\phi - \left(\int \phi \, dx\right) h \in \mathcal D(I)$ and $\int \left( \phi - \left(\int \phi \, dx\right) h \right) dx = 0$,
therefore
$u\left( \phi - \left(\int \phi \, dx\right) h \right) = 0$, i.e. $u(\phi) = u(h) \int \phi \, dx$.
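Spelled out, the last step uses only the linearity of $u$ together with the lemma above:

$$u(\phi) - \left(\int \phi\,dx\right) u(h) \;=\; u\!\left(\phi - \left(\int \phi\,dx\right) h\right) \;=\; 0,$$

so $u(\phi) = u(h)\int \phi\,dx = \int c\,\phi\,dx$ with $c := u(h)$.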
$\square$
If this is not wrong, how can I interpret the fact that $h$ has been arbitrary?
Note: This is part of a larger proof, which shows the same for non-one-dimensional domains.
The function $h$ is not arbitrary; it is arbitrary subject to the condition that $\int h \, dx = 1$. (And this latter condition is certainly necessary for the proof to go through.)
The proof given (which seems correct to me) shows that $u(\phi)$ only depends on the value of $\int \phi \, dx$. In particular, one sees (after the proof is done) that $u(h)$ only depends on the value of $\int h \, dx$, which was fixed to be $1$. Thus $u(h)$ is independent of the choice of $h$ (as long as $\int h \, dx = 1$), and is equal to the constant $c$ in the statement of the theorem.
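To spell out this independence: if $h_1, h_2 \in \mathcal D(I)$ both satisfy $\int h_i \, dx = 1$, then $\int (h_1 - h_2)\,dx = 0$, so by the lemma at the start of the proof,

$$u(h_1) - u(h_2) = u(h_1 - h_2) = 0,$$

i.e. $u(h_1) = u(h_2)$, and this common value is the constant $c$.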
[Added: As Theo points out in his answer, you have an incorrect universal quantifier on the constant $c$ in the statement of the theorem; it should read for some $c$. I didn't notice this when I read the question!]