I am looking for some kind of "general solution" for the following definite integral:
$$\int^1_0 f'(x) \frac{a-(1-a)\cdot \exp(c\cdot f(x))}{a+(1-a) \cdot \exp(c\cdot f(x))} dx$$
Furthermore, we know that $f(0)=0$, $f(1)=\frac{\log(\frac{a}{1-a})}{c} > 0$, and $a>0.5$.
Question 1: Is it possible to prove or disprove that this integral depends only on $a$ and $c$? Does this impose any restrictions on the function $f(x)$?
Question 2: Is there a connection to conservative vector fields?
I think you're looking for the Fundamental Theorem of Calculus and the Fundamental Theorem of Line Integrals, also known as the Gradient Theorem.
The first tells you under what conditions the definite integral has a well-defined value, and which features of the problem can affect that value.
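Concretely, your integrand has the form $f'(x)\,g(f(x))$, so (assuming $f$ is, say, continuously differentiable) the substitution $u = f(x)$ collapses the integral onto the boundary values $f(0)$ and $f(1)$. Unless I've slipped on the algebra:
$$\int^1_0 f'(x)\,\frac{a-(1-a)e^{c f(x)}}{a+(1-a)e^{c f(x)}}\, dx = \int^{f(1)}_{f(0)} \frac{a-(1-a)e^{cu}}{a+(1-a)e^{cu}}\, du = \left[\,u - \frac{2}{c}\log\left(a+(1-a)e^{cu}\right)\right]^{\frac{1}{c}\log\frac{a}{1-a}}_{0} = -\frac{\log\big(4a(1-a)\big)}{c}.$$
At the upper limit $e^{cu}=\frac{a}{1-a}$, so the argument of the logarithm is $2a$; at the lower limit it is $1$. The result depends only on $a$ and $c$ (and is positive, since $a>0.5$ gives $4a(1-a)<1$). The only real restrictions on $f$ are the stated boundary values and enough smoothness for the substitution to be valid.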
The second answers Question 2 in the affirmative.
These are instances of a large family of theorems in which the value of an integral is shown to depend only on the values of a related function on the boundary of the region of integration: Stokes' theorem, the divergence theorem, and so on.
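If you want a quick numerical sanity check, here is a minimal sketch (the parameter values and the three choices of $f$ are arbitrary assumptions; any $f$ with $f(0)=0$ and $f(1)=\frac{\log(\frac{a}{1-a})}{c}$ should do):

```python
import numpy as np
from scipy.integrate import quad

a, c = 0.8, 2.0              # arbitrary choices with a > 0.5
L = np.log(a / (1 - a)) / c  # required boundary value f(1)

def integrand(x, f, fprime):
    e = np.exp(c * f(x))
    return fprime(x) * (a - (1 - a) * e) / (a + (1 - a) * e)

# Three different f's, all satisfying f(0) = 0 and f(1) = L.
candidates = [
    (lambda x: L * x,                          lambda x: L),
    (lambda x: L * x**3,                       lambda x: 3 * L * x**2),
    (lambda x: L * x + np.sin(2 * np.pi * x),  lambda x: L + 2 * np.pi * np.cos(2 * np.pi * x)),
]

for f, fprime in candidates:
    value, _ = quad(integrand, 0, 1, args=(f, fprime))
    print(value)

# Closed form from the substitution u = f(x):
print(-np.log(4 * a * (1 - a)) / c)
```

All three quadratures should agree with the closed form (about $0.2231$ for these parameters), regardless of the path $f$ takes between the two endpoints.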