I think this question has probably been asked many times, but I'm still trying to understand one point.
We have a function $f$, and to find its derivative we have many rules, but we also have a strict definition: $f'(x)=\lim_{\Delta x\to 0} \frac{f(x+\Delta x)-f(x)}{\Delta x}$. It's simple and clear: it shows that $f'(x)$ just equals the expression on the right-hand side, and you can choose whichever way is better for you: solve by the rules and schemes, or solve directly from the definition.
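As a concrete illustration (not part of the original question), the definition alone is enough to differentiate $f(x)=x^2$:
$$f'(x)=\lim_{\Delta x\to 0}\frac{(x+\Delta x)^2-x^2}{\Delta x}=\lim_{\Delta x\to 0}\frac{2x\,\Delta x+(\Delta x)^2}{\Delta x}=\lim_{\Delta x\to 0}\left(2x+\Delta x\right)=2x,$$
which agrees with the power rule.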
When we deal with integrals, there are different types and kinds: indefinite, definite, surface, etc. For each of them there are rules and algorithms for solving them. But is there a way to solve an integral directly from a definition, as in the derivative case?
Let's talk about the definite and the indefinite integral.
Another name for the indefinite integral is the "anti-derivative". This name makes much more sense to me; it's clear what we're looking for. If we want the anti-derivative of $f$, then we're looking for a function $F$ such that $F'=f$. We have a special symbol for $F$. Here it is: $$F(x) = \int f(x) \, dx.$$ This squiggly line (which we call the integral sign) is just a special symbol that helps us remember which function we're finding the anti-derivative of.
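To make this concrete (my example, not the original answer's): for $f(x)=x^2$, any function of the form $F(x)=\tfrac{x^3}{3}+C$ satisfies $F'(x)=x^2$, so we write
$$\int x^2 \, dx = \frac{x^3}{3} + C,$$
where the constant $C$ reminds us that the anti-derivative is only determined up to an additive constant.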
Now, how about the definite integral? This is a different beast entirely. The definition of the definite integral (for functions that map the real numbers to the real numbers) is as follows: $$\int_a^b f(x) \, dx = \lim_{N\rightarrow\infty} \sum_{i=1}^N f(x_i) \Delta x_i.$$
That is, we're taking the interval $[a,b]$ and dividing it into $N$ pieces, and then summing $N$ values of the function, each evaluated somewhere in its piece, times the width of that piece. And then we take the limit of those sums as the number of pieces goes to infinity.
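Here is a minimal numerical sketch of that definition (my own illustration, assuming equal-width pieces and left endpoints for the sample points $x_i$): a left Riemann sum for $\int_0^1 x^2 \, dx$, whose exact value is $1/3$, approaching that value as $N$ grows.

```python
def riemann_sum(f, a, b, N):
    """Left Riemann sum: divide [a, b] into N equal pieces of width dx
    and add up f(x_i) * dx with x_i the left endpoint of each piece."""
    dx = (b - a) / N
    return sum(f(a + i * dx) * dx for i in range(N))

f = lambda x: x * x
for N in (10, 100, 10000):
    # As N grows, the sum should approach the exact value 1/3.
    print(N, riemann_sum(f, 0.0, 1.0, N))
```

With $N=10{,}000$ the sum is already within about $5\times10^{-5}$ of $1/3$, matching the limit in the definition above.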