In every analysis course, one learns the integrals of some heavily used functions -- e.g. polynomials, exp, sin, etc. -- with respect to (w.r.t.) the Lebesgue measure.
Is there anything similar for the Lebesgue integral w.r.t. an arbitrary (finite) measure? I do know the linearity of the integral and the dominated convergence theorem (as well as the convolution of two probability distributions and other results from a measure theory lecture).
In other words, is there something like this, perhaps in the form of a table, for often-used functions like polynomials, exp, sin, etc. (restricted to an interval $[a, b]$ with an appropriate scaling factor) and often-used measures like the angle measure or the Gaussian measure?
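To make concrete what I have in mind (assuming I have set this up correctly): for the standard Gaussian measure $\gamma$ on $\mathbb{R}$, I would expect such a table to contain entries like
$$\int_{\mathbb{R}} x^2 \, d\gamma(x) = \int_{\mathbb{R}} x^2 \, \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \, dx = 1,$$
i.e. the integral of a simple function paired with a standard measure, evaluated in closed form.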
This may be a broad question, but it is interesting nevertheless. I'm sure we learnt a theorem for calculating integrals w.r.t. probability measures by transforming them into integrals w.r.t. the Lebesgue measure (in a lecture preceding measure theory), but I do not know the concise English term for this operation. It may be that computing such integrals is only a matter of scaling, the operation described above, and perhaps a generalization to $\sigma$-finite measures by decomposing the space into countably many subsets on which the measure is finite (is there a technical term for this procedure?).
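If I recall correctly, the operation I mean worked roughly like this (I may be misremembering the exact hypotheses): if a measure $\mu$ has a density $f$ w.r.t. the Lebesgue measure $\lambda$, then for suitable $g$
$$\int g \, d\mu = \int g(x) \, f(x) \, dx,$$
so the integral w.r.t. $\mu$ reduces to an ordinary Lebesgue integral of the product $g \cdot f$.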
Many thanks in advance.