Using a definition similar to the one on Wikipedia:
Suppose that $(x_1, \ldots, x_n)$ are i.i.d. samples from some univariate distribution with unknown density $f$. Then define $g(x) = \frac{1}{n} \sum_{i=1}^{n} K_h(x-x_i) = \frac{1}{nh} \sum_{i=1}^{n} K\left(\frac{x-x_i}{h}\right)$. If $K$ is the standard normal density, I'm trying to see why $\int_{-\infty}^{\infty} g(x)\,dx = 1$.
Expanding out (a bit messy): $$\frac{1}{nh} \int_{-\infty}^{\infty} \sum_{i=1}^{n} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-x_i}{h}\right)^2}\,dx = 1$$
I'm not seeing why this is true; any insights would be appreciated.
For each fixed $x_i$, substitute $t = \frac{x - x_i}{h}$, so that $dt = \frac{dx}{h}$. Since $K$ is a probability density, $$\int_{-\infty}^\infty \frac{1}{h} K\left(\frac{x-x_i}{h}\right)\,dx = \int_{-\infty}^\infty K(t) \, dt = 1.$$
So by exchanging the integral and the (finite) sum, $$\int_{-\infty}^\infty \frac{1}{nh} \sum_{i=1}^n K\left(\frac{x-x_i}{h}\right)\,dx = \frac{1}{n} \sum_{i=1}^n\int_{-\infty}^\infty \frac{1}{h} K\left(\frac{x-x_i}{h}\right)\,dx = \frac{1}{n} \sum_{i=1}^n 1 = 1.$$
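As a quick sanity check, you can also verify this numerically. Here is a minimal sketch (assuming NumPy and SciPy; the sample values and bandwidth are arbitrary) that builds the Gaussian kernel density estimate by hand and integrates it over the real line:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

rng = np.random.default_rng(0)
x_samples = rng.normal(size=5)  # hypothetical i.i.d. samples x_1, ..., x_n
h = 0.7                         # an arbitrary bandwidth

def g(x):
    # g(x) = (1/(n h)) * sum_i K((x - x_i)/h), with K the standard normal pdf
    return norm.pdf((x - x_samples) / h).sum() / (len(x_samples) * h)

total, _ = quad(g, -np.inf, np.inf)
print(total)  # ≈ 1.0, regardless of the samples or the bandwidth
```

The result stays at 1 for any choice of samples and any $h > 0$, which is exactly what the substitution argument above shows.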