Baby Rudin Theorem 11.35


In Principles of Mathematical Analysis, Rudin states:

11.35 $\ \ $ Theorem $\ \ $ Suppose $f\in{L}^2(\mu)$ and $g\in{L}^2(\mu)$. Then $fg \in{L}^1(\mu)$, and

$$\int_{X} \left|fg\right| d\mu \leq \|f\| \|g\|.$$

Proof: This is the Schwarz inequality, which we have already encountered for series and for Riemann integrals. It follows from the inequality

$$0\leq \int_{X}(\left|f\right|+\lambda\left|g\right|)^2 d\mu=\|f\|^2+2\lambda\int_{X} \left|fg\right| d\mu+\lambda^2\|g\|^2$$

which holds for every real $\lambda$.
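The expansion above can be sanity-checked numerically. Below is a quick sketch (not Rudin's proof) using counting measure on a finite set, so integrals become finite sums; the names `f`, `g`, `norm_f`, etc. are my own, chosen for illustration.

```python
import math
import random

# Toy model of L^2(mu): counting measure on {0, ..., n-1}, so every
# integral over X is a finite sum. f and g are arbitrary real-valued
# "functions" represented as lists of values.
random.seed(0)
n = 8
f = [random.uniform(-2, 2) for _ in range(n)]
g = [random.uniform(-2, 2) for _ in range(n)]

norm_f = math.sqrt(sum(x * x for x in f))       # ||f||
norm_g = math.sqrt(sum(y * y for y in g))       # ||g||
int_fg = sum(abs(x * y) for x, y in zip(f, g))  # integral of |fg|

for lam in (-3.0, -1.0, 0.0, 0.5, 2.0):
    # Left side: integral of (|f| + lambda*|g|)^2
    lhs = sum((abs(x) + lam * abs(y)) ** 2 for x, y in zip(f, g))
    # Right side: the expanded quadratic in lambda
    rhs = norm_f ** 2 + 2 * lam * int_fg + lam ** 2 * norm_g ** 2
    assert abs(lhs - rhs) < 1e-9  # the expansion is an exact identity
    assert lhs >= 0               # the integrand is a square, hence >= 0
```

Running this confirms both facts Rudin uses: the two sides agree exactly, and the quadratic is nonnegative for every real $\lambda$ tried.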

One way to interpret the expression on the right-hand side is as a quadratic in $\lambda$, whose roots are given by the quadratic formula

$$\lambda_{\pm}=\frac{-b\pm\sqrt{b^2-4ac\ }}{2a},$$

where $a,b,c$ are the coefficients of the quadratic in $\lambda$. If the roots $\lambda_{\pm}$ are real, then the discriminant $b^2-4ac$ should be nonnegative. This means that

$$b^2-4ac \geq 0.$$

When I work this out on pencil and paper, with $a=\|g\|^2$, $b=2\int_{X}\left|fg\right| d\mu$, and $c=\|f\|^2$, I find that

$$b^2 \geq 4ac,$$

and that

$$\int_{X} \left|fg\right| d\mu \geq \|f\| \|g\|.$$

This is clearly wrong. Why am I getting the opposite of the Schwarz inequality?

Best answer:

The quadratic $P(\lambda)=\|f\|^2+2\lambda\int_{X} \left|fg\right| d\mu+\lambda^2\|g\|^2$ is nonnegative for every real $\lambda$, so it cannot have two distinct real roots (otherwise $P(\lambda)$ would be negative for the values of $\lambda$ between them). Thus the discriminant must satisfy $b^2-4ac\leq 0$, which with $a=\|g\|^2$, $b=2\int_{X}\left|fg\right| d\mu$, and $c=\|f\|^2$ gives exactly $\int_{X}\left|fg\right| d\mu\leq\|f\|\|g\|$. The error in the question is the assumption that the roots $\lambda_{\pm}$ must be real: $\lambda$ ranges over the reals, but the roots of $P$ need not be real.
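The sign of the discriminant can also be checked numerically. The sketch below (my own illustration, again using counting measure on a finite set so that integrals are sums) computes $a=\|g\|^2$, $b=2\int_X|fg|\,d\mu$, $c=\|f\|^2$ for many random pairs $f,g$ and verifies that $b^2-4ac$ is never positive.

```python
import random

# For P(lambda) = ||f||^2 + 2*lambda*Int|fg| + lambda^2*||g||^2, the
# discriminant is 4*(Int|fg|)^2 - 4*||f||^2*||g||^2. It being <= 0 is
# exactly the Schwarz inequality. Counting-measure toy model, as above.
random.seed(1)
for _ in range(100):
    n = random.randint(1, 10)
    f = [random.uniform(-5, 5) for _ in range(n)]
    g = [random.uniform(-5, 5) for _ in range(n)]
    a = sum(y * y for y in g)                      # ||g||^2
    b = 2 * sum(abs(x * y) for x, y in zip(f, g))  # 2 * Int|fg|
    c = sum(x * x for x in f)                      # ||f||^2
    # Discriminant never positive (small tolerance for float rounding)
    assert b * b - 4 * a * c <= 1e-6
```

No random pair ever produces a positive discriminant, consistent with the argument above; equality $b^2=4ac$ occurs only when $|f|$ and $|g|$ are proportional.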