Let $p$ and $q$ be positive reals such that $\frac{1}{p}+\frac{1}{q} = 1$, so that $p, q \in (1,\infty)$.
For $\vec a, \vec b \in \mathbb{R}^2$, prove that $|\vec a \cdot \vec b| \leq \|\vec a\|_p \|\vec b\|_q$.
A hint was posted suggesting Jensen's inequality with $\phi(x) = \ln(1 + e^x)$, but I don't know how I'd work that in.
Here is a proof of an easier version first (Young's inequality). Note that $\phi(x) = -\log x$ is convex on $x > 0$, so convexity (i.e., Jensen's inequality with two points) yields $$ -\log(tx + (1-t)y) \leq -t\log x - (1-t)\log y. $$ Set $x = u^p$, $y = v^q$, and $t = 1/p$ (so $1 - t = 1/q$), where $u, v > 0$; exponentiating then gives $$ uv \leq \frac{u^p}{p} + \frac{v^q}{q}. $$
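As a quick sanity check (not part of the proof), Young's inequality $uv \leq u^p/p + v^q/q$ can be verified numerically on random samples; the helper `young_gap` below is a hypothetical name, not from the original post:

```python
# Numerical spot-check of Young's inequality: uv <= u^p/p + v^q/q
# for u, v > 0 and conjugate exponents 1/p + 1/q = 1.
import random

def young_gap(u, v, p):
    """Return (u^p/p + v^q/q) - uv, which should be nonnegative."""
    q = p / (p - 1)  # conjugate exponent, solved from 1/p + 1/q = 1
    return u**p / p + v**q / q - u * v

random.seed(0)
for _ in range(1000):
    u = random.uniform(0.01, 10.0)
    v = random.uniform(0.01, 10.0)
    p = random.uniform(1.01, 10.0)
    assert young_gap(u, v, p) >= -1e-12, (u, v, p)
```

Equality holds exactly when $u^p = v^q$, e.g. $u = v = 2$ with $p = q = 2$ gives gap $0$.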
Now, if $\|a\|_p = \|b\|_q = 1$, then applying Young's inequality termwise gives $$ \Big|\sum_{i=1}^n a_i b_i\Big| \leq \sum_{i=1}^n |a_i||b_i| \leq \sum_{i=1}^n \frac{|a_i|^p}{p} + \sum_{i=1}^n \frac{|b_i|^q}{q} = \frac{1}{p} + \frac{1}{q} = 1. $$ For general nonzero vectors, apply this to $a/\|a\|_p$ and $b/\|b\|_q$ and multiply through by $\|a\|_p \|b\|_q$; if either vector is zero, the inequality is trivial.
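The full inequality $|\vec a \cdot \vec b| \leq \|\vec a\|_p \|\vec b\|_q$ can likewise be spot-checked numerically; this is a sketch with a hypothetical helper `pnorm`, using random vectors and random conjugate exponents:

```python
# Numerical spot-check of Hölder's inequality: |a . b| <= ||a||_p ||b||_q.
import random

def pnorm(v, p):
    """p-norm of a vector given as a list of floats."""
    return sum(abs(x)**p for x in v) ** (1.0 / p)

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    a = [random.uniform(-5.0, 5.0) for _ in range(n)]
    b = [random.uniform(-5.0, 5.0) for _ in range(n)]
    p = random.uniform(1.01, 10.0)
    q = p / (p - 1)  # conjugate exponent
    dot = abs(sum(x * y for x, y in zip(a, b)))
    # small tolerance for floating-point roundoff
    assert dot <= pnorm(a, p) * pnorm(b, q) + 1e-9, (a, b, p)
```

With $p = q = 2$ this reduces to the Cauchy–Schwarz inequality.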