I have the following question:
Is it true that $$ \frac{\alpha}{\alpha + \beta}\|u\|^2 + \frac{\beta}{\alpha + \beta} \|v\|^2 > \langle u, v\rangle $$
for all $u, v \in \mathbb R^N$ with $\{u, v\}$ linearly independent and all $\alpha, \beta \in (0, \infty)$? Here $\langle \cdot, \cdot \rangle$ is the usual inner product on $\mathbb R^N$ and $\|\cdot\|$ is the Euclidean norm.
The particular case $\alpha = \beta = \frac{1}{2}$ is easy: if $\{u, v\}$ is linearly independent, then by the AM–GM and Cauchy–Schwarz inequalities, $\frac{1}{2}\|u\|^2 + \frac{1}{2}\|v\|^2 \geq \|u\|\cdot\|v\| > |\langle u, v\rangle| \geq \langle u, v\rangle$, where the middle inequality is strict precisely because $u$ and $v$ are linearly independent. But I do not know how to handle the general case.
Unfortunately that is not true. What you are asking is equivalent to $$\begin{align} \frac{\alpha}{\alpha + \beta}\|u\|^2 + \frac{\beta}{\alpha + \beta} \|v\|^2 &> \langle u, v\rangle \\ \alpha\|u\|^2 + \beta \|v\|^2 &> (\alpha + \beta)\langle u, v\rangle \\ \alpha\langle u, u\rangle - (\alpha + \beta)\langle u, v\rangle + \beta \langle v, v\rangle &> 0 \\ \langle \alpha u - \beta v, u - v\rangle &> 0 \end{align}$$
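As a quick numerical sanity check of the algebraic identity above (this is not part of the mathematical argument, just a verification sketch), one can confirm on random inputs that $\alpha\|u\|^2 + \beta\|v\|^2 - (\alpha+\beta)\langle u, v\rangle$ coincides with $\langle \alpha u - \beta v, u - v\rangle$:

```python
import random

def dot(x, y):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(x, y))

random.seed(0)
for _ in range(1000):
    u = [random.uniform(-5, 5) for _ in range(3)]
    v = [random.uniform(-5, 5) for _ in range(3)]
    alpha = random.uniform(0.1, 5)
    beta = random.uniform(0.1, 5)
    # Left side of the third line of the derivation:
    lhs = alpha * dot(u, u) + beta * dot(v, v) - (alpha + beta) * dot(u, v)
    # Fourth line: <alpha*u - beta*v, u - v>
    rhs = dot([alpha * a - beta * b for a, b in zip(u, v)],
              [a - b for a, b in zip(u, v)])
    assert abs(lhs - rhs) < 1e-9
print("identity verified on 1000 random samples")
```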
Now, since $u$ and $v$ are arbitrary (replacing $v$ by $-v$ preserves linear independence), we can drop the minus sign, so it suffices to decide whether $$ \langle \alpha u + \beta v, u + v\rangle > 0 $$ holds for all such pairs.
To see that the above inequality can fail, in $\mathbb{R}^2$ take $u = (2, -\frac{1}{2})$, $v = (-1, \frac{1}{2})$, $\alpha = 1$ and $\beta = 3$. Then $\alpha u + \beta v = (-1, 1)$ and $u + v = (1, 0)$, so $\langle \alpha u + \beta v, u + v\rangle = -1 < 0$. Translating back via $v \mapsto -v$, the pair $u = (2, -\frac{1}{2})$, $v = (1, -\frac{1}{2})$ (still linearly independent) violates the original inequality.
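The counterexample can also be checked numerically; the following sketch (again just a verification aid, not new mathematics) evaluates both the transformed inequality and, after replacing $v$ by $-v$, the original one:

```python
def dot(x, y):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(x, y))

# Counterexample for the transformed inequality <alpha*u + beta*v, u + v> > 0
u, v = (2.0, -0.5), (-1.0, 0.5)
alpha, beta = 1.0, 3.0

au_bv = tuple(alpha * a + beta * b for a, b in zip(u, v))  # = (-1, 1)
upv = tuple(a + b for a, b in zip(u, v))                   # = (1, 0)
print(dot(au_bv, upv))  # -1.0, so the inequality fails

# Translating back (v -> -v) gives a counterexample to the original claim:
u0, v0 = u, tuple(-b for b in v)  # u0 = (2, -1/2), v0 = (1, -1/2)
lhs = (alpha / (alpha + beta)) * dot(u0, u0) + (beta / (alpha + beta)) * dot(v0, v0)
print(lhs, dot(u0, v0))  # 2.0 < 2.25, so the original inequality fails too
```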