Can someone please help me complete my proof for the following? I am a bit confused. Thank you so much.
$\def\R{{\mathbb R}}$
A real-valued function $f\colon A \to \R$, where $A\subseteq \R^n$ is a convex set, is concave if, for any $x,x'\in A$ and $\alpha \in [0,1]$, $$f((1-\alpha)x + \alpha x') \ge (1-\alpha)f(x) + \alpha f(x').$$ Moreover, $f$ is quasi-concave if, for any $x, x' \in A$ and $\alpha \in [0,1]$, $$f((1-\alpha)x + \alpha x')\ge \min\{f(x),f(x')\}.$$
i. Prove directly that $f(x) = x^2$, where $x\in \R_+$, is quasi-concave but not concave.
If $0 \leq x \leq x'$, then for all $\alpha \in [0,1]$ we have $(1-\alpha)x + \alpha x' \geq x$. Because $f : x \mapsto x^2$ is increasing on $\mathbb{R}_+$, $$f((1-\alpha)x + \alpha x') \geq f(x) = \min \lbrace f(x), f(x') \rbrace,$$ so $f$ is quasi-concave.
The function, however, is not concave. To see this directly, take $x = 0$, $x' = 1$, and $\alpha = \tfrac12$: then $$f\Big(\tfrac12\Big) = \tfrac14 < \tfrac12 = (1-\alpha)f(0) + \alpha f(1),$$ which violates the concavity inequality. (In fact $f$ is strictly convex, since $f''(x) = 2 > 0$ for all $x$.)
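If it helps to see the inequalities in action, here is a quick numerical sanity check (an illustration, not a proof): it tests the quasi-concavity inequality for $f(x)=x^2$ on a grid of points in $\mathbb{R}_+$, and evaluates the concavity counterexample $x=0$, $x'=1$, $\alpha=\tfrac12$. The grid bounds and step sizes are arbitrary choices.

```python
# Numerical sanity check (not a proof) for f(x) = x^2 on R_+.

def f(x):
    return x * x

# Quasi-concavity: f((1-a)x + a x') >= min(f(x), f(x')) on a grid.
pts = [i / 10 for i in range(0, 31)]     # sample points in [0, 3]
alphas = [j / 10 for j in range(0, 11)]  # alpha in [0, 1]
ok = all(
    f((1 - a) * x + a * xp) >= min(f(x), f(xp)) - 1e-12  # small tolerance
    for x in pts for xp in pts for a in alphas
)
print("quasi-concavity holds on grid:", ok)

# Concavity fails: at x = 0, x' = 1, alpha = 1/2 the chord lies above f.
lhs = f(0.5)                       # f((1-a)x + a x') = 0.25
rhs = 0.5 * f(0.0) + 0.5 * f(1.0)  # (1-a)f(x) + a f(x') = 0.5
print("concavity violated (lhs < rhs):", lhs < rhs)
```

Of course the grid check only covers finitely many points; the proof above handles the general case.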