I need to give a counterexample to $$E(X) \cdot (1+c) < \max(X)$$ where $c > 0$ and $X$ is a nonnegative random variable.
I tried $X = 10$ with $\mathbb{P}(X = 10) = 0.99$ and $X = 0$ with $\mathbb{P}(X = 0) = 0.01$.
Then $E(X) = 9.9$, so I need $9.9\cdot(1+c) \ge 10$. If $c = 0.02$ the inequality is not satisfied, so this is a counterexample for that $c$. But I have a feeling that this counterexample is not correct since $c$ is a parameter...
Using what angryavian stated in the comments, you can show that for any value of $c > 0$ there is a probability distribution for which the above inequality does not hold.
Let's take your example of assigning probability $p$ to $X = 10$ and probability $1 - p$ to $X = 0$. We then have $E(X) = 10p$ and $\max(X) = 10$. Substituting these values into the inequality, we get: $$10p(1 + c) < 10$$
Cancelling the $10$ on both sides, we see that the inequality holds exactly when $$p(1 + c) < 1.$$
Dividing both sides by $1 + c$, we find that the inequality holds exactly when $$p < \frac{1}{1 + c}.$$
Then, for any value of $c$, we can create a counterexample to the original inequality by assigning $P(X = 10) = \frac{1}{1 + c}$ and $P(X = 0) = 1 - \frac{1}{1 + c}$. Indeed, with this choice $E(X)(1+c) = 10 \cdot \frac{1}{1+c} \cdot (1+c) = 10 = \max(X)$, so the strict inequality fails.
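A quick numerical sanity check of this construction (a sketch using Python's `fractions` module for exact arithmetic, so the boundary case $p = \frac{1}{1+c}$ isn't spoiled by floating-point rounding; the helper name `inequality_holds` is just for illustration):

```python
from fractions import Fraction

def inequality_holds(p, c, m=Fraction(10)):
    """True iff E(X)*(1+c) < max(X) for the two-point distribution
    X = m with probability p, X = 0 with probability 1 - p."""
    expected = m * p            # E(X) = m * p
    return expected * (1 + c) < m

for c in [Fraction(1, 50), Fraction(1, 2), Fraction(10)]:
    p = 1 / (1 + c)                     # the boundary value from above
    assert not inequality_holds(p, c)   # inequality fails: counterexample
    assert inequality_holds(p - Fraction(1, 1000), c)  # any smaller p: it holds
print("all counterexamples verified")
```

This also illustrates why the counterexample is tight: any $p$ strictly below $\frac{1}{1+c}$ makes the inequality hold again.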