Let us consider the simple convex optimization problem \begin{align} \operatorname{minimize}_{x} & \quad \theta(x) + f(x), \tag{$\heartsuit$} \end{align} where both $\theta(x)$ and $f(x)$ are convex; $f(x)$ is smooth, while $\theta(x)$ is not necessarily smooth.
For solving $(\heartsuit)$, the $k$-th iteration of the proximal point algorithm starts from the current iterate $x^k$ and generates the new iterate $x^{k+1}$ via the recursion \begin{align} x^{k+1} = \operatorname{argmin}_x \left\{ \theta(x) + f(x) + \frac{r}{2} \| x - x^k \|_2^2 \right\}, \end{align} where $r > 0$ is the proximal parameter.
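As a concrete illustration (my own, not from the original post), here is a minimal one-dimensional sketch of this recursion, assuming $\theta(x) = \lambda|x|$ and $f(x) = \tfrac{1}{2}(x-b)^2$; for this choice the proximal subproblem has a closed-form soft-threshold solution, and each step can be checked against the contraction inequality discussed below:

```python
import math

def soft_threshold(c, t):
    # prox of t*|.| : shrink c toward zero by t
    return math.copysign(max(abs(c) - t, 0.0), c)

def ppa_step(xk, b, lam, r):
    # One proximal point iteration for theta(x) = lam*|x|, f(x) = 0.5*(x - b)^2:
    #   argmin_x  lam*|x| + 0.5*(x - b)^2 + (r/2)*(x - xk)^2.
    # Completing the square reduces this to a single soft-threshold.
    c = (b + r * xk) / (1.0 + r)
    return soft_threshold(c, lam / (1.0 + r))

b, lam, r = 2.0, 0.5, 1.0                # illustrative problem data
x_star = soft_threshold(b, lam)          # minimizer of lam*|x| + 0.5*(x - b)^2
x = 10.0
for _ in range(60):
    x_new = ppa_step(x, b, lam, r)
    # the Fejér-type contraction from the question, with alpha = 1:
    assert (x_new - x_star) ** 2 <= (x - x_star) ** 2 - (x - x_new) ** 2 + 1e-12
    x = x_new
```

The assertion inside the loop verifies, for this toy instance, exactly the inequality quoted from [1].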
As shown in the literature (see, e.g., slides 9--10 in [1]), one can establish the contraction-type property \begin{align} \|x^{k+1} - x^\star\|^2 \leq \|x^{k} - x^\star\|^2 - \|x^{k} - x^{k+1}\|^2, \end{align} where $x^\star$ denotes a minimizer of $(\heartsuit)$.
Now my question is the following: suppose one can only show that there exists $\alpha$ with $0 < \alpha \leq 1$ such that \begin{align} {\color{red}{\alpha}} \|x^{k+1} - x^\star\|^2 \leq \|x^{k} - x^\star\|^2 - \|x^{k} - x^{k+1}\|^2. \end{align}
Does this weaker inequality still show convergence? In my eyes it does; please correct me if I am wrong.
If you agree that it does show convergence, how should one interpret such a contraction?
Should the two terms $\|x^{k+1} - x^\star\|^2$ and $\|x^{k} - x^\star\|^2$ carry equal weight? In other words, should the factor $\alpha$ appear on both $\|x^{k+1} - x^\star\|^2$ and $\|x^{k} - x^\star\|^2$?
This inequality alone does not show convergence, even with $\alpha = 1$. It only implies boundedness of $(x_k)$ and $\|x_{k+1}-x_k\|\to 0$. Note that a constant sequence $x_k = x^0 \neq x^*$ also satisfies the inequality, yet it does not converge to $x^*$.
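For completeness (a standard telescoping argument, spelled out here), the two claims follow by rearranging the inequality with $\alpha = 1$ as $\|x_j - x_{j+1}\|^2 \leq \|x_j - x^*\|^2 - \|x_{j+1} - x^*\|^2$ and summing over $j = 0, \dots, k$:
\begin{align}
\sum_{j=0}^{k} \|x_j - x_{j+1}\|^2 \leq \|x_0 - x^*\|^2 - \|x_{k+1} - x^*\|^2 \leq \|x_0 - x^*\|^2.
\end{align}
Hence the series of squared differences converges, so $\|x_{k+1} - x_k\| \to 0$, and $\|x_k - x^*\|$ is nonincreasing, so $(x_k)$ is bounded. Neither fact forces $x_k \to x^*$.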
If the inequality is fulfilled only for some $\alpha \in (0,1)$, then not even boundedness can be guaranteed: with $x^* = 0$, the sequence $x_k = \left(\frac{2}{1+\alpha}\right)^k$ satisfies the inequality (in fact with equality), but $(x_k)$ is unbounded since $\frac{2}{1+\alpha} > 1$.
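A quick numerical check (my own, not part of the answer) confirms this counterexample: with $x^* = 0$ and $x_k = \rho^k$, $\rho = \frac{2}{1+\alpha}$, the $\alpha$-weighted inequality holds at every $k$, while the sequence grows without bound:

```python
alpha = 0.5                      # any alpha in (0, 1) works
rho = 2.0 / (1.0 + alpha)        # ratio > 1, so x_k = rho**k diverges
x_star = 0.0
for k in range(40):
    xk, xk1 = rho ** k, rho ** (k + 1)
    lhs = alpha * (xk1 - x_star) ** 2
    rhs = (xk - x_star) ** 2 - (xk - xk1) ** 2
    # the alpha-weighted inequality holds (with equality, up to rounding)
    assert lhs <= rhs * (1.0 + 1e-12)
assert rho ** 40 > 1e4           # yet the sequence is unbounded
```

The equality case can also be seen directly: both sides reduce to $\rho^{2k+1}(2-\rho)$ precisely when $\alpha\rho = 2 - \rho$, i.e. $\rho = \frac{2}{1+\alpha}$.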