I am trying to understand the proof of Theorem 37 on page 84 of the book Stochastic Integration and Differential Equations by P. Protter.
In the proof there is the following statement, referring to the semimartingale $X$:
(1) "Since $X$ has càdlàg paths, there are only a finite number of $s$'s such that $|\Delta X_s| > 1/2$ on each compact interval (fixed $\omega$)."
Why is this? I know that if $X$ has càdlàg paths then it has countably many jumps. Is the statement related to the fact that $X$ is a semimartingale and therefore decomposes into a local martingale plus a finite variation process?
Another implication I am not sure about is:
(2) since $\log (V_t) \leq [X,X]_t$, it follows that $\log (V_t)$ is a process of finite variation.
Is it enough to say that if a process is bounded by a finite variation one, then it is of finite variation as well?

Proof:

Any càdlàg function $f: [0,T] \to \mathbb{R}$ has at most finitely many jumps with jump size $>\epsilon$ for any (fixed) $\epsilon>0$, see e.g. this answer.
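Here is a sketch of the standard compactness argument (note that no semimartingale structure is needed, only the càdlàg property). Fix $\epsilon > 0$ and set
$$S_\epsilon := \{ s \in [0,T] : |\Delta f(s)| > \epsilon \}, \qquad \Delta f(s) := f(s) - f(s-).$$
If $S_\epsilon$ were infinite, then by compactness of $[0,T]$ it would have an accumulation point $t_0$, so we could pick a monotone sequence $s_n \to t_0$ of distinct points of $S_\epsilon$ (say $s_n \uparrow t_0$; the case $s_n \downarrow t_0$ is symmetric, using right continuity). Since the left limit $f(t_0-)$ exists, both $f(s_n)$ and $f(s_n-)$ converge to $f(t_0-)$, hence
$$|\Delta f(s_n)| = |f(s_n) - f(s_n-)| \longrightarrow 0,$$
contradicting $|\Delta f(s_n)| > \epsilon$ for all $n$. So $S_\epsilon$ is finite; taking $\epsilon = 1/2$ gives the statement quoted from Protter.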
The estimate "$\leq [X,X]_t$" is not used to conclude that $\log V_t$ is of bounded variation, but to show that the series $\sum_{s \leq t} (\log(1+U_s)-U_s)$ is absolutely convergent.
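To spell out how the bracket controls the series, here is a sketch, assuming (as in Protter's setup, after removing the finitely many jumps of size $> 1/2$) that $|U_s| \leq 1/2$ and $|U_s| \leq |\Delta X_s|$ for every $s$. Writing the Taylor remainder in integral form, for $|u| \leq 1/2$ we have
$$|\log(1+u) - u| = \left| \int_0^u \frac{-v}{1+v}\, dv \right| \leq \int_0^{|u|} \frac{v}{1-1/2}\, dv = u^2,$$
since $|1+v| \geq 1/2$ on the range of integration. Consequently
$$\sum_{s \leq t} |\log(1+U_s) - U_s| \leq \sum_{s \leq t} U_s^2 \leq \sum_{s \leq t} (\Delta X_s)^2 \leq [X,X]_t < \infty,$$
which is the absolute convergence used below.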
Recall that a function $\alpha$ is of bounded variation if we can write $\alpha = \alpha^+ - \alpha^-$ where both $\alpha^+$ and $\alpha^-$ are increasing functions. For brevity set $Y_s := \log(1+U_s)-U_s$. If we define $$\alpha^+(t) := \sum_{\substack{s \leq t \\ Y_s>0}} Y_s \qquad \text{and} \qquad \alpha^-(t) := \sum_{\substack{s \leq t \\ Y_s<0}} - Y_s,$$ then $$\log V_t = \alpha^+(t)-\alpha^-(t),$$ and $\alpha^+$, $\alpha^-$ are both increasing processes; they are finite precisely because the series $\sum_{s \leq t} Y_s$ converges absolutely, which is what the estimate via $[X,X]_t$ provides. This means that $\log V_t$ is of bounded variation.