Consider a sequence of independent, mean-zero random variables $\{X_i\}_{i=1}^\infty$ and the partial sums $S(n)=\sum_{i=1}^n X_i$. The Hájek–Rényi inequality provides an upper bound for the normalized partial sum: $$\mathbb{P}\left(\max_{n\le t \le m} c_t \lvert S(t) \rvert \ge \epsilon \right) \le \frac{1}{\epsilon^2} \left(c_n^2 \sum_{k=1}^n D_k^2 + \sum_{k=n+1}^m c_k^2 D_k^2\right),$$ where $D_k^2 = \mathrm{Var}(X_k)$ and $\{c_k\}$ is a positive, non-increasing sequence.
If we choose $c_k = \frac{1}{\sqrt{k}}$ and $D_k^2 = \sigma^2$, then $$\mathbb{P}\left(\max_{n\le t \le m} \frac{1}{\sqrt{t}} \lvert S(t) \rvert \ge \epsilon \right) \le \frac{1}{\epsilon^2} \left(\sigma^2 + \sum_{k=n+1}^m \sigma^2/k\right) \approx \frac{1}{\epsilon^2} \left(\sigma^2 + (\log(m) - \log(n)) \sigma^2\right) \le \frac{[1+\log(m)] \sigma^2}{\epsilon^2}.$$
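As a quick numerical sanity check of this bound (not part of the argument), the sketch below estimates the left-hand side by Monte Carlo, assuming standard normal $X_i$ (so $\sigma^2 = 1$); the values of $n$, $m$, $\epsilon$, and the trial count are arbitrary choices for illustration:

```python
import numpy as np

# Monte Carlo check of the Hajek-Renyi bound with c_k = 1/sqrt(k).
# Assumption (not from the post): X_i ~ N(0, 1), so sigma^2 = 1.
rng = np.random.default_rng(0)
n, m, eps = 10, 200, 3.0
sigma2 = 1.0
trials = 2000

hits = 0
for _ in range(trials):
    x = rng.standard_normal(m)
    s = np.cumsum(x)                    # S(1), ..., S(m)
    t = np.arange(n, m + 1)             # indices n, ..., m
    normed = np.abs(s[t - 1]) / np.sqrt(t)
    if normed.max() >= eps:
        hits += 1

empirical = hits / trials
# The (approximate) right-hand side: (sigma^2 + (log m - log n) sigma^2) / eps^2.
bound = (sigma2 + sigma2 * (np.log(m) - np.log(n))) / eps**2
print(empirical, bound)
```

The empirical frequency should sit well below the bound, which is quite loose for these parameters.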
I am now considering a similar setting where the maximum is taken over two indices. Does the probability $$\mathbb{P}\left(\max_{n\le t < k \le m} \frac{1}{\sqrt{k - t}} \lvert S(k) - S(t) \rvert \ge \epsilon \right)$$ also admit an upper bound of order $\log(m)$?
Or, more generally, for an arbitrary sequence $\{c_k\}$, I am interested in an upper bound for $$\mathbb{P}\left(\max_{n\le t < k \le m} c_{k - t} \lvert S(k) - S(t) \rvert \ge \epsilon \right).$$
Update: Solution draft
I have made some progress on this question for $c_k=1/\sqrt{k}$. First, we may reformulate the problem as showing $$\mathbb{P}\left(\max_{1\le t < k \le m} \frac{1}{\sqrt{k - t}} \lvert S(k) - S(t) \rvert \ge \epsilon \right) \le \frac{C\log(m)}{\epsilon^2}$$ for some constant $C$ depending on $\sigma^2$.
Then split the maximum into long gaps ($k - t > a_m$) and short gaps ($k - t \le a_m$): $$\max_{1\le t < k \le m} \frac{1}{\sqrt{k - t}} \lvert S(k) - S(t) \rvert \le \max_{\substack{1\le t < k \le m \\ k - t > a_m}} \frac{1}{\sqrt{k - t}} \lvert S(k) - S(t) \rvert + \max_{\substack{1\le t < k \le m \\ k - t \le a_m}} \frac{1}{\sqrt{k - t}} \lvert S(k) - S(t) \rvert.$$
For sufficiently large $a_m$, e.g. $a_m = \mathrm{O}(\log(m))$, we can treat the first part as approximately Gaussian (via a Berry–Esseen bound). But what about the second part?
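The question remains open, but a small simulation can at least probe how the two-index maximum behaves as $m$ grows. The sketch below, again assuming standard normal increments (an assumption of the sketch, not of the question), estimates the tail probability for a few values of $m$:

```python
import numpy as np

# Estimate P( max_{1 <= t < k <= m} |S(k) - S(t)| / sqrt(k - t) >= eps )
# by Monte Carlo, assuming X_i ~ N(0, 1). Not a proof; just an experiment.
rng = np.random.default_rng(1)

def two_index_max(x):
    """Max of |S(k) - S(t)| / sqrt(k - t) over all pairs 1 <= t < k <= m."""
    s = np.cumsum(x)                          # S(1), ..., S(m)
    diffs = np.abs(s[None, :] - s[:, None])   # entry (t, k): |S(k) - S(t)|
    gaps = np.arange(len(x))[None, :] - np.arange(len(x))[:, None]  # k - t
    upper = gaps > 0                          # keep only t < k
    return (diffs[upper] / np.sqrt(gaps[upper])).max()

eps, trials = 3.5, 300
probs = []
for m in (50, 100, 200):
    hits = sum(two_index_max(rng.standard_normal(m)) >= eps for _ in range(trials))
    probs.append(hits / trials)
print(probs)
```

Comparing the estimates across $m$ (against, say, $\log(m)/\epsilon^2$ at several values of $\epsilon$) may hint at whether a $\log(m)$-ordered bound is plausible before attempting a proof of the short-gap part.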