I'm reading a paper in which the convergence analysis of an algorithm assumes that the gradient of the objective function is Lipschitz continuous with constant $L > 0$. In the analysis, a hyperparameter $\delta > 0$ is used to update some of the variables. To guarantee convergence, a condition of the form $\delta < L^2$ is needed, which amounts to requiring that $L$ be greater than a certain constant $C$.
Since I'm a novice in this area: is it okay to have conditions on the Lipschitz constant of the type $L > C$? Does this mean that the algorithm converges only for a specific class of functions?
Note that if a function is Lipschitz continuous with constant $L$, then it is Lipschitz continuous with any constant $L' \geq L$. Making the constant bigger is no issue - making it smaller is where you might run into problems. The condition $L^2 > \delta$ should therefore be interpreted as a constraint on $\delta$, not on $L$. It's normal in optimization for the Lipschitz constant of the gradient to play an important role in the design or analysis of optimization algorithms.
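To make the first point concrete, here is a small numerical sketch (my own illustration, not the paper's algorithm): gradient descent on $f(x) = \tfrac{1}{2}x^2$, whose gradient $f'(x) = x$ is Lipschitz with $L = 1$. Using a step size $1/\hat{L}$ for any estimate $\hat{L} \geq L$ still converges - overestimating $L$ is harmless, while underestimating it can break convergence.

```python
def grad_descent(L_est, x0=5.0, iters=200):
    """Gradient descent on f(x) = 0.5 * x^2 (so f'(x) = x, true L = 1),
    with step size 1 / L_est. Returns the final iterate."""
    x = x0
    for _ in range(iters):
        x -= (1.0 / L_est) * x  # gradient step: x - (1/L_est) * f'(x)
    return x

print(abs(grad_descent(L_est=1.0)))   # true constant: converges to 0
print(abs(grad_descent(L_est=10.0)))  # overestimate: slower, but still converges
print(abs(grad_descent(L_est=0.4)))   # underestimate: step too large, diverges
```

The first two runs drive the iterate to (essentially) zero; the last one blows up, because the step $1/\hat{L} = 2.5$ makes the update map $x \mapsto -1.5x$ expansive. This is exactly why analyses only ever need an upper bound on $L$.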