Question: How can we prove that $$L(n)=\sum_{k=1}^n\left|\cot \sqrt2\pi k\right|=\Theta(n\log n)$$ as $n\to\infty$?
Furthermore, if $\sqrt2$ is replaced by an arbitrary quadratic irrational, does the result still hold?
Numerical experiment.
By plotting $$\frac1{n\ln n}\sum_{k=1}^n\left|\cot \sqrt2\pi k\right|,$$ we find that it appears to tend to approximately $0.6$.
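The plotted quantity is easy to reproduce. The following is a minimal sketch (plain Python, no plotting) of the numerical experiment; the observed value near $0.6$ is consistent with the constant $2/\pi\approx 0.6366$ derived below, with slow convergence to be expected from an $O(n\log\log n)$ error term.

```python
import math

def ratio(n: int) -> float:
    """(1 / (n log n)) * sum_{k<=n} |cot(sqrt(2) * pi * k)|."""
    sqrt2 = math.sqrt(2)
    total = 0.0
    for k in range(1, n + 1):
        frac = (k * sqrt2) % 1.0            # fractional part of k*sqrt(2)
        total += abs(1.0 / math.tan(math.pi * frac))
    return total / (n * math.log(n))

print(ratio(100_000))   # compare with 2/pi ~ 0.6366
```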

Failed attempt at an upper bound.
$$L(n)<\sum_{k=1}^nCk=C\frac{n(n+1)}2$$ for some $C>0$. This follows easily because $\sqrt2$ is badly approximable, i.e. $\|k\sqrt2\|\geq\frac1{Ck}$ (a quantitative form of the fact that $\sqrt2$ has irrationality measure $2$).
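The bound $|\cot\sqrt2\pi k|=O(k)$ behind this estimate can be checked directly: from $|2m^2-p^2|\geq 1$ one gets $\|m\sqrt2\|\geq\frac1{(2\sqrt2+1)m}$, hence $|\cot\pi m\sqrt2|\leq\frac1{\pi\|m\sqrt2\|}\leq\frac{(2\sqrt2+1)m}{\pi}<1.22\,m$. A quick numerical sketch (the tighter bound $1.0$ in the comment is empirical, not proved):

```python
import math

sqrt2 = math.sqrt(2)

def term(k: int) -> float:
    """|cot(sqrt(2) * pi * k)|, computed from the fractional part of k*sqrt(2)."""
    return abs(1.0 / math.tan(math.pi * ((k * sqrt2) % 1.0)))

# ||m*sqrt2|| >= 1/((2*sqrt2+1)*m) gives term(k) <= (2*sqrt2+1)*k/pi < 1.22*k;
# numerically the ratio term(k)/k even stays below 1.
worst = max(term(k) / k for k in range(1, 100_001))
print(worst)
```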
Failed attempt at a lower bound.
Asymptotically, half of the summands are greater than $1$, by the equidistribution of $(k\sqrt2)$ modulo $1$. Therefore, $L(n)>Dn$ for some $D>0$ once $n$ is large enough.
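This can be made concrete: $|\cot\pi x|>1$ precisely when $\|x\|<\frac14$, a set of measure $\frac12$, so by equidistribution about half of the first $n$ summands exceed $1$. A quick empirical check:

```python
import math

sqrt2 = math.sqrt(2)
n = 100_000

# |cot(pi*x)| > 1 iff ||x|| < 1/4, a set of measure 1/2; by equidistribution
# of (k*sqrt2 mod 1), about half of the first n summands exceed 1.
count = sum(1 for k in range(1, n + 1)
            if abs(1.0 / math.tan(math.pi * ((k * sqrt2) % 1.0))) > 1.0)
print(count / n)   # very close to 0.5
```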
We prove the following.
Theorem
Let $\theta$ be an irrational number whose continued fraction has bounded partial quotients (in particular, any quadratic irrational such as $\sqrt2$). Then $$\sum_{k=1}^n |\cot \pi k\theta| = \frac2{\pi} n\log n + O(n\log\log n).$$
First, we need a lemma, stated under the standing assumption that $\theta$ is an irrational number with bounded partial quotients; every quadratic irrational, in particular $\sqrt2$, has this property. It is proved using basic properties of simple continued fractions.
Lemma
There is a constant $c>0$, depending only on $\theta$, such that $$\|m\theta\|\geq\frac1{cm}\quad\text{for all integers } m\geq1.\tag{1}$$
Let $D_n$ be the discrepancy of the sequence $(k\theta)$ of fractional parts of $k\theta$, i.e. $$ D_n:=\sup_{0\leq a\leq b\leq 1} \left|\frac1n \#\{1\leq k\leq n: (k\theta) \in (a,b) \} -(b-a)\right|. $$ An important inequality for $D_n$ is also needed; this is Theorem 3.4 in Kuipers & Niederreiter, Uniform Distribution of Sequences.
Lemma
If $\theta$ has bounded partial quotients, then $$D_n=O\!\left(\frac{\log n}n\right).\tag{2}$$
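The discrepancy inequality from Kuipers & Niederreiter can be sanity-checked numerically via the star discrepancy $D_n^*$ (which satisfies $D_n\leq 2D_n^*$), computed by the standard sorted-points formula. A sketch for $\theta=\sqrt2$ (the factor $5$ in the comparison below is an empirical choice, not a constant from the theorem):

```python
import math

def star_discrepancy(points):
    """D_n^* of a finite point set in [0,1), via the sorted-points formula:
    D_n^* = max_i max(i/n - x_(i), x_(i) - (i-1)/n)."""
    xs = sorted(points)
    n = len(xs)
    return max(max(i / n - x, x - (i - 1) / n) for i, x in enumerate(xs, 1))

sqrt2 = math.sqrt(2)
n = 10_000
d = star_discrepancy([(k * sqrt2) % 1.0 for k in range(1, n + 1)])
print(n * d / math.log(n))   # stays bounded, consistent with D_n = O(log n / n)
```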
We also use Greg Martin's comment in the following form: $$ |\cot \pi x|=\frac1{\pi\|x\|}+O(1),$$ where $\|x\|$ denotes the distance from $x$ to the nearest integer.
Now, split the interval $[0,1)$ into $h+2$ short intervals, where $h+2\asymp \frac n{\log^2 n}$: $$ \left[0,\frac{\log^2 n}n\right), \left[\frac{\log^2 n}n, \frac{2\log^2 n}n\right), \ldots, \left[\frac{h\log^2 n}n, \frac{(h+1)\log^2 n}n\right), \left[ \frac{(h+1)\log^2 n}n,1\right). $$ Since the summands depend only on $\|k\theta\|$ and $\|x\|=\|1-x\|$, it suffices to work with the first half of these intervals.
By (2), for each $0\leq j\leq h$, the number $i_j(n)$ of elements of the sequence $(k\theta)$, $1\leq k\leq n$, lying in $[(j\log^2 n)/n, ((j+1)\log^2 n)/n)$ satisfies $$ \left|i_j(n)- \log^2n \right|=O(\log n). $$ From this, estimating each summand by the right endpoint of its interval, we obtain the lower bound: \begin{align} \sum_{k=1}^n |\cot \pi k\theta| &= \sum_{k=1}^n \frac1{\pi\|k\theta\|}+O(n)\\ &\geq \frac2{\pi}\sum_{1\leq j\leq h/2} \frac n{j\log^2 n} (\log^2 n + O(\log n)) +O(n)\\ &\geq \frac2{\pi} n\log n + O(n\log\log n). \end{align}
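The count estimate for $i_j(n)$ is easy to check empirically for $\theta=\sqrt2$: the sketch below bins $(k\sqrt2)$, $k\leq n$, into intervals of length $\log^2 n/n$ and measures the worst deviation from the expected count $\log^2 n$ (the threshold $5\log n$ in the assertion is an empirical constant, not one from the proof).

```python
import math

sqrt2 = math.sqrt(2)
n = 100_000
width = math.log(n) ** 2 / n            # interval length log^2(n) / n
h = int(1 / width) - 1                  # so there are h + 2 ~ n / log^2(n) bins

counts = [0] * (h + 2)
for k in range(1, n + 1):
    j = min(int(((k * sqrt2) % 1.0) / width), h + 1)   # bin index, last bin clipped
    counts[j] += 1

# Each full-length interval should hold log^2(n) + O(log n) points.
dev = max(abs(c - math.log(n) ** 2) for c in counts[:h + 1])
print(dev, math.log(n))   # the deviation is a small multiple of log n
```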
For the upper bound, we need a more precise estimate on the first short interval.
If $0\leq p < q \leq n$, then by (1) we have $$ |(p\theta)-(q\theta)| \geq \|(q-p)\theta\| \geq \frac1{c(q-p)} \geq \frac1{cn}. \tag{3}$$
We split $[0, (\log^2 n)/n)$ into $t+2\asymp \log^2 n$ shorter intervals $$ \left[0,\frac1{2cn}\right), \left[\frac1{2cn},\frac2{2cn}\right), \ldots, \left[\frac t{2cn}, \frac{t+1}{2cn}\right), \left[ \frac{t+1}{2cn},\frac{\log^2n}n\right). $$ By (3), each of these intervals contains at most one number of the form $(k\theta)$ with $1\leq k\leq n$, and no such number lies in the first interval.
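The spacing claim can be checked numerically for $\theta=\sqrt2$: since $|2m^2-p^2|\geq1$, the quantity $m\|m\sqrt2\|$ is bounded below by $\frac1{2\sqrt2+1}>0.26$, so distinct points $(p\sqrt2)$, $(q\sqrt2)$ with $0\leq p<q\leq n$ are at least $\frac1{cn}$ apart. A sketch (the range $(0.34,0.35)$ in the comment is the empirically observed minimum, attained at $m=2$):

```python
import math

sqrt2 = math.sqrt(2)
n = 10_000

def dist_to_int(x: float) -> float:
    """||x||: distance from x to the nearest integer."""
    return abs(x - round(x))

# Distinct points (p*sqrt2 mod 1) and (q*sqrt2 mod 1) with 0 <= p < q <= n differ
# by at least ||(q-p)*sqrt2|| >= 1/(c*n), since m*||m*sqrt2|| is bounded below.
low = min(m * dist_to_int(m * sqrt2) for m in range(1, n + 1))
print(low)   # about 0.3431, attained at m = 2
```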
Then, estimating by the left endpoints, we have \begin{align} \sum_{k=1}^n |\cot \pi k \theta |&=\sum_{k=1}^n \frac1{\pi\|k\theta\|}+O(n)\\ &\leq \sum_{j\leq 2c\log^2 n} \frac{2cn}j + \frac2{\pi}\sum_{j\leq 1+h/2} \frac n{j\log^2 n}(\log^2 n + O(\log n)) + O(n)\\ &=\frac 2{\pi} n\log n + O(n\log\log n). \end{align} Hence, we obtain $$ \sum_{k=1}^n |\cot \pi k \theta |=\frac2{\pi} n\log n+ O(n\log\log n). $$