Let $\lambda\in[0,1]$, let $\Omega=[0,1]^2$, let $\boldsymbol{m}$ and $\boldsymbol{n}$ be two linearly independent vectors, let $i\in\mathbb{N}$, and let $h(t)$ be the $1$-periodic extension of
$$\tilde{h}(t):=\begin{cases} (1-\lambda)t &t\in[0,\lambda)\\ -\lambda(t-1) &t\in [\lambda,1]. \end{cases}$$
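Unless I am mistaken, $\tilde{h}$ is a tent function: the two branches agree at $t=\lambda$ (each gives $\lambda(1-\lambda)$), it vanishes at $t=0$ and $t=1$, and
$$\max_{t\in[0,1]}\tilde{h}(t)=\lambda(1-\lambda)\leq\tfrac{1}{4},\qquad \int_0^1\tilde{h}(t)\,dt=\frac{\lambda(1-\lambda)}{2}\leq\tfrac{1}{8}.$$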
I would like to show that $$\mathcal{L}^{2}\Big(\big\{\boldsymbol{x}\in \Omega\;\big|\;ik\leq i\boldsymbol{x}\cdot \boldsymbol{n} \leq h(i\boldsymbol{x}\cdot \boldsymbol{m})+ik\big\}\Big)\;\leq\tfrac{1}{i},$$ where $k$ is chosen so that the line $\boldsymbol{x}\cdot \boldsymbol{n}=k$ intersects $\Omega$.
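If I divide the defining inequalities by $i$, the set becomes
$$\Big\{\boldsymbol{x}\in\Omega\;\Big|\;k\leq \boldsymbol{x}\cdot\boldsymbol{n}\leq k+\tfrac{1}{i}\,h(i\,\boldsymbol{x}\cdot\boldsymbol{m})\Big\},$$
i.e. (if I am reading the definition correctly) the region between the line $\boldsymbol{x}\cdot\boldsymbol{n}=k$ and a sawtooth curve of height at most $\lambda(1-\lambda)/i\leq\tfrac{1}{4i}$, whose teeth repeat with period $1/i$ in the $\boldsymbol{m}$-direction.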
The region is then a union of thin, roughly triangular teeth (one per period of the sawtooth), but I am not sure how to compute its measure. Since the problem is very specific, I would appreciate a hint on how to tackle it rather than a full solution.
An example would be $$\boldsymbol{m}=\begin{pmatrix} 1 \\ 0 \end{pmatrix},\;\boldsymbol{n}=\begin{pmatrix} 0 \\ 1 \end{pmatrix},\; \lambda=\tfrac{1}{2},\;k=\tfrac{1}{2}.$$ Would it be enough to verify the claim for a specific example like this one?
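For what it is worth, in this example I believe I can compute the measure directly (please correct me if I slipped): the set is $\big\{(x,y)\in[0,1]^2\;\big|\;\tfrac{1}{2}\leq y\leq \tfrac{1}{2}+\tfrac{1}{i}h(ix)\big\}$, and since $h\leq\tfrac{1}{4}$ the upper boundary stays inside $\Omega$, so Fubini and the substitution $t=ix$ (using that $h$ is $1$-periodic and $i\in\mathbb{N}$, so $[0,i]$ contains exactly $i$ full periods) give
$$\int_0^1 \frac{h(ix)}{i}\,dx \;=\;\frac{1}{i}\int_0^1 \tilde{h}(t)\,dt\;=\;\frac{\lambda(1-\lambda)}{2i}\;=\;\frac{1}{8i}\;\leq\;\frac{1}{i}.$$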