We want to test $H_0: \theta=\theta_0$ against $H_1: \theta=\theta_1$ where $0<\theta_1<\theta_0$ and where $X_1,...,X_n\sim$ Uniform$(0,\theta)$ are IID. We have critical value $c$.
My problem is: I want to justify why the significance level, $\alpha$, must be such that $\alpha\geq\frac{\theta_1^n}{\theta_0^n}$ in the case where $c\leq\frac{\theta_0^n}{\theta_1^n}$
I've computed the Neyman-Pearson test statistic to be $$ T(\textbf{x})=\frac{f(\textbf{x};\theta_1)}{f(\textbf{x};\theta_0)}=\frac{\frac{1}{\theta_1^n}\mathbb{1}\{x_1,...,x_n\in[0,\theta_1]\}}{\frac{1}{\theta_0^n}\mathbb{1}\{x_1,...,x_n\in[0,\theta_0]\}}= \frac{\theta_0^n \mathbb{1}\{x_1,...,x_n\in[0,\theta_1]\}}{\theta_1^n\mathbb{1}\{x_1,...,x_n\in[0,\theta_0]\}} $$
This implies that the power function is $$\mathbb{P}\bigg( \frac{\theta_0^n \mathbb{1}\{x_1,...,x_n\in[0,\theta_1]\}}{\theta_1^n\mathbb{1}\{x_1,...,x_n\in[0,\theta_0]\}}\geq c ;\theta\bigg) $$
Now in the case where $x_1,...,x_n\in[0,\theta_1]$, if $\frac{\theta_0^n}{\theta_1^n}\geq c$ then clearly the power function is $1$, and if $\frac{\theta_0^n}{\theta_1^n}< c$ then clearly it is $0$.
Now in the case where there exists $i$ such that $x_i\in(\theta_1,\theta_0]$ then the power function becomes $\mathbb{P}(0\geq c)=0$.
So going back to the problem: when $c\leq\frac{\theta_0^n}{\theta_1^n}$, how can we deduce from what I've done that $\alpha=\mathbb{P}(T(\textbf{X})\geq c;\theta_0)\geq \frac{\theta_1^n}{\theta_0^n}$? Perhaps I've done something wrong, because all I am getting is that the power function is either $0$ or $1$, and hence that $\alpha$ is either $0$ or $1$?
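Edit: as a sanity check I ran a quick simulation (my own sketch; the particular values of $\theta_0,\theta_1,n,c$ below are arbitrary). Under $\theta_0$, the rejection probability seems to come out as $\frac{\theta_1^n}{\theta_0^n}$ rather than $0$ or $1$:

```python
import numpy as np

rng = np.random.default_rng(0)

theta0, theta1, n = 3.0, 2.0, 5  # illustrative values
c = 2.0                          # any threshold with 0 < c <= (theta0/theta1)^n
reps = 200_000

# Draw samples under H0: X_i ~ Uniform(0, theta0)
x = rng.uniform(0.0, theta0, size=(reps, n))

# T(x) = (theta0/theta1)^n if all x_i <= theta1, else 0
T = np.where((x <= theta1).all(axis=1), (theta0 / theta1) ** n, 0.0)

alpha_hat = (T >= c).mean()
print(alpha_hat, (theta1 / theta0) ** n)  # simulated level vs (theta1/theta0)^n
```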
Your likelihood ratio is
$$\frac{L(\theta_1|\mathbf{x})}{L(\theta_0|\mathbf{x})}=\dots=\begin{cases} \Big(\frac{\theta_0}{\theta_1}\Big)^n, & \text{if $0<x_{(n)}\leq \theta_1$ } \\ 0, & \text{if $\theta_1<x_{(n)}\leq \theta_0$} \end{cases}$$
thus the likelihood ratio is a monotone (non-increasing) function of $x_{(n)}$, so we can apply a well-known theorem (Karlin–Rubin) and the critical region is
$$C=\{\mathbf{x}:x_{(n)}<k\}$$
Thus, writing $k$ for the cutoff on $x_{(n)}$ (not to be confused with the likelihood-ratio threshold $c$),
$$\alpha=\int_0^k \frac{n y^{n-1}}{\theta_0^n}\,dy=\frac{k^n}{\theta_0^n}$$
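Inverting this for the cutoff gives $k=\theta_0\,\alpha^{1/n}$, so any level in $(0,1)$ is attainable. A minimal sketch (the target level $\alpha=0.05$ and the values $\theta_0=3$, $n=2$ are arbitrary):

```python
# Invert alpha = k^n / theta0^n to get the cutoff k on x_(n)
# that achieves a target significance level (illustrative values).
theta0, n = 3.0, 2
alpha = 0.05
k = theta0 * alpha ** (1.0 / n)
print(k, k ** n / theta0 ** n)  # cutoff, and the level it gives back
```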
Given this, I do not think your statement can be justified.
Counterexample:
Set $\theta_0=3$, $\theta_1=2$, $n=2$.
Any cutoff $k<\theta_1$ on $x_{(n)}$ gives a level below $\frac{\theta_1^n}{\theta_0^n}$: say $k=\frac{3}{2}$, then $\alpha=\frac{(3/2)^2}{3^2}=\frac{1}{4}<\frac{4}{9}=\frac{\theta_1^n}{\theta_0^n}$.
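A quick Monte Carlo (my own sketch; only the counterexample's values are assumed) confirms this level:

```python
import numpy as np

rng = np.random.default_rng(1)

theta0, n = 3.0, 2   # counterexample values
k = 1.5              # cutoff on x_(n)
reps = 200_000

# Under H0 the sample is Uniform(0, theta0); reject when x_(n) < k
x_max = rng.uniform(0.0, theta0, size=(reps, n)).max(axis=1)
alpha_hat = (x_max < k).mean()

print(alpha_hat)  # should be near 1.5^2 / 3^2 = 0.25, below 4/9
```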
This is a graphical explanation: the two drawings represent the one-sided test with your uniform. The Type I error (significance level) is the purple area. As you can see,
in the left drawing, which is the case $\theta_1<\theta_0$, as the cutoff $k$ decreases, so does $\alpha$;
in the right drawing, which is the case $\theta_1>\theta_0$, as $k$ decreases, $\alpha$ increases.
Between $k$ and $\alpha$ there is a simple relation:
$$\alpha=\frac{k^n}{\theta_0^n}$$
$$\alpha=1-\frac{k^n}{\theta_0^n}$$
respectively, in the two cases.
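For completeness, the second relation follows the same way: writing $k$ for the cutoff on $x_{(n)}$, the test now rejects for large $x_{(n)}$, so under $\theta_0$
$$\alpha=\mathbb{P}(X_{(n)}>k;\theta_0)=1-\int_0^k\frac{n y^{n-1}}{\theta_0^n}\,dy=1-\frac{k^n}{\theta_0^n}.$$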