The definition of an asymptotic variance says:
For a sequence of estimators $\mathbf{U}=(U_1, U_2,\ldots)$, where $U_i=U_i(X_1,\ldots,X_i)$: if for some sequence of constants $\{k_n\}$ $$k_n(U_n-\theta)\xrightarrow{d.}\mathcal{N}(0,\sigma^2)$$ then $\sigma^2$ is called the asymptotic variance.
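To make the definition concrete, here is a small numerical illustration (my own, not part of the question): for i.i.d. Exponential(1) data with $\theta=1$ and $k_n=\sqrt n$, the empirical variance of $k_n(\bar X_n - \theta)$ should be close to the asymptotic variance $\sigma^2=1$.

```python
import numpy as np

# Illustration (assumed setup, not from the question): i.i.d. Exponential(1)
# data, theta = 1, k_n = sqrt(n); the asymptotic variance is sigma^2 = 1.
rng = np.random.default_rng(0)
n, reps = 500, 10_000
X = rng.exponential(1.0, size=(reps, n))
Z = np.sqrt(n) * (X.mean(axis=1) - 1.0)  # k_n * (U_n - theta)
print(Z.var())  # ≈ 1, the asymptotic variance
```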
Is it therefore true that an asymptotic variance exists if and only if the estimator, normalised by $k_n$, converges to a normal distribution?
What if an estimator converges to a distribution that is not normal? Do we then say that the asymptotic variance does not exist?
Let's consider estimating $\theta$ when $X_1,X_2,X_3,\ldots$ are independent and uniformly distributed on the interval $(0,\theta)$. The likelihood based on the first $n$ observations is $$ \begin{align} L(\theta) & = \begin{cases} \theta^{-n} & \text{if } \theta\ge (\text{all of } X_1,\ldots,X_n), \\ 0 & \text{if } (\text{at least one of }X_1,\ldots,X_n) > \theta. \end{cases} \\[10pt] & = \begin{cases} \theta^{-n} & \text{if }\max\{X_1,\ldots,X_n\} \le\theta, \\ 0 & \text{if }\max\{X_1,\ldots,X_n\}>\theta. \end{cases} \end{align} $$ Since $\theta^{-n}$ is decreasing in $\theta$, the likelihood is maximised at the smallest admissible value of $\theta$, so the maximum-likelihood estimator is $\widehat \theta = \max\{X_1,\ldots,X_n\}$.
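A quick simulation of this estimator (my own sketch; the true $\theta$ is assumed for illustration):

```python
import numpy as np

# Sketch (assumed parameters): the MLE for Uniform(0, theta) is the sample maximum.
rng = np.random.default_rng(1)
theta = 3.0  # true parameter, chosen for the simulation
x = rng.uniform(0.0, theta, size=1_000)
theta_hat = x.max()  # maximum-likelihood estimator
print(theta_hat)     # slightly below theta, since max{X_i} < theta almost surely
```

Note that $\widehat\theta$ always underestimates $\theta$, which is why its error is one-sided and cannot be asymptotically normal.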
Proposition: $n(\theta-\widehat\theta\,)\xrightarrow{d.}\operatorname{Exponential}(\text{mean }\theta)$ as $n\to\infty$. Unlike the usual situation with asymptotically normal estimators, one multiplies by $n$ rather than by the more slowly growing $\sqrt n$, and the limiting distribution is exponential rather than normal. The variance of an exponential distribution is proportional to the square of its mean.
Sketch of proof of the proposition: Consider the number of the random variables $\theta - X_1,\ldots, \theta - X_n$ that fall within the interval $(0,\theta t/n)$. This is the number of "successes" in $n$ independent trials with probability $t/n$ of success on each trial, i.e. a binomial distribution with parameters $n$ and $t/n$. As $n\to\infty$ its expected value remains equal to $t$, and hence its distribution approaches a Poisson distribution with expected value $t$. Now $\theta-\widehat\theta = \min\{\theta-X_1,\ldots,\theta-X_n\}$ is the waiting time until the first arrival, and the event $n(\theta-\widehat\theta\,)/\theta > t$ occurs exactly when there are no arrivals in $(0,\theta t/n)$, so $\Pr\big(n(\theta-\widehat\theta\,)/\theta > t\big)\to e^{-t}$: the scaled waiting time until the first arrival is asymptotically exponential, just as in a Poisson process. $\qquad\blacksquare$
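The proposition can be checked by simulation (my own sketch, with assumed parameters): the scaled error $n(\theta-\widehat\theta\,)/\theta$ should behave like an Exponential(1) variable, with mean $1$ and survival probability $e^{-1}$ at $t=1$.

```python
import numpy as np

# Simulation check of the proposition (assumed setup):
# n * (theta - max{X_1,...,X_n}) / theta is approximately Exponential(1).
rng = np.random.default_rng(2)
theta, n, reps = 1.0, 1_000, 5_000
x = rng.uniform(0.0, theta, size=(reps, n))
T = n * (theta - x.max(axis=1)) / theta
print(T.mean())          # ≈ 1, the mean of an Exponential(1) variable
print((T > 1.0).mean())  # ≈ exp(-1) ≈ 0.37, the Exponential(1) survival at t = 1
```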