With the help of L'Hopital's rule, I found
$$\lim_{n\to \infty}\left(\frac{\log (n-1)}{\log (n)}\right)^n$$
to be equal to $1$.
How can the limit be found without the use of the rule?
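Before any analysis, a quick numeric check (a Python sketch, not part of the question) suggests the limit is $1$ but that convergence is very slow:

```python
import math

def a(n):
    """Evaluate (log(n-1)/log(n))**n for a given n."""
    return (math.log(n - 1) / math.log(n)) ** n

# The value creeps up toward 1 roughly like exp(-1/log n), so the
# convergence is very slow: even at n = 10**7 it is still below 0.95.
values = [a(10 ** k) for k in range(2, 8)]
print(values)
```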
Let
$$A=\left(\frac{\log (n-1)}{\log (n)}\right)^n,\qquad \log(A)=n\log\left(\frac{\log (n-1)}{\log (n)}\right).$$
Now
$$\log(n-1)=\log\Big(n\Big(1-\frac 1n\Big)\Big)=\log(n)+\log\Big(1-\frac 1n\Big)\approx \log(n)-\frac 1n,$$
so
$$\frac{\log (n-1)}{\log (n)}\approx 1-\frac 1{n \log(n)},\qquad \log\left(\frac{\log (n-1)}{\log (n)}\right)\approx -\frac 1{n \log(n)},\qquad \log(A)\approx -\frac 1{\log(n)}.$$
Hence $\lim_{n\to \infty} \log(A)=0$, and therefore $\lim_{n\to \infty} A=1$.
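The key approximation above, $\log(A)\approx-\frac 1{\log(n)}$, can be checked numerically (an illustrative sketch, not part of the answer); `log1p` is used for numerical stability at large $n$:

```python
import math

def log_A(n):
    # log(A) = n * log(log(n-1)/log(n)); rewrite via log1p for stability,
    # using log(n-1) = log(n) + log1p(-1/n).
    return n * math.log1p(math.log1p(-1.0 / n) / math.log(n))

# log(A) * log(n) should approach -1.
for n in (10**3, 10**6, 10**9):
    print(n, log_A(n) * math.log(n))
```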
$$\frac{\ln(n-1)}{\ln n}=\frac{\ln n+\ln(1-\frac1n)}{\ln n}=1+\frac{\ln(1-\frac1n)}{\ln n},$$ and by Bernoulli's inequality $$\left(1+\frac{\ln(1-\frac1n)}{\ln n}\right)^n \ge 1+\frac{n\ln(1-\frac1n)}{\ln n}.$$ By the Mean Value Theorem, $\ln(1-\frac1n)=-\frac 1n\ln'(\xi)=-\frac1{n\xi}$ for some $\xi$ with $1-\frac1n<\xi<1$, so $\ln(1-\frac1n)\ge -\frac1{n-1}$. Since $\frac{n}{n-1}\le 2$ for $n\ge 2$, this gives $$ 1\ge\left(\frac{\ln(n-1)}{\ln n}\right)^n\ge 1-\frac{n}{(n-1)\ln n}\ge 1-\frac2{\ln n},$$ and the limit $1$ follows by squeezing.
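A numeric sketch of this squeeze (my own check, not part of the answer), using the bound $1-\frac{2}{\ln n}$, which follows from the Bernoulli/MVT estimate together with $\frac{n}{n-1}\le 2$ for $n\ge 2$:

```python
import math

# Numeric sketch of the squeeze. The lower bound 1 - 2/ln(n) is a
# consequence of Bernoulli + MVT, since n/(n-1) <= 2 for n >= 2.
for n in range(2, 10000):
    a = (math.log(n - 1) / math.log(n)) ** n
    assert 1 - 2 / math.log(n) <= a <= 1
print("squeeze verified for n = 2 .. 9999")
```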
Taylor-expansion methods and L'Hopital's-rule methods are morally the same idea, even though they seem different at first glance: L'Hopital's rule computes the limit by examining the derivatives of the functions involved, and Taylor expansion does the same thing. Here is a technique which is fundamentally different, but as you will see, Taylor expansion really makes things much easier; this is not a very elementary proof.
One can show using integration that the Euler-Mascheroni constant is well-defined:
$$ \gamma \overset{def}= \lim_{n \to \infty} \left( \sum_{i=1}^n \frac 1i - \log (n) \right) \in \, ]0,1[ \quad \Longrightarrow \quad \lim_{n \to \infty} \frac{\sum_{i=1}^n \frac 1i}{\log n} = 1 $$
(this is done by using the definition $\log(x) = \int_1^x \frac {dt}t$ and comparing $\int_i^{i+1} \frac {dt}t$ with $\frac 1i$ and $\frac 1{i+1}$). Therefore, working with the reciprocal of the quantity in question (its limit is $1$ if and only if the original limit is $1$),
$$ \lim_{n \to \infty} \left( \frac{\log n}{\log(n-1)} \right)^n = \lim_{n \to \infty} \left( \frac{\left(\sum_{i=1}^{n-1} \frac 1i \right) + \frac 1n}{\sum_{i=1}^{n-1} \frac 1i} \right)^n = \lim_{n \to \infty} \left( 1 + \frac{ \left( \sum_{i=1}^{n-1} \frac 1i \right)^{-1} }n \right)^n = \lim_{n \to \infty} \exp \left( \frac 1{\sum_{i=1}^{n-1} \frac 1i} \right) = \exp(0) = 1, $$
where I used the divergence of the harmonic series and the uniform convergence of the sequence $(1+x/n)^n$ to $e^x$ on the interval $[0,1]$.

To prove uniform convergence, you could use the Arzela-Ascoli theorem, because on $[0,1]$ the functions $(1+x/n)^n$ are all Lipschitz-continuous with Lipschitz constant $\le e$: by the identity $a^n-b^n = (a-b)(\cdots)$,
$$ \left| \left( 1 + \frac xn \right)^n - \left( 1 + \frac yn \right)^n \right| \le \left| \frac{x-y}n \right| \left( n \left(1 + \frac 1n \right)^{n-1} \right) \le e|x-y|. $$
The existence of a convergent subsequence via Arzela-Ascoli gives the convergence of the whole sequence, since the sequence is pointwise increasing on $[0,1]$ (again a detail to prove, a computation usually done when showing that $e^x = \lim_{n \to \infty} (1+x/n)^n$ is well-defined; with a bit of patience and Bernoulli's inequality you can show that the ratio $(1+x/n)^n/(1+x/(n-1))^{n-1}$ is $\ge 1$ on $[0,\infty[$).
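The two facts this answer leans on, $H_n - \log n \to \gamma$ and hence $H_n/\log n \to 1$, are easy to sanity-check numerically (a sketch; the value $0.5772$ for $\gamma$ is standard):

```python
import math

# Sketch: H_n - log(n) tends to the Euler-Mascheroni constant (~0.5772),
# and hence H_n / log(n) tends to 1 (slowly).
n = 10**6
h = 0.0
for i in range(1, n + 1):
    h += 1.0 / i
print(h - math.log(n), h / math.log(n))
```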
So my point is: use Taylor expansion for those things.
Hope that helps,
As $$\left(\frac{\log(n-1)}{\log n}\right)^n=\left(\frac{\log n+\log\frac{n-1}{n}}{\log n}\right)^n=\left(1+\frac{\log\left(1-\frac{1}{n}\right)}{\log n}\right)^n, $$ by the Binomial Theorem we have $$\lim_{n\to\infty}\left(\frac{\log(n-1)}{\log n}\right)^n=\lim_{n\to\infty} \sum_{k=0}^n{n\choose k}\left(\frac{\log\left(1-\frac{1}{n}\right)}{\log n}\right)^k \\ =\lim_{n\to\infty} \sum_{k=0}^n\frac{n!}{k!(n-k)!}\left(\frac{\log\left(1-\frac{1}{n}\right)}{\log n}\right)^k, $$which, since $\frac{n!}{(n-k)!}\sim n^k$, is the same as $$\lim_{n\to\infty}\sum_{k=0}^n\frac{1}{k!}\left(\frac{n \log\left(1-\frac{1}{n}\right)}{\log n}\right)^k\\=\lim_{n\to\infty} \sum_{k=0}^n\frac{1}{k!}\left(\frac{-1}{\log n}\right)^k=\lim_{n\to\infty}e^{\frac{-1}{\log n}}=1.$$ (The termwise replacement of $\binom{n}{k}$ by $\frac{n^k}{k!}$ inside the sum can be made rigorous with Tannery's theorem: $\binom{n}{k}\le\frac{n^k}{k!}$ and $\left|n\log\left(1-\frac1n\right)\right|\le 2$ for $n\ge 2$, so every term is dominated by $\frac{1}{k!}\left(\frac{2}{\log 2}\right)^k$, the general term of a convergent series.)
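The pivotal step $\frac{n!}{(n-k)!}\sim n^k$ for fixed $k$ can be illustrated numerically (a sketch, not part of the answer):

```python
import math

# Sketch: for fixed k, the falling factorial n!/(n-k)! = n(n-1)...(n-k+1)
# is asymptotic to n**k, the step used to replace binom(n,k) by n**k/k!.
def falling(n, k):
    p = 1
    for j in range(k):
        p *= n - j
    return p

for k in (1, 3, 5):
    print(k, [falling(n, k) / n ** k for n in (10**2, 10**4, 10**6)])
```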
No Taylor, no L'Hopital: $\ln (n-1) = \ln n -\int_{n-1}^n \frac{dx}{x}.$ Since $\frac 1n < \int_{n-1}^n \frac{dx}{x} < \frac 1{n-1},$ we have
$$ \ln n - \frac 1{n-1} < \ln (n-1) < \ln n - \frac 1n.$$
Thus
$$1-\frac 1{(n-1)\ln n}<\frac{\ln (n-1)}{\ln n} < 1-\frac 1{n\ln n}$$ $$ \implies \left(1-\frac 1{(n-1)\ln n}\right)^n <\left(\frac{\ln (n-1)}{\ln n}\right)^n < \left(1-\frac 1{n\ln n}\right)^n.$$
Write the power $n$ as $[n\ln n]\cdot \frac{n}{n\ln n}$ to see that the right-hand side $\to (1/e)^0 = 1.$ Same idea for the left-hand side. So the limit is $1.$
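The integral estimate $\frac 1n < \int_{n-1}^n \frac{dx}{x} < \frac 1{n-1}$, which drives this squeeze, can be verified numerically via $\int_{n-1}^n \frac{dx}{x} = \ln n - \ln(n-1)$ (a sketch, not part of the answer):

```python
import math

# Sketch: the integral of 1/x over [n-1, n] equals ln(n) - ln(n-1)
# and is squeezed between 1/n and 1/(n-1).
for n in range(2, 100000):
    gap = math.log(n) - math.log(n - 1)
    assert 1.0 / n < gap < 1.0 / (n - 1)
print("bounds verified for n = 2 .. 99999")
```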
If $L$ is the desired limit then \begin{align} \log L &= \log\left\{\lim_{n \to \infty}\left(\frac{\log(n - 1)}{\log n}\right)^{n}\right\}\notag\\ &= \lim_{n \to \infty}\log\left(\frac{\log(n - 1)}{\log n}\right)^{n}\text{ (by continuity of log)}\notag\\ &= \lim_{n \to \infty}n\log\left(\frac{\log(n - 1)}{\log n}\right)\notag\\ &= \lim_{n \to \infty}n\log\left(1 + \frac{\log(1 - 1/n)}{\log n}\right)\notag\\ &= \lim_{n \to \infty}n\cdot\frac{\log(1 - 1/n)}{\log n}\cdot\frac{\log n}{\log(1 - 1/n)}\log\left(1 + \frac{\log(1 - 1/n)}{\log n}\right)\notag\\ &= \lim_{n \to \infty}\frac{1}{\log n}\cdot n\log(1 - 1/n)\cdot\lim_{t \to 0}\frac{\log(1 + t)}{t}\text{ (putting }t = \frac{\log(1 - 1/n)}{\log n})\notag\\ &= 0\cdot (-1)\cdot 1 = 0\notag \end{align} Hence $L = 1$.
You can write $$\frac{\ln(n-1)}{\ln n} = 1+\frac{\ln(1-\frac{1}{n})}{\ln n} = 1+\frac{-\frac{1}{n}+o(\frac{1}{n})}{\ln n} = 1-\frac{1}{n\ln n} + o\left(\frac{1}{n\ln n}\right) $$ using the Taylor expansion of $\ln(1+x)$ around $0$. Then, $$\begin{align} \left(\frac{\ln(n-1)}{\ln n}\right)^n &= e^{n\ln\left(\frac{\ln(n-1)}{\ln n}\right)} = e^{n\ln\left( 1-\frac{1}{n\ln n} + o\left(\frac{1}{n\ln n}\right)\right)} \\ &= e^{n\left( -\frac{1}{n\ln n} + o\left(\frac{1}{n\ln n}\right)\right)} = e^{ -\frac{1}{\ln n} + o\left(\frac{1}{\ln n}\right)} \\ &\xrightarrow[n\to\infty]{} e^{0} = 1 \end{align}$$ (again, same Taylor expansion). The limit follows from $\frac{1}{\ln n} \xrightarrow[n\to\infty]{} 0$ and the continuity of $\exp$.
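The expansion $\left(\frac{\ln(n-1)}{\ln n}\right)^n = e^{-\frac{1}{\ln n}+o\left(\frac{1}{\ln n}\right)}$ makes a sharper prediction than the limit alone: multiplying by $e^{1/\ln n}$ should land much closer to $1$ than the raw sequence does. A numeric sketch (not part of the answer):

```python
import math

# Sketch: per the expansion, (ln(n-1)/ln n)^n = exp(-1/ln n + o(1/ln n)),
# so multiplying by exp(1/ln n) should be far closer to 1 than the raw value.
def a(n):
    return (math.log(n - 1) / math.log(n)) ** n

for n in (10**3, 10**5, 10**7):
    raw_gap = 1 - a(n)
    corrected_gap = abs(a(n) * math.exp(1 / math.log(n)) - 1)
    print(n, raw_gap, corrected_gap)
```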