Is it possible to solve a summation with a variable base of log?
$$ S_n = \sum_{i = 2}^{n}\log_i{(n)} $$
Should I use the derivative of $\log_i{(n)}$?
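For experimentation, the sum is easy to evaluate numerically. A quick Python sketch (standard library only), using `math.log(n, i)` for $\log_i(n)$:

```python
import math

def S(n):
    # S_n = sum_{i=2}^{n} log_i(n); math.log(n, i) is log base i of n
    return sum(math.log(n, i) for i in range(2, n + 1))

# The sum grows roughly linearly in n.
for n in (10, 100, 1000):
    print(n, S(n))
```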
A simple (perhaps useful, perhaps not) bound via Jensen's inequality. Since $1/\log(x)$ is convex for $x > 1$:
$$S_n = \sum_{i = 2}^{n}\log_i{(n)}= (n-1) \log(n) \frac{\sum_{i = 2}^{n} \frac{1}{\log(i)}}{n-1} \ge (n-1) \log(n) \frac{1}{\log \frac{\sum i}{n-1}}= \frac{(n-1) \log(n)}{\log(\frac{n}{2}+1)} $$
This bound seems quite decent: for large $n$ it tends to $n$ (the same asymptotic found by Eric Naslund in his answer), though note that the convergence is slow.
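A quick numerical check of the bound (a Python sketch, standard library only; `jensen_bound` is just a name for the right-hand side above):

```python
import math

def S(n):
    # S_n = log(n) * sum_{i=2}^{n} 1/log(i)
    return math.log(n) * sum(1.0 / math.log(i) for i in range(2, n + 1))

def jensen_bound(n):
    # the lower bound (n-1) log(n) / log(n/2 + 1) from Jensen's inequality
    return (n - 1) * math.log(n) / math.log(n / 2 + 1)

# The bound stays below S_n, and both are of order n.
for n in (10, 100, 1000):
    print(n, jensen_bound(n), S(n))
```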

Since $\log_{i} n = \frac {\log n} {\log i}$, we have $$S_n = \sum_{i = 2}^{n} \frac {\log n} {\log i} = \log n \sum_{i = 2}^{n} \frac {1} {\log i},$$ and by the Euler–Maclaurin summation formula $$\sum_{i = 2}^{n} \frac {1} {\log i} = \int_{2}^{n} \frac {\text {d} x} {\log x} + C + O \left(\frac {1} {\log n}\right)$$ for a constant $C$, which leads us to $$S_n = \text {Li} (n) \log n + C \log n + O (1),$$ where $\text{Li}(n) = \int_{2}^{n} \frac{\text{d}x}{\log x}$. Hope this helps.
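As a sanity check on the leading term, the ratio $S_n / \big(\log n \int_2^n \frac{dx}{\log x}\big)$ can be computed numerically. A Python sketch (standard library only, with a simple midpoint rule standing in for the integral):

```python
import math

def S(n):
    # S_n = log(n) * sum_{i=2}^{n} 1/log(i)
    return math.log(n) * sum(1.0 / math.log(i) for i in range(2, n + 1))

def Li(n, steps=100000):
    # midpoint-rule approximation of \int_2^n dx/log(x)
    h = (n - 2) / steps
    return h * sum(1.0 / math.log(2 + (k + 0.5) * h) for k in range(steps))

# The ratio S_n / (Li(n) log n) should drift towards 1 as n grows.
ratios = {n: S(n) / (Li(n) * math.log(n)) for n in (100, 1000, 10000)}
print(ratios)
```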
While I don't believe there is a nice closed form for $S_n$, you can write the sum in terms of known functions and constants up to a very small error. Specifically, $$\sum_{i=2}^{N}\log_{i}(N)=\text{li}(N)\log N+C\log N+O(1),$$ where $\text{li}(N)=\int_{2}^{N}\frac{dx}{\log x}$ is the (offset) logarithmic integral and $C$ is a constant equal to $$C=\frac{1}{\log2}-\int_{2}^{\infty}\frac{\{x\}}{x\log^{2}x}dx.$$
Proof: Writing $$\sum_{i=2}^{N}\log_{i}(N)=\log N\sum_{i=2}^{N}\frac{1}{\log i},$$ our goal is then to find an asymptotic for the sum of $1/\log i$. Writing this as a Riemann–Stieltjes integral, we have $$\sum_{i=2}^{N}\frac{1}{\log i}=\int_{2^{-}}^{N^{+}}\frac{1}{\log x}d\left[x\right]=\int_{2}^{N}\frac{1}{\log x}dx-\int_{2^{-}}^{N^{+}}\frac{1}{\log x}d\left\{ x\right\}.$$ By integration by parts, $$\int_{2^{-}}^{N^{+}}\frac{1}{\log x}d\left\{ x\right\} =\frac{\left\{ x\right\} }{\log x}\biggr|_{x=2^{-}}^{x=N^{+}}+\int_{2}^{N}\frac{\left\{ x\right\} }{x\log^{2}x}dx$$ $$=-\frac{1}{\log2}+\int_{2}^{\infty}\frac{\{x\}}{x\log^{2}x}dx-\int_{N}^{\infty}\frac{\left\{ x\right\} }{x\log^{2}x}dx,$$ since the boundary term contributes $-\frac{1}{\log 2}$ (as $\{x\}\to 1$ when $x\to 2^{-}$ and $\{x\}\to 0$ when $x\to N^{+}$). Since $$\int_{N}^{\infty}\frac{\{x\}}{x\log^{2}x}dx=O\left(\frac{1}{\log N}\right),$$ we have that $$\sum_{i=2}^{N}\frac{1}{\log i}=\text{li}(N)+C+O\left(\frac{1}{\log N}\right)$$ where $\text{li}(N)=\int_{2}^{N}\frac{dx}{\log x}$ and $$C=\frac{1}{\log2}-\int_{2}^{\infty}\frac{\{x\}}{x\log^{2}x}dx.$$ Thus it follows that $$\sum_{i=2}^{N}\log_{i}(N)=\text{li}(N)\log N+C\log N+O(1).$$ (Note that the asymptotic is then $\sum_{i=2}^{N}\log_{i}(N)\sim N.$)
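The constant $C$ can also be estimated empirically: the difference $\sum_{i=2}^{N}\frac{1}{\log i}-\int_{2}^{N}\frac{dx}{\log x}$ should settle down as $N$ grows. A Python sketch (standard library only; midpoint rule for the integral):

```python
import math

def log_reciprocal_sum(n):
    # \sum_{i=2}^{n} 1/log(i)
    return sum(1.0 / math.log(i) for i in range(2, n + 1))

def li(n, steps=200000):
    # midpoint-rule approximation of \int_2^n dx/log(x)
    h = (n - 2) / steps
    return h * sum(1.0 / math.log(2 + (k + 0.5) * h) for k in range(steps))

# The differences approach the constant C, with an O(1/log N)
# error still visible at finite N.
diffs = {n: log_reciprocal_sum(n) - li(n) for n in (100, 1000, 10000)}
print(diffs)
```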
Remark: In fact, we could apply integration by parts again to work out the $O(1)$ term exactly and evaluate the sum up to an error of $O\left(\frac{1}{N}\right)$. This general process of writing the sum $\sum_{k\leq N} f(k)$ as a series whose main term is $\int_{1}^N f(x)dx$ is known as Euler-Maclaurin summation.
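To illustrate the remark, here is a small Python sketch (standard library only) applying the first two Euler–Maclaurin corrections — the trapezoidal term $\frac{f(a)+f(b)}{2}$ and the first Bernoulli term $\frac{f'(b)-f'(a)}{12}$ — to $\sum_{k=2}^{10}\frac{1}{\log k}$; each correction shrinks the error:

```python
import math

def f(x):
    return 1.0 / math.log(x)

def fprime(x):
    # derivative of 1/log(x)
    return -1.0 / (x * math.log(x) ** 2)

def integral(a, b, steps=100000):
    # midpoint-rule approximation of \int_a^b f
    h = (b - a) / steps
    return h * sum(f(a + (k + 0.5) * h) for k in range(steps))

a, b = 2, 10
exact = sum(f(k) for k in range(a, b + 1))
em0 = integral(a, b)                          # bare integral
em1 = em0 + (f(a) + f(b)) / 2                 # + trapezoidal correction
em2 = em1 + (fprime(b) - fprime(a)) / 12      # + first Bernoulli term
print(exact, em0, em1, em2)
```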