Background (skip to the gray if you prefer). In his 1798 work on number theory, Legendre conjectured that $\pi(x)\sim \frac{x}{\log x - A}$, proposing the value $A = 1.08366.$ Gauss disputed the value of the constant in a letter written in 1849, and in 1851 Chebyshev published a paper asserting that the correct value of $A$ is 1 (and that the coefficient of $\log x$ should be 1). In passing, Chebyshev compares the logarithmic integral to Legendre's expression. He says, in part:
"...we can show that Legendre's formula [above], for which the limit of the expression
$$\frac{\log^2 x}{x}\cdot\left(\frac{x}{\log x - 1.08366}-\int_2^x \frac{dx}{\log x} \right)$$
is $0.08366$ when $x = \infty$..."$\hspace{30mm}(1)$
Question. Is (1) true, and if not, can we guess how Chebyshev arrived at this value?
I see that his log integral is equivalent to $li(x) - li(2)$, and I assume it is no coincidence that $0.08366 = 1.08366 - 1.$
Note: I do not claim the statement is true; I present it as I understand it. I think $\log^2 x$ unambiguously means $(\log x)^2$, since Chebyshev uses $\log\log x$ elsewhere. Thanks for any insights.
Source: The grayed portion can be found at p. 41 of Chebyshev's Oeuvres.
Some work.
Edit: Mathematica suggests this is true (using very large numbers).
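For anyone wanting to reproduce the numerical evidence without Mathematica, here is a minimal sketch in Python (my own, standard library only): it evaluates $\int_2^x \frac{dt}{\log t}$ by Simpson's rule after the substitution $t = e^u$, then computes Chebyshev's expression.

```python
# Numerical check of Chebyshev's limit (1).  The substitution t = e^u
# turns Li(x) = int_2^x dt/log t into int_{log 2}^{log x} e^u/u du,
# a smooth integrand suitable for composite Simpson's rule.
import math

ALPHA = 1.08366

def Li(x, steps=200_000):
    """int_2^x dt/log t via Simpson's rule after the substitution t = e^u."""
    a, b = math.log(2), math.log(x)
    h = (b - a) / steps
    f = lambda u: math.exp(u) / u
    s = f(a) + f(b)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

def chebyshev_expr(x):
    """(log^2 x / x) * (x/(log x - ALPHA) - Li(x)), the expression in (1)."""
    L = math.log(x)
    return (L * L / x) * (x / (L - ALPHA) - Li(x))

for x in (1e6, 1e8, 1e10):
    print(f"x = {x:.0e}:  {chebyshev_expr(x):+.5f}")
```

The values creep up toward $0.08366$ only very slowly (the error decays like $1/\log x$), which matches the observation that very large numbers are needed.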
Edit: Things I would expect to be useful but haven't quite put together:
$Li(x) = \frac{x}{\log x }(1 + \sum_{k=1}^{n-1}\frac{k!}{\log^k x})+n!\int_1^x\frac{dt}{\log^{n+1}t}+C_n$ with n independent of x.
$\int_2^x\frac{dt}{\log t} = Li(x)-Li(2)$
$\sum_{k=1}^\infty r^k = \frac{r}{1-r}$ (this requires $|r|<1$, so presumably $r = \frac{1.08366}{\log x}$ rather than $r = \log x$?)
Less useful I think: $\frac{1}{x}\int_2^x\frac{dt}{\log t}\to0~ \text{as}~ x\to \infty.$
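For what it's worth, taking $n = 3$ in the first expansion above gives a concrete truncation (the remainder integral, kept away from the singularity at $t = 1$, which the constant absorbs, is $O(x/\log^4 x)$):

$$Li(x) = \frac{x}{\log x}\left(1 + \frac{1}{\log x} + \frac{2}{\log^2 x}\right) + 6\int_1^x\frac{dt}{\log^4 t} + C_3 = \frac{x}{\log x} + \frac{x}{\log^2 x} + \frac{2x}{\log^3 x} + O\!\left(\frac{x}{\log^4 x}\right)$$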
Edit: In the expansions of $Li(x)$ and $Li(2)$ I think we can justify dropping all but the leading terms, which simplifies things a great deal. We already have an expression that Mathematica's Limit operation recognizes as converging to $0.08366.$
If (for example) we simplify Li as
$li(x) \approx \frac{x}{\log x }(1 + \frac{1}{\log x})$ and similarly
$li(2) \approx \frac{2}{\log 2}(1 + \frac{1}{\log 2}),$
and call
$$e_1 = \frac{\log^2 x}{x},\qquad e_2 = \frac{x}{\log x - 1.08366},\qquad E(x) = e_1(e_2 - [li(x) - li(2)]),$$
then
$$E(x) =\frac{ 1.08366 x + 0.08366 x \log x - 7.637 \log^2 x + 7.048 \log^3 x}{x(\log x - 1.08366)} $$
and I think we can show that $\lim_{x\to\infty} E(x) = 0.08366$, keeping in mind that $\frac{\log^n x}{x}\to 0$ as $x\to\infty$ for any fixed $n$.
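As a sanity check on the messy coefficients (my arithmetic, not from the source): the $\log^3 x$ coefficient is just the $li(2)$ approximation above, and the $\log^2 x$ coefficient is $1.08366$ times it, both arising from putting everything over the common denominator $x(\log x - 1.08366)$:

$$\frac{2}{\log 2}\left(1 + \frac{1}{\log 2}\right) \approx 7.048, \qquad 1.08366 \times 7.048 \approx 7.637.$$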
So I am convinced that Chebyshev's assertion is true but would still appreciate a proper answer to this (thanks!).
Let $\alpha = 1.08366...$
You are asking whether $$\lim_{x\to \infty} \frac{(\log x)^2}{x}\left(\frac{x}{\log x - \alpha}-\int_{2}^{x}\frac{dx}{\log x}\right)=\alpha - 1$$
Indeed, this result holds for all real $\alpha$. To see this, expand the geometric series $\frac{1}{1-t} = 1 + t + t^2 + \cdots$ with $t = \frac{\alpha}{\log x}$ (valid as soon as $\log x > |\alpha|$) to obtain: $$\frac{x}{\log x - \alpha} = \frac{x}{\log x}\cdot\frac{1}{1 - \alpha/\log x} = \frac{x}{\log x} + \alpha \frac{x}{(\log x)^2} + O\left(\frac{x}{(\log x)^3}\right)$$
Likewise, from the series you supplied $$\text{Li}(x)=\frac{x}{\log x} + \frac{x}{(\log x)^2} + O\left(\frac{x}{(\log x)^3}\right)$$
The constant term $\text{Li}(2)$ is asymptotically negligible (and $2$ can be replaced with any real $>1$). So $$\frac{x}{\log x - \alpha}-\int_{2}^{x}\frac{dx}{\log x}=(\alpha-1)\frac{x}{(\log x)^2} + O\left(\frac{x}{(\log x)^3}\right)$$ and hence $$\lim_{x\to \infty} \frac{(\log x)^2}{x}\left(\frac{x}{\log x - \alpha}-\int_{2}^{x}\frac{dx}{\log x}\right)=\alpha - 1$$
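One remark on numerics (my observation, not part of the original argument): the convergence is only logarithmic. Keeping one more term in each expansion gives

$$\frac{(\log x)^2}{x}\left(\frac{x}{\log x - \alpha}-\int_{2}^{x}\frac{dx}{\log x}\right)=(\alpha-1)+\frac{\alpha^2-2}{\log x}+O\!\left(\frac{1}{(\log x)^2}\right),$$

and since $\alpha^2-2\approx -0.826$, the first correction alone is still about $-0.045$ at $x=10^8$, which is why very large values of $x$ are needed to see the limit numerically.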