Let $X$ be a log-normally distributed random variable and consider $Y=aX+b$ for some $a,b>0$.
I would like to know if one can compute $$\mathbb{E}[\log(Y)]$$
This would be very easy if $b=0$: in that case $Y=aX$ is log-normal and, in turn, $\log(Y)$ is normal.
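For $b>0$ I do not see a closed form, but the expectation is a one-dimensional integral against a Gaussian density, so it can at least be evaluated numerically. A minimal sketch (the function name and the parameter values are my own, chosen for illustration):

```python
import numpy as np

def expected_log(a, b, mu, sigma, n=64):
    # E[log(a*X + b)] for X ~ LogNormal(mu, sigma^2), i.e. X = exp(mu + sigma*N),
    # via Gauss-Hermite quadrature after the substitution z = sqrt(2)*t.
    t, w = np.polynomial.hermite.hermgauss(n)
    z = np.sqrt(2.0) * t                      # standard-normal quadrature nodes
    return (w @ np.log(a * np.exp(mu + sigma * z) + b)) / np.sqrt(np.pi)

# Sanity check against the easy case b = 0, where E[log(aX)] = log(a) + mu.
approx = expected_log(a=2.0, b=0.0, mu=0.1, sigma=0.5)
exact = np.log(2.0) + 0.1
```

With $b=0$ the integrand is affine in $z$, so the quadrature reproduces $\log(a)+\mu$ up to floating-point error.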
Question: Is the computation feasible in general?
I am interested in the problem because I would like to solve $$\max_{\pi\in [0,1]}\mathbb{E}[\log(\pi(X-1)+1)]$$ Note that the objective has exactly the form above, with $a=\pi$ and $b=1-\pi$.
Question: How can the maximization problem be solved?
I will now present my take on it.
My attempt Consider the Taylor series $\log(1+x)=\sum_{r=0}^\infty\frac{(-1)^r}{r+1}x^{r+1}$, valid for $|x|<1$ (since $X$ is unbounded, the manipulations below are only formal), and let $Z=\pi X$, which is log-normal. Then we can write $$\log(\pi(X-1)+1)=\sum_{r=0}^\infty\frac{(-1)^r}{r+1}[Z-\pi]^{r+1}$$ Exploiting the binomial formula $(z-\pi)^{r+1}=\sum_{k=0}^{r+1}\binom{r+1}{k}z^{r+1-k}(-\pi)^k$, I can rewrite $$\mathbb{E}[\log(\pi(X-1)+1)]=\sum_{r=0}^{\infty}\sum_{k=0}^{r+1}\frac{(-1)^{r+k}}{r+1}\binom{r+1}{k}\pi^k\mathbb{E}[Z^{r+1-k}]$$ Now, letting $\mu,\sigma^2$ be the parameters of $X$, the parameters of $Z$ are $\mu+\log(\pi)$ and $\sigma^2$. Using the formula $\mathbb{E}[Z^n]=e^{n(\mu+\log(\pi))+\frac{1}{2}n^2\sigma^2}$ for the arithmetic moments of a log-normal random variable, we can conclude: $$\mathbb{E}[\log(Y)]=\mathbb{E}[\log(\pi(X-1)+1)]=$$ $$=\sum_{r=0}^{\infty}\sum_{k=0}^{r+1}\frac{(-1)^{r+k}}{r+1}\binom{r+1}{k}\pi^k e^{(r+1-k)(\mu+\log(\pi))+\frac{1}{2}(r+1-k)^2\sigma^2}=$$
$$=\sum_{r=0}^{\infty}\sum_{k=0}^{r+1}\left[\frac{(-1)^{r+k}}{r+1}\binom{r+1}{k}\underbrace{e^{(r+1-k)\mu+\frac{1}{2}(r+1-k)^2\sigma^2}}_{\mathbb{E}[X^{r+1-k}]}\right]\pi^{r+1}$$
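As a numerical sanity check, a truncation of this double series can be compared against direct quadrature. The parameters below are illustrative and deliberately small, since the expansion of $\log(1+x)$ is only reliable when $|\pi(X-1)|$ stays well below $1$ with high probability:

```python
import numpy as np
from math import comb

mu, sigma, pi_ = 0.0, 0.1, 0.2   # small illustrative parameters (my choice)

def moment(n):
    # Arithmetic moment E[X^n] for X ~ LogNormal(mu, sigma^2).
    return np.exp(n * mu + 0.5 * n**2 * sigma**2)

def series(R):
    # Partial sum of the double series above, truncated at r = R.
    total = 0.0
    for r in range(R + 1):
        inner = sum((-1) ** (r + k) * comb(r + 1, k) * moment(r + 1 - k)
                    for k in range(r + 2))
        total += inner * pi_ ** (r + 1) / (r + 1)
    return total

# Reference value E[log(pi*(X-1)+1)] via Gauss-Hermite quadrature.
t, w = np.polynomial.hermite.hermgauss(80)
x = np.exp(mu + sigma * np.sqrt(2.0) * t)
ref = (w @ np.log(pi_ * (x - 1.0) + 1.0)) / np.sqrt(np.pi)
```

In this regime the truncated sum agrees with the quadrature value to many digits; for larger $\sigma$ or $\pi$ the fast growth of $\mathbb{E}[X^n]$ makes the partial sums blow up instead.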
I would like to know whether the attempt is correct and whether this series can be evaluated in closed form (which I doubt). Maximizing with respect to $\pi$ is non-trivial, since the coefficients of the power series alternate in sign. How would you proceed?
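For reference, the maximization itself seems numerically benign even without a closed form: $\pi\mapsto\log(\pi(x-1)+1)$ is concave for every fixed $x>0$, so the expectation is concave in $\pi$ and a bounded one-dimensional search finds the maximizer. A sketch, with illustrative parameters of my own choosing:

```python
import numpy as np
from scipy.optimize import minimize_scalar

mu, sigma = 0.02, 0.3   # illustrative; chosen so the optimum is interior

def expected_log_growth(pi, n=64):
    # E[log(pi*(X-1)+1)] for X ~ LogNormal(mu, sigma^2), via Gauss-Hermite quadrature.
    t, w = np.polynomial.hermite.hermgauss(n)
    x = np.exp(mu + sigma * np.sqrt(2.0) * t)
    return (w @ np.log(pi * (x - 1.0) + 1.0)) / np.sqrt(np.pi)

# The objective is concave in pi, so a bounded scalar search suffices.
res = minimize_scalar(lambda p: -expected_log_growth(p),
                      bounds=(0.0, 1.0), method="bounded")
pi_star = res.x
```

This of course only gives the optimizer numerically, not the analytic characterization I am after.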