I have a discrete random variable $X$ such that $E(X)=\mu$. Is there a closed form for the value of $E(\log((X+1)!))$? If not, is there a lower bound for it, for example using Jensen's inequality? I do not know how $X$ is distributed.
2026-04-19
Lower-bound for $E(\log((x+1)!))$?
101 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 answer below.
Assuming the random variable $X$ always takes non-negative integer values and $E[X] = \mu$: $f(x) = \log((x+1)!)$ is convex, so $E[f(X)] \geq f(\mu)$ by Jensen's inequality.

(It's not too hard to see that $f:\mathbb{Z}_{\geq 0}\rightarrow \mathbb{R}$ is convex. Note that $$f(x) = \log((x+1)!) = \log(x+1) + \log(x!) = \log(x+1) + f(x-1),$$ so $g(x) = f(x) - f(x-1) = \log(x+1)$ is strictly increasing, which implies convexity. For non-integer $\mu$, apply Jensen to the convex extension $f(x) = \log\Gamma(x+2)$.)
Furthermore, this lower bound is sharp, because the distribution could be $P(X=n) = 1$ for some non-negative integer $n$, in which case $E[f(X)] = f(n) = f(\mu)$.
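A quick numerical sanity check of Jensen's bound. The Poisson distribution and the truncation at $k=100$ are my own arbitrary choices for illustration; the answer itself makes no distributional assumption beyond the mean:

```python
import math

def f(x):
    # f(x) = log((x+1)!) = log Gamma(x+2); lgamma also handles non-integer x
    return math.lgamma(x + 2)

# Example distribution (my choice): Poisson with mean mu = 3,
# truncated at k = 100 (the omitted tail mass is negligible).
mu = 3.0
pmf = []
p = math.exp(-mu)            # P(X = 0)
for k in range(100):
    pmf.append(p)
    p *= mu / (k + 1)        # Poisson recurrence: P(k+1) = P(k) * mu/(k+1)

E_f = sum(q * f(k) for k, q in enumerate(pmf))
jensen_lower_bound = f(mu)   # Jensen: E[f(X)] >= f(E[X])
print(E_f, jensen_lower_bound)
```

As expected, the computed $E[f(X)]$ comes out above $f(\mu)$.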
If you only know the mean of the distribution and nothing else, then there is no upper bound on $E[f(X)]$.

For example, consider the distributions with $P(X=0) = 1 - 1/n$ and $P(X=n) = 1/n$ for a positive integer $n$. Then $\mu = 1$, and since $f(0) = \log(1!) = 0$, $$E[f(X)] = \frac{1}{n} \log((1+n)!) \geq \frac{1}{n} \log\left( \left(\frac{n+1}{e}\right)^{n+1}\right) > \frac{1}{n} \log\left( \left(\frac{n}{e}\right)^{n}\right) = \log\left( \frac{n}{e}\right),$$ using Stirling's bound $(n+1)! \geq ((n+1)/e)^{n+1}$ for the first inequality.

That shows that $E[f(X)]$ can be arbitrarily large when the only thing you know is $E[X] = \mu$.
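This growth is easy to see numerically. A minimal sketch of the two-point construction above (the particular values of $n$ are my own choices):

```python
import math

def f(x):
    return math.lgamma(x + 2)    # f(x) = log((x+1)!)

def expected_f(n):
    # E[f(X)] for P(X=0) = 1 - 1/n, P(X=n) = 1/n; the mean is mu = 1 for every n
    return (1 - 1/n) * f(0) + (1/n) * f(n)   # f(0) = log(1!) = 0

values = [expected_f(n) for n in (10, 100, 1000, 10000)]
print(values)   # grows without bound, roughly like log(n/e)
```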
If you also know the standard deviation of $X$, then you can bound $E[f(X)]$ using a Taylor expansion. Since $f(x) = \log\Gamma(x+2)$, the first derivative of $f$ is the polygamma function of order 0 (the digamma function) evaluated at $x+2$:
$$f'(x) = \psi(x+2) = \frac{\Gamma'(x+2)}{\Gamma(x+2)}.$$
The second derivative is the polygamma function of order 1 (the trigamma function) evaluated at $x+2$: $$f''(x) = \psi^{(1)}(x+2).$$
The polygamma function of order 1 is strictly decreasing and positive for positive real numbers, so $$\max_{x\geq0} f''(x)= f''(0) = \psi^{(1)}(2)= \frac{\pi ^2}{6}-1< 13/20.$$
Taylor's theorem (with Lagrange remainder) implies that for any twice-differentiable function $f:[0,\infty)\rightarrow \mathbb{R}$, $$f(x) \leq f(\mu) + f'(\mu)(x-\mu) + \max_{z\geq0}f''(z)\,(x-\mu)^2/2.$$ Taking expectations (the linear term vanishes because $E[X-\mu]=0$) gives $$E[f(X)] \leq f(\mu) + \left(\frac{\pi^2}{6}-1\right)\sigma^2/2,$$ where $\sigma^2 = E[(X-\mu)^2]$.
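A numerical check of this second-order upper bound, again using a Poisson distribution as an arbitrary test case (my choice, not part of the answer):

```python
import math

C = math.pi**2 / 6 - 1        # max of f'' over x >= 0, i.e. psi^(1)(2)

def f(x):
    return math.lgamma(x + 2)  # f(x) = log((x+1)!)

# Example distribution (my choice): Poisson with mean 4, truncated at k = 100.
mu = 4.0
pmf = []
p = math.exp(-mu)
for k in range(100):
    pmf.append(p)
    p *= mu / (k + 1)

E_f = sum(q * f(k) for k, q in enumerate(pmf))
var = sum(q * (k - mu)**2 for k, q in enumerate(pmf))   # = mu for a Poisson

lower = f(mu)                  # Jensen lower bound
upper = f(mu) + C * var / 2    # second-order upper bound
print(lower, E_f, upper)
```

The true value lands between the Jensen lower bound and the trigamma-based upper bound, as the derivation predicts.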
If you can bound $X$, you can tighten the bounds on $E[f(X)]$ with the same technique: $f''$ is decreasing, so if $b_1 \leq X \leq b_2$, then $$f(\mu) + \psi^{(1)}(b_2+2)\,\sigma^2/2 \leq E[f(X)] \leq f(\mu) + \psi^{(1)}(b_1+2)\,\sigma^2/2.$$
For example, if $P(X=10) = 1/2 = P(X=12)$, then $\mu = 11$, $b_1=10$, $b_2=12$, $\sigma=1$, and
$$E[f(X)] = (\log(11!) + \log(13!))/2 \approx 20.027,$$ $$f(\mu) = f(11) \approx 19.987,$$ $$f(\mu) + \psi^{(1)}(b_2+2)\,\sigma^2/2 \approx 20.024,$$ $$f(\mu) + \psi^{(1)}(b_1+2)\,\sigma^2/2 \approx 20.031.$$
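These numbers can be reproduced with a short script. The `trigamma` helper below is my own sketch (the standard recurrence-plus-asymptotic-series approach for $\psi^{(1)}$), not something from the answer:

```python
import math

def f(x):
    return math.lgamma(x + 2)   # f(x) = log((x+1)!)

def trigamma(x):
    # psi^(1)(x): shift x upward with psi1(x) = psi1(x+1) + 1/x^2,
    # then apply the asymptotic expansion, accurate for x >= 20.
    result = 0.0
    while x < 20:
        result += 1 / x**2
        x += 1
    # psi1(x) ~ 1/x + 1/(2x^2) + 1/(6x^3) - 1/(30x^5) + 1/(42x^7) + ...
    result += 1/x + 1/(2*x**2) + 1/(6*x**3) - 1/(30*x**5) + 1/(42*x**7)
    return result

mu, b1, b2, var = 11, 10, 12, 1.0
E_f = (f(10) + f(12)) / 2                 # P(X=10) = P(X=12) = 1/2
lower = f(mu) + trigamma(b2 + 2) * var / 2
upper = f(mu) + trigamma(b1 + 2) * var / 2
print(E_f, f(mu), lower, upper)
```

Running it gives values matching the four approximations above to three decimal places, with $E[f(X)]$ sitting inside the interval $[\,f(\mu)+\psi^{(1)}(14)/2,\ f(\mu)+\psi^{(1)}(12)/2\,]$.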