Suppose $f:\mathbb R^+\to \mathbb R^+$ is a monotone increasing function and $X\ge 0$ is a random variable. Clearly $\operatorname{Cov}(X,f(X))\ge 0$, but can there be equality in any case other than $X$ being constant? I would really appreciate it if someone could provide an example.
Positive correlation of a random variable with itself under a monotone transformation
55 views. Asked by user727263 (https://math.techqa.club/user/user727263/detail) on 2026-04-06.
1 answer below.
No, the equality holds iff $X$ is almost surely constant.
Here is a proof. Let $Y$ be independent of $X$ with the same distribution. Since $f$ is increasing, we have $(Y-X)(f(Y)-f(X))\ge0$. Taking the expected value (provided it exists, i.e. the random variables are sufficiently integrable), we get $$ \begin{align*} \mathbb E[(Y-X)(f(Y)-f(X))]&=\mathbb E[Yf(Y)-Xf(Y)-Yf(X)+Xf(X)]\\ &=\mathbb E[Xf(X)]+\mathbb E[Yf(Y)]-\mathbb E[Xf(Y)]-\mathbb E[Yf(X)]\\ &\ge0. \end{align*} $$
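As a quick Monte Carlo sanity check (not part of the proof), one can simulate an independent copy $Y$ of $X$ and compare $\mathbb E[(Y-X)(f(Y)-f(X))]$ against $2\operatorname{cov}(X,f(X))$; the choice of an exponential $X$ and $f(t)=\sqrt t$ below is an arbitrary illustration, assuming Python with NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
f = np.sqrt  # any increasing function on [0, inf); sqrt is an arbitrary choice

x = rng.exponential(size=n)  # samples of a nonnegative, non-constant X
y = rng.exponential(size=n)  # independent copy of X

# E[(Y - X)(f(Y) - f(X))], estimated from the simulated pairs
lhs = np.mean((y - x) * (f(y) - f(x)))
# 2 * Cov(X, f(X)), estimated from the X samples alone
rhs = 2 * np.cov(x, f(x))[0, 1]
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

Both quantities are estimated from samples, so they match only up to sampling noise, but for $n=10^6$ the agreement is close.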
Using the fact that $\mathbb E[Xf(X)]=\mathbb E[Yf(Y)]$ (since $X$ and $Y$ have the same distribution) and $\mathbb E[Xf(Y)]=\mathbb E[X]\mathbb E[f(Y)]=\mathbb E[Yf(X)]$ (by independence), we deduce that $$ \mathbb E[(Y-X)(f(Y)-f(X))]=2\operatorname{cov}(X,f(X))\ge0, $$ with equality iff $(Y-X)(f(Y)-f(X))=0$ almost surely. The latter means that, almost surely, $Y=X$ or $f(Y)=f(X)$. But since $f$ is increasing (I assume strictly increasing here) and therefore one-to-one, $f(Y)=f(X)\implies Y=X$. So if the covariance is zero, then $Y=X$ almost surely, which implies $$ \mathbb E[X^2]=\mathbb E[XY]=\mathbb E[X]\mathbb E[Y]=\mathbb E[X]^2, $$ and therefore $\operatorname{Var}(X)=0$, i.e. $X$ is almost surely constant.
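The dichotomy can also be illustrated numerically: for a non-constant $X$ the sample covariance with a strictly increasing $f$ is bounded away from zero, while for a degenerate (constant) $X$ it vanishes exactly. This is only an illustrative sketch, assuming Python with NumPy; the uniform distribution and $f(t)=t^3+t$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
f = lambda t: t**3 + t  # strictly increasing on [0, inf)

x = rng.uniform(0, 1, size=n)  # non-constant X
c = np.full(n, 0.5)            # constant ("almost surely constant") X

cov_x = np.cov(x, f(x))[0, 1]  # strictly positive (up to sampling noise)
cov_c = np.cov(c, f(c))[0, 1]  # exactly zero: all deviations from the mean vanish
print(cov_x, cov_c)
```

For $X\sim U(0,1)$ the exact value is $\operatorname{Cov}(X, X^3+X)=\tfrac{19}{120}\approx 0.158$, which the simulation reproduces to Monte Carlo accuracy.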