Expectation of matrix product

Suppose we have a random matrix $M \in \mathbb{R}^{n\times m}$ such that $\text{E}[M] = 0$ and $\text{E}[M M^\top] = \Sigma$. How does one compute $\text{E}[M^\top M]$?
In general $E(M^TM)$ need not equal $E(MM^T)$. Assume, for instance, that $M=\begin{pmatrix}0&a\\0&0\end{pmatrix}$ where $a$ follows a continuous uniform distribution on $[-1,1]$. Then $E(M^TM)=\begin{pmatrix}0&0\\0&1/3\end{pmatrix}$ and $E(MM^T)=\begin{pmatrix}1/3&0\\0&0\end{pmatrix}$, because $E(a^2)=\frac{1}{2}\int_{-1}^1x^2\,dx=1/3$.
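As a quick sanity check, here is a minimal Monte Carlo sketch of this $2\times 2$ example (my addition, not part of the original answer; it assumes Python with NumPy, and the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000  # number of Monte Carlo samples (arbitrary choice)

# Draw a ~ Uniform[-1, 1] and form M = [[0, a], [0, 0]] for each sample.
a = rng.uniform(-1.0, 1.0, size=N)
M = np.zeros((N, 2, 2))
M[:, 0, 1] = a

# Estimate E[M M^T] and E[M^T M] by averaging over the samples.
Mt = np.transpose(M, (0, 2, 1))
print(np.mean(M @ Mt, axis=0))  # ~ [[1/3, 0], [0, 0]]
print(np.mean(Mt @ M, axis=0))  # ~ [[0, 0], [0, 1/3]]
```

The two estimates should match the matrices above up to sampling noise of order $1/\sqrt{N}$.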
Yet, if the entries $(m_{i,j})$ are i.i.d. with the above uniform distribution, then for (say) $n=2$, $m=3$ we find
$$E(MM^T)=I_2,\qquad E(M^TM)=\tfrac{2}{3}I_3.$$
EDIT. We can generalize to any $n,m$ as follows. Assume that the entries $(m_{i,j})$ are i.i.d. with $E(m_{i,j})=0$ and $E(m_{i,j}^2)=\sigma$.
If $(i,j)\not=(k,l)$, then $E(m_{i,j}m_{k,l})=E(m_{i,j})E(m_{k,l})=0$.
Since $(MM^T)_{i,i}=\sum_{j=1}^m m_{i,j}^2$ while every off-diagonal entry has zero mean, we obtain $E(MM^T)=m\sigma I_n$ and, symmetrically, $E(M^TM)=n\sigma I_m$. Thus, if we know $E(MM^T)$, then we can deduce $E(M^TM)$ without separately knowing $\sigma$ (it can be read off $E(MM^T)=m\sigma I_n$).
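Here is a short numerical illustration of this identity (again my own sketch under the i.i.d. assumption, using Uniform$[-1,1]$ entries so that $\sigma=1/3$, and the $n=2$, $m=3$ case from the example above):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, N = 2, 3, 200_000

# Sample N random n x m matrices with i.i.d. Uniform[-1, 1] entries.
M = rng.uniform(-1.0, 1.0, size=(N, n, m))
Mt = np.transpose(M, (0, 2, 1))

E_MMt = np.mean(M @ Mt, axis=0)  # ~ m * sigma * I_n = I_2
E_MtM = np.mean(Mt @ M, axis=0)  # ~ n * sigma * I_m = (2/3) I_3

# Deduce E[M^T M] from E[M M^T] alone: trace(E[M M^T]) = n*m*sigma,
# so sigma = trace(E[M M^T]) / (n*m) and E[M^T M] = n * sigma * I_m.
sigma_hat = np.trace(E_MMt) / (n * m)
print(np.allclose(E_MtM, n * sigma_hat * np.eye(m), atol=0.02))  # True
```

The tolerance of `0.02` is a loose bound chosen to absorb Monte Carlo error at this sample size.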