How to understand this probability equation?


$\mathbf{x}(t) = g(\mathbf{s}(t);\xi) + \mathbf{n}(t)$, where $\mathbf{n}(t)$ denotes the noise or modeling error and $\xi$ the parameters of the mapping $g$.

How to understand the following probability equation?

$p_{\mathbf{x}}(\mathbf{x}(t) \mid \mathbf{s}(t), \xi) = p_{\mathbf{n}}(\mathbf{x}(t) - g(\mathbf{s}(t);\xi))$, where $p_{\mathbf{n}}$ denotes the probability density function of the noise term $\mathbf{n}(t)$.
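As a concrete illustration (a special case chosen here, not taken from the linked paper): if the noise is zero-mean isotropic Gaussian, $\mathbf{n}(t) \sim \mathcal{N}(\mathbf{0}, \sigma^2 I)$ in $d$ dimensions, the identity reduces to the familiar nonlinear-regression likelihood

$$p_{\mathbf{x}}(\mathbf{x}(t) \mid \mathbf{s}(t), \xi) = \frac{1}{(2\pi\sigma^2)^{d/2}} \exp\!\left(-\frac{\lVert \mathbf{x}(t) - g(\mathbf{s}(t);\xi)\rVert^2}{2\sigma^2}\right),$$

i.e. a Gaussian density centered at $g(\mathbf{s}(t);\xi)$.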

It looks reasonable, but how to prove it?

More can be found in Section 2.2 (Latent variable models) of https://arxiv.org/pdf/1411.7783.pdf.

Best answer:

In words, the equation states that, given $\mathbf{s}(t)$ and the parameters $\xi$, the density of the random vector (let's call it $\mathbf{X}$) at the value $\mathbf{x}(t)$ equals the density of the noise (let's call it $\mathbf{N}$) at the value $\mathbf{x}(t) - g(\mathbf{s}(t);\xi)$. That is, once $g(\mathbf{s}(t);\xi)$ is known (by knowing $\mathbf{s}(t)$ and $\xi$), the only remaining randomness in $\mathbf{X}$ is the noise, shifted by a known constant.

To prove it, note that $\mathbf{X}(t) = g(\mathbf{S}(t);\xi) + \mathbf{N}(t)$ as a relationship between random variables. The steps below are written with point probabilities, as for a discrete (PMF) model; for continuous variables the same argument runs with cumulative distribution functions, or with densities directly, since conditioning on $\mathbf{S}(t) = \mathbf{s}(t)$ makes $g(\mathbf{S}(t);\xi)$ a constant, and adding a constant is a translation with unit Jacobian.

\begin{align} p_{\mathbf{X}}(\mathbf{x}(t) | \mathbf{s}(t), \xi) &= \mathbb{P}(\mathbf{X} = \mathbf{x}(t) | \mathbf{s}(t), \xi) \\ &= \mathbb{P}(g(\mathbf{S};\xi)+\mathbf{N} = \mathbf{x}(t) | \mathbf{S} = \mathbf{s}(t), \xi) \\ &= \mathbb{P}(\mathbf{N} = \mathbf{x}(t) - g(\mathbf{s}(t);\xi) | \mathbf{S} = \mathbf{s}(t), \xi) \\ &= \mathbb{P}(\mathbf{N} = \mathbf{x}(t) - g(\mathbf{s}(t);\xi)) \quad (*)\\ &= p_{\mathbf{N}}(\mathbf{x}(t) - g(\mathbf{s}(t);\xi)). \end{align}

One thing to note: the step marked $(*)$ assumes that the noise $\mathbf{N}$ is independent of $\mathbf{S}$ and $\xi$.
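The identity can also be checked numerically. The sketch below (all names and the choice $g(s;\xi)=\sin(\xi s)$ are illustrative assumptions, not from the post) fixes $s(t)$, draws Gaussian noise, and compares the empirical density of $x(t)$ with the noise density shifted by $g(s(t);\xi)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: scalar source s, nonlinear mapping g with parameter xi,
# and zero-mean Gaussian noise with standard deviation sigma.
def g(s, xi=2.0):
    return np.sin(xi * s)

sigma = 0.5
s = 0.7                                     # condition on a fixed s(t)
n = rng.normal(0.0, sigma, size=200_000)    # samples of the noise n(t)
x = g(s) + n                                # samples of x(t) given s(t)

# Claimed identity: p_x(x | s, xi) = p_n(x - g(s; xi)).
# Left side: histogram estimate of the conditional density of x.
lhs, edges = np.histogram(x, bins=np.linspace(x.min(), x.max(), 51),
                          density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Right side: the Gaussian noise density evaluated at x - g(s; xi).
rhs = np.exp(-(centers - g(s)) ** 2 / (2 * sigma ** 2)) \
      / (sigma * np.sqrt(2 * np.pi))

print(np.max(np.abs(lhs - rhs)))  # small Monte Carlo / binning error
```

The two curves agree up to sampling error, which is exactly what the equation predicts: conditionally on $s(t)$, the density of $x(t)$ is the noise density translated by $g(s(t);\xi)$.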