if $X\mid Y$ follows Bernoulli with parameter $g(Y)$ then what is $E[X]$?


The context is not important for the question, but here it is anyway: $A$ is the adjacency matrix of a random simple graph ($A$ is symmetric with zero diagonal and entries in $\{0,1\}$). The graph is generated on a fixed number of vertices $n$, and each edge is added randomly with a certain probability, which may depend on its endpoints. In this case it does, via some function $g:\Bbb R\rightarrow\Bbb R$.

Concretely we have $A_{i,j}\mid\xi \sim\text{ Bernoulli}(g(\xi_i)g(\xi_j))$ with $\xi_i\sim \text{Uniform}$ iid. In other words, given the vector $\xi$, an edge $(i,j)$ is added to the graph with probability $g(\xi_i)g(\xi_j)$.

At some point we have to calculate the probability that the path $F$ with $2$ edges and $3$ vertices is a subgraph of the randomly generated graph $G$ (the graph corresponding to the matrix $A$). So we have $$P(F\subset G)=E(\mathbb 1(F\subset G))=E\Big(\prod\limits_{e\in E(F)}A_e\Big)=E\big(g(\xi_1)g(\xi_2)\cdot g(\xi_2)g(\xi_3)\big),$$ where $E(F)$ denotes the edge set of $F$. I don't understand this last equality.
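The last equality combines conditioning on $\xi$ with the fact that, given $\xi$, the two edges are (conditionally) independent Bernoulli variables. As a sanity check, here is a small Monte Carlo sketch; the choice $g(x)=x$ on $[0,1]$ is purely an illustrative assumption, since the question leaves $g$ unspecified (with that choice the exact value is $E[\xi_1]E[\xi_2^2]E[\xi_3]=\tfrac12\cdot\tfrac13\cdot\tfrac12=\tfrac1{12}$):

```python
# Monte Carlo check of P(F ⊂ G) = E[g(ξ1) g(ξ2)^2 g(ξ3)] for the
# 2-edge path F on vertices 1-2-3.
# Assumption: g is not given in the question; g(x) = x is a stand-in.
import random

random.seed(0)

def g(x):
    return x  # hypothetical choice; any g with image in [0, 1] works

N = 200_000
hits = 0    # counts samples where both edges of the path are present
rhs = 0.0   # running sum estimating E[g(ξ1) g(ξ2)^2 g(ξ3)]
for _ in range(N):
    xi1, xi2, xi3 = random.random(), random.random(), random.random()
    a12 = random.random() < g(xi1) * g(xi2)  # edge (1,2) ~ Bernoulli(g(ξ1)g(ξ2))
    a23 = random.random() < g(xi2) * g(xi3)  # edge (2,3) ~ Bernoulli(g(ξ2)g(ξ3))
    hits += a12 and a23
    rhs += g(xi1) * g(xi2) ** 2 * g(xi3)

print(hits / N)  # ≈ 1/12 for g(x) = x
print(rhs / N)   # ≈ E[ξ1] E[ξ2^2] E[ξ3] = 1/12
```

Both estimates should agree up to Monte Carlo error, which is the content of the disputed equality.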

Anyway, here is the simplified question:

Let $X$ be a random variable and $Y$ an unknown parameter drawn from some distribution, such that $X\mid Y=y\sim\text{Bernoulli}(g(y))$ for some function $g$ with $\text{image}(g)\subset[0,1]$. Is it then true that $E[X]=E[g(Y)]$, and why?

Best Answer

This is true because of the tower property (law of total expectation): $$\mathbb{E}[X] = \mathbb{E}\big[\mathbb{E}[X \mid Y]\big] = \sum_{y} \mathbb{E}[X \mid Y = y] \cdot \mathbb{P}(Y=y)$$ for discrete $Y$; for continuous $Y$ (such as the uniform $\xi_i$ in your setting) the sum is replaced by an integral against the density of $Y$. In other words, you can compute the unconditional expectation as the sum of the conditional expectations weighted by the probability of the conditioning event. Here, $\mathbb{E}[X \mid Y = y] = g(y)$, since $X \mid Y = y$ has a Bernoulli distribution (and the expectation of a Bernoulli variable is its success probability). Thus, $$\mathbb{E}[X] = \sum_{y} \mathbb{E}[X \mid Y = y] \cdot \mathbb{P}(Y=y) = \sum_{y} g(y) \cdot \mathbb{P}(Y=y),$$ which is exactly $\mathbb{E}[g(Y)]$. $\square$
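The identity is easy to check numerically. The following sketch assumes, purely for illustration, that $Y$ is uniform on $\{0,1,2\}$ and $g(y)=y/2$, so that $\mathbb{E}[g(Y)]=(0+\tfrac12+1)/3=\tfrac12$; neither choice comes from the question:

```python
# Numerical check that E[X] = E[g(Y)] when X | Y=y ~ Bernoulli(g(y)).
# Assumption: Y and g are unspecified in the question; Y uniform on
# {0, 1, 2} and g(y) = y/2 are stand-ins.
import random

random.seed(1)

def g(y):
    return y / 2  # hypothetical g with image in [0, 1]

N = 200_000
x_sum = 0
for _ in range(N):
    y = random.choice([0, 1, 2])     # draw the parameter Y
    x_sum += random.random() < g(y)  # X | Y=y ~ Bernoulli(g(y))

# the weighted sum from the answer: Σ_y g(y) P(Y=y)
exact = sum(g(y) * (1 / 3) for y in [0, 1, 2])
print(x_sum / N)  # Monte Carlo estimate of E[X]
print(exact)      # E[g(Y)] = 1/2
```

The empirical mean of $X$ matches the weighted sum $\sum_y g(y)\,\mathbb{P}(Y=y)$, which is what the tower property asserts.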