Derivative of a conditional probability with respect to the condition


How do you calculate the derivative of a conditional probability with respect to its condition, $\frac{d}{dx}p(y|x)$?

Similarly, when $\vec x$ is a vector, how do you calculate the partial derivative $\frac{\partial}{\partial x_i}p(y|\vec x)$?

Context:

I want to maximize expected utility in the Z-goods theory. Goods $\vec x$ are a means to achieve higher purposes $\vec y$, with subjective probabilities $p(\vec y|\vec x)$. The utility function for the higher purposes is $v(\vec y)$, so the expected utility of the means $\vec x$ is:

$$ u(\vec x) = \int v(\vec y)p(\vec y|\vec x) d \vec y $$
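As a purely illustrative sketch (none of this is in the original question), take a scalar good $x$, a Gaussian belief $p(y|x) = \mathcal N(y;\, x,\, 1)$, and $v(y) = y^2$; then $u(x) = \mathbb E[y^2] = x^2 + 1$, which a simple numerical integration confirms:

```python
import math

def p(y, x):
    # hypothetical conditional density: Gaussian in y with mean x, variance 1
    return math.exp(-(y - x) ** 2 / 2) / math.sqrt(2 * math.pi)

def v(y):
    # hypothetical utility over the higher purpose
    return y ** 2

def u(x, lo=-20.0, hi=20.0, n=20000):
    # expected utility u(x) = ∫ v(y) p(y|x) dy, midpoint rule on [lo, hi]
    h = (hi - lo) / n
    return sum(v(lo + (k + 0.5) * h) * p(lo + (k + 0.5) * h, x)
               for k in range(n)) * h

print(u(1.0))  # ≈ 1.0**2 + 1 = 2
```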

Add a set of constraints:

$$B_k = f_k(\vec x)$$

Lagrangian:

$$ L(\vec x, \vec \lambda) = u(\vec x) + \sum_k \lambda_k (B_k - f_k(\vec x)) $$

Maximizing $L$ yields the first-order condition:

$$ \frac{\partial L}{\partial x_i} = 0 \implies \int v(\vec y) \Big[\frac{\partial}{\partial x_i} p(\vec y|\vec x)\Big] d \vec y = \sum_k \lambda_k \frac{\partial}{\partial x_i}f_k(\vec x) $$
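To see the left-hand side in action, here is a hedged numerical sketch under an assumed Gaussian belief $p(y|x) = \mathcal N(y;\, x,\, 1)$ with $v(y) = y^2$ (so $u(x) = x^2 + 1$ and $u'(x) = 2x$); differentiating under the integral sign reproduces exactly that:

```python
import math

def p(y, x):
    # hypothetical Gaussian conditional density, mean x, variance 1
    return math.exp(-(y - x) ** 2 / 2) / math.sqrt(2 * math.pi)

def dp_dx(y, x):
    # for this density, d/dx p(y|x) = (y - x) p(y|x)
    return (y - x) * p(y, x)

def du_dx(x, lo=-20.0, hi=20.0, n=20000):
    # differentiate under the integral: du/dx = ∫ v(y) [d/dx p(y|x)] dy, v(y) = y²
    h = (hi - lo) / n
    return sum((lo + (k + 0.5) * h) ** 2 * dp_dx(lo + (k + 0.5) * h, x)
               for k in range(n)) * h

print(du_dx(1.5))  # ≈ 2 * 1.5 = 3
```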

Hence my question: how do you calculate the derivative of a conditional probability with respect to the condition?

1 Answer

For the first question: you just treat $p(y|x)$ as a function of two variables. If it helps, write it as $$g(y,x) = p(y|x).$$ Then you take the partial derivative $\frac{\partial}{\partial x}g(y,x)$ in the usual way, provided you have an expression for the function.
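For instance, assuming a Gaussian belief $p(y|x) = \mathcal N(y;\, x,\, 1)$ (an illustrative choice, not anything from the question), ordinary calculus gives $\frac{\partial}{\partial x}p(y|x) = (y - x)\,p(y|x)$, and a finite difference confirms it:

```python
import math

def p(y, x):
    # hypothetical Gaussian conditional density with mean x, variance 1
    return math.exp(-(y - x) ** 2 / 2) / math.sqrt(2 * math.pi)

def dp_dx(y, x):
    # analytic partial derivative in the condition: d/dx p(y|x) = (y - x) p(y|x)
    return (y - x) * p(y, x)

def dp_dx_numeric(y, x, h=1e-6):
    # central finite difference in x, holding y fixed
    return (p(y, x + h) - p(y, x - h)) / (2 * h)

y, x = 0.5, 1.2
print(dp_dx(y, x), dp_dx_numeric(y, x))  # the two values agree
```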

For the second question the answer is quite similar. If $y$ is a scalar and $\vec x = (x_1,\ldots,x_n)$, you treat $p(y|\vec x)$ as a function of $n+1$ variables: $$ g(y, x_1, \ldots, x_n) = p(y | \vec x) $$ and take the partial derivative with respect to $x_i$.
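The vector case works the same way. As a hypothetical example (the linear-Gaussian form and the weights are assumptions for illustration), let $p(y|\vec x) = \mathcal N(y;\, \vec w \cdot \vec x,\, 1)$; the chain rule gives $\frac{\partial}{\partial x_i} p(y|\vec x) = w_i\,(y - \vec w \cdot \vec x)\,p(y|\vec x)$, and a coordinate-wise finite difference agrees:

```python
import math

w = [0.5, -1.0, 2.0]  # hypothetical weights linking goods to the purpose

def mean(x):
    return sum(wi * xi for wi, xi in zip(w, x))

def p(y, x):
    # hypothetical density: Gaussian in y with mean w . x, variance 1
    return math.exp(-(y - mean(x)) ** 2 / 2) / math.sqrt(2 * math.pi)

def dp_dxi(y, x, i):
    # chain rule: d/dx_i p(y|x) = w_i (y - w . x) p(y|x)
    return w[i] * (y - mean(x)) * p(y, x)

def dp_dxi_numeric(y, x, i, h=1e-6):
    # central finite difference in the single coordinate x_i
    xp = list(x); xp[i] += h
    xm = list(x); xm[i] -= h
    return (p(y, xp) - p(y, xm)) / (2 * h)

y, x = 0.3, [1.0, 0.2, -0.4]
print(dp_dxi(y, x, 1), dp_dxi_numeric(y, x, 1))
```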