Functional derivative of the GAN discriminator objective


I want to know the steps involved in taking the functional derivative of the objective function below:

$$ J^{(D)} = -\frac{1}{2}\mathbb{E}_{x\sim p_{\text{data}}}[\log D(x)] - \frac{1}{2}\mathbb{E}_{z}[\log(1 - D(G(z)))] $$

The optimal discriminator obtained by setting this derivative to zero is given below:

$$ D^*(x) = \frac{p_{\text{data}}(x)}{p_{\text{data}}(x) + p_{\text{model}}(x)} $$

These equations are from the tutorial on Generative Adversarial Networks (http://arxiv.org/pdf/1701.00160.pdf), pages 46 and 47. Please help me with the steps of the functional derivative of the objective function.

Best answer

Let's redefine the functional as
$$ J[D] = -\frac{1}{2}\mathbb{E}_{x\sim p_{\text{data}}}[\log D(x)] - \frac{1}{2}\mathbb{E}_{x\sim p_{\text{model}}}[\log(1 - D(x))] $$
where $x, G(z) \in X$, so that
$$ J[D] = -\frac{1}{2}\int_X \big[\log(D(x))\,p_{\text{data}}(x) + \log(1 - D(x))\,p_{\text{model}}(x)\big]\,dx $$
Because the integrand contains $D(x)$ but no derivatives of $D$, the Euler–Lagrange condition reduces to differentiating the integrand pointwise with respect to $D(x)$. The first variation is
\begin{align}
\frac{\delta J}{\delta D} &= -\frac{1}{2}\frac{\partial}{\partial D}\big[\log(D(x))\,p_{\text{data}}(x)\big] - \frac{1}{2}\frac{\partial}{\partial D}\big[\log(1 - D(x))\,p_{\text{model}}(x)\big] \\[2mm]
&= -\frac{1}{2}\left[\frac{p_{\text{data}}(x)}{D(x)} - \frac{p_{\text{model}}(x)}{1 - D(x)}\right] \\[2mm]
&= -\frac{1}{2}\left[\frac{p_{\text{data}}(x)(1 - D(x)) - D(x)\,p_{\text{model}}(x)}{D(x)(1 - D(x))}\right]
\end{align}
We want the $D^*(x)$ at which the first variation vanishes, which requires the numerator to be zero:
$$ p_{\text{data}}(x)(1 - D^*(x)) - D^*(x)\,p_{\text{model}}(x) = 0 $$
Therefore:
$$ D^*(x) = \frac{p_{\text{data}}(x)}{p_{\text{data}}(x) + p_{\text{model}}(x)} $$
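As a sanity check on the derivation, here is a small numerical sketch (my own, not from the tutorial): since the integrand has no derivatives of $D$, for fixed density values $p = p_{\text{data}}(x)$ and $q = p_{\text{model}}(x)$ at a point $x$, the pointwise integrand $f(D) = -\tfrac{1}{2}[p\log D + q\log(1-D)]$ should be minimized at $D^* = p/(p+q)$. The density values $p = 0.7$, $q = 0.3$ are arbitrary example choices.

```python
import numpy as np

# Example density values p_data(x) and p_model(x) at a single point x
# (hypothetical numbers, chosen only for illustration).
p, q = 0.7, 0.3

def f(D):
    """Pointwise integrand of J[D]: -1/2 * [p*log(D) + q*log(1-D)]."""
    return -0.5 * (p * np.log(D) + q * np.log(1.0 - D))

# Grid search over the open interval (0, 1).
grid = np.linspace(1e-6, 1.0 - 1e-6, 100001)
D_num = grid[np.argmin(f(grid))]   # numerical minimizer
D_star = p / (p + q)               # closed-form optimum from the derivation

print(D_num, D_star)  # both should be close to 0.7
```

The grid minimizer agrees with the closed-form $D^* = p/(p+q)$ up to the grid spacing, which supports the stationarity condition derived above.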