Let
- $E$ be an $\mathbb R$-Banach space and $\mathcal E$ denote its Borel $\sigma$-algebra;
- $\lambda$ be a $\sigma$-finite measure on $E$;
- $\mu$ be a probability measure on $(E,\mathcal E)$ with density $p$ with respect to $\lambda$;
- $q$ be a probability density on $E$ with respect to $\lambda$;
- $(\Omega,\mathcal A,\operatorname P)$ be a probability space;
- $(X_t)_{t\ge0}$ be an $E$-valued right-continuous Markov process on $(\Omega,\mathcal A,\operatorname P)$ with invariant distribution $\mu$;
- $c_0>0$ and $$c:=c_0\frac qp;$$
- $$A_t:=\int_0^tc(X_s)\:{\rm d}s$$ and $$M_t:=e^{-A_t}$$ for $t\ge0$.
I want to choose $c_0>0$ such that $$\Phi(c_0):=c_0^2\int_0^\infty\operatorname E\left[M_tc(X_t)\left|\int_0^t\left(f(X_s)-\int f\:{\rm d}\mu\right){\rm d}s\right|^2\right]\:{\rm d}t$$ is as small as possible, where $f:E\to\mathbb R$ is a fixed $\mu$-integrable function.
Simply letting $c_0\to0+$ doesn't seem to be the solution, since (contrary to the other terms) $M_t\to1$ as $c_0\to0+$, while $M_t\to0+$ as $c_0\to\infty$; so there is a genuine trade-off in $c_0$.
I thought using the Lagrange multiplier theorem would be the solution, but this seems to give a rather complicated expression. If exact minimization of $\Phi$ is too complicated, I would also be interested in a choice of $c_0$ which makes $\Phi$ sufficiently small.
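Since exact minimization may be out of reach, one pragmatic route is to estimate $\Phi(c_0)$ by Monte Carlo on a grid of $c_0$ values and pick the best one. The following sketch does this for a toy instance; every concrete choice in it (an Ornstein–Uhlenbeck process for $X$, standard Gaussian $p$, a $\mathcal N(0,4)$ density for $q$, $f(x)=x$, and the Euler discretization with truncation at time $T$) is an illustrative assumption, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (illustrative assumptions): E = R, lambda = Lebesgue measure,
# mu = N(0,1) with density p, q = density of N(0,4), X an Ornstein-Uhlenbeck
# process dX = -X dt + sqrt(2) dW (invariant law N(0,1)), and f(x) = x,
# so that int f dmu = 0.
p = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
q = lambda x: np.exp(-x**2 / 8) / np.sqrt(8 * np.pi)
f = lambda x: x
f_bar = 0.0  # int f dmu

def phi_estimate(c0, n_paths=500, T=10.0, h=0.02):
    """Crude Euler/Monte Carlo estimate of Phi(c0), truncating the time
    integral at T (the factor M_t decays, so the tail contribution is small)."""
    n = int(T / h)
    x = rng.standard_normal(n_paths)  # start in the invariant distribution
    A = np.zeros(n_paths)             # A_t = int_0^t c(X_s) ds
    Y = np.zeros(n_paths)             # Y_t = int_0^t (f(X_s) - f_bar) ds
    acc = np.zeros(n_paths)           # int_0^T M_t c(X_t) Y_t^2 dt per path
    for _ in range(n):
        c = c0 * q(x) / p(x)
        acc += np.exp(-A) * c * Y**2 * h
        A += c * h
        Y += (f(x) - f_bar) * h
        x += -x * h + np.sqrt(2 * h) * rng.standard_normal(n_paths)
    return c0**2 * acc.mean()

# Grid search over c0; a 1-d minimizer (golden section etc.) would also work.
grid = np.linspace(0.1, 5.0, 15)
vals = [phi_estimate(c) for c in grid]
print("approximate minimizer on the grid: c0 =", grid[int(np.argmin(vals))])
```

Any one-dimensional optimizer can replace the grid scan; the point is only that $\Phi(c_0)$ is cheap to estimate pathwise once $X$ can be simulated.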
Analysis of $\Phi(c_0)$
We aim to find the minimum of $\Phi(c_0)$ by differentiating with respect to $c_0$ and setting the result to zero.
From the provided information:
\[ \Phi(c_0) = c_0^2 \int_0^\infty \operatorname E\left[ M_t\, c(X_t) \left|\int_0^t \left(f(X_s) - \int f\,{\rm d}\mu\right) {\rm d}s\right|^2 \right] {\rm d}t \]
Step 1: Differentiation of the prefactor $c_0^2$
Note that $\Phi$ is a product, $\Phi(c_0) = c_0^2\, I(c_0)$, where $I(c_0)$ denotes the time integral above and itself depends on $c_0$ through $M_t$ and $c$. Hence, by the product rule,
\[ \frac{d\Phi}{dc_0} = 2c_0\, I(c_0) + c_0^2\, \frac{dI}{dc_0}. \]
Step 2: Differentiate the terms inside the integral
a) Differentiation of $M_t$
Since $c = c_0\,\frac qp$, we have $A_t = c_0 \int_0^t \frac{q(X_s)}{p(X_s)}\,{\rm d}s$, and hence, by the chain rule,
\[ \frac{\partial M_t}{\partial c_0} = \frac{\partial}{\partial c_0}\, e^{-\int_0^t c(X_s)\,{\rm d}s} = -M_t \int_0^t \frac{q(X_s)}{p(X_s)}\,{\rm d}s = -\frac{A_t}{c_0}\, M_t. \]
b) Differentiation of $c(X_t)$
From the relation $c = c_0\,\frac qp$:
\[ \frac{\partial c(X_t)}{\partial c_0} = \frac{q(X_t)}{p(X_t)} = \frac{c(X_t)}{c_0}. \]
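These two derivatives can be sanity-checked symbolically. The sketch below freezes the path functionals as plain symbols ($a$ standing in for $\int_0^t \frac{q(X_s)}{p(X_s)}\,{\rm d}s$ and $r$ for $\frac{q(X_t)}{p(X_t)}$), an assumption made purely for the check:

```python
import sympy as sp

# a ~ int_0^t (q/p)(X_s) ds, r ~ q(X_t)/p(X_t), both independent of c0
c0, a, r = sp.symbols('c0 a r', positive=True)

A = c0 * a      # A_t = c0 * int_0^t (q/p)(X_s) ds
M = sp.exp(-A)  # M_t = exp(-A_t)
c = c0 * r      # c(X_t) = c0 * q(X_t)/p(X_t)

# dM_t/dc0 = -(A_t/c0) * M_t  and  dc(X_t)/dc0 = c(X_t)/c0
assert sp.simplify(sp.diff(M, c0) + (A / c0) * M) == 0
assert sp.simplify(sp.diff(c, c0) - c / c0) == 0
print("derivative identities verified")
```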
Step 3: Incorporate these results into the derivative of the integral term
Denote the expectation inside the time integral by $G(t,c_0) := \operatorname E\left[M_t\, c(X_t)\, Y_t^2\right]$, where $Y_t := \int_0^t \left(f(X_s) - \int f\,{\rm d}\mu\right) {\rm d}s$ does not depend on $c_0$. Differentiating under the expectation and applying the product rule to $M_t\, c(X_t)$:
\[ \frac{\partial G}{\partial c_0} = \operatorname E\left[\left(\frac{\partial M_t}{\partial c_0}\, c(X_t) + M_t\, \frac{\partial c(X_t)}{\partial c_0}\right) Y_t^2\right] \]
Substituting in the differentiated terms from Step 2:
\[ \frac{\partial G}{\partial c_0} = \operatorname E\left[M_t\, c(X_t) \left(\frac1{c_0} - \int_0^t \frac{q(X_s)}{p(X_s)}\,{\rm d}s\right) Y_t^2\right] \]
Step 4: Assemble the entire expression
By the product rule from Step 1,
\[ \frac{d\Phi(c_0)}{dc_0} = 2c_0 \int_0^\infty \operatorname E\left[M_t\, c(X_t)\, Y_t^2\right] {\rm d}t + c_0^2 \int_0^\infty \frac{\partial}{\partial c_0} \operatorname E\left[M_t\, c(X_t)\, Y_t^2\right] {\rm d}t, \]
with $Y_t := \int_0^t\left(f(X_s)-\int f\,{\rm d}\mu\right){\rm d}s$. Substituting the derivative from Step 3 and using $\int_0^t \frac{q(X_s)}{p(X_s)}\,{\rm d}s = \frac{A_t}{c_0}$, the two terms combine into
\[ \frac{d\Phi(c_0)}{dc_0} = c_0 \int_0^\infty \operatorname E\left[M_t\, c(X_t)\, (3 - A_t)\, Y_t^2\right] {\rm d}t. \]
To find the minimum, set this to zero and solve. Since $M_t$, $c$ and $A_t$ all depend on $c_0$, this is an implicit equation in $c_0$; its solutions provide candidate minimizers, which can be evaluated further (e.g. numerically) to determine the actual minimum.
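As a cross-check of the assembled derivative, one can differentiate the pointwise integrand $c_0^2\, M_t\, c(X_t)\, Y_t^2$ symbolically, with the path functionals again frozen as plain symbols (an assumption made only for the check; it corresponds to differentiating under the expectation and the time integral, which requires suitable integrability):

```python
import sympy as sp

# a ~ int_0^t (q/p)(X_s) ds, r ~ q(X_t)/p(X_t), y ~ Y_t (all independent of c0)
c0, a, r, y = sp.symbols('c0 a r y', positive=True)

A = c0 * a                         # A_t
M = sp.exp(-A)                     # M_t
c = c0 * r                         # c(X_t)
integrand = c0**2 * M * c * y**2   # pointwise integrand of Phi(c0)

# The c0-derivative of the integrand equals c0 * M_t * c(X_t) * (3 - A_t) * Y_t^2,
# so the first-order condition is governed by the factor (3 - A_t).
deriv = sp.diff(integrand, c0)
assert sp.simplify(deriv - c0 * M * c * (3 - A) * y**2) == 0
print("pointwise derivative identity verified")
```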