Hi I am interested in resolving the following problem from the bottom of page 147 from a paper I am revising:
Given a function $$a: \Omega \times \mathbb{R} \times \mathbb{R}^{N} \rightarrow \mathbb{R}^{N}, \qquad a(x,s,\xi) = \{a_{i}(x,s,\xi)\},~~i=1,\dots,N,$$
which is a Carathéodory vector-valued function, that is, measurable with respect to $x$ in $\Omega$ for every $(s,\xi)$ in $\mathbb{R} \times \mathbb{R}^{N}$ and continuous with respect to $(s,\xi)$ in $\mathbb{R} \times \mathbb{R}^{N}$ for almost every $x$ in $\Omega$. It is also given that $a$ is monotone: $$\sum^{N}_{i=1}[a_{i}(x,s,\xi)-a_{i}(x,s,\xi^{*})](\xi_{i}-\xi_{i}^{*}) > 0$$ for a.e. $x$ in $\Omega$, for every $s \in \mathbb{R}$ and every $\xi,\xi^{*} \in \mathbb{R}^{N}$ with $\xi \neq \xi^{*}$.
Consider another Carathéodory function $g: \Omega \times \mathbb{R} \rightarrow \mathbb{R}$ such that $g \in L^{1}(\Omega)$ and satisfying $$g(x,y)y \geq 0$$ for a.e. $x \in \Omega$ and all $y \in \mathbb{R}$.
If we define continuous linear functionals $Au$ and $Gu$ by $$\langle Au, v \rangle := \int_{\Omega}a(x,u,\nabla u)\cdot \nabla v \, dx$$ and $$\langle Gu,v \rangle := \int_{\Omega}g(x,u)\,v \, dx$$
for $u,v \in C^{\infty}_{c}(\Omega)$, how would you show that $A + G$ is monotone, that is, $$\langle (A + G)(u_{1})-(A+G)(u_{2}),\,u_{1}-u_{2}\rangle \geq 0\quad \forall u_{1},u_{2} \in C^{\infty}_{c}(\Omega)?$$
Let me know if something is unclear. Thanks for any assistance.
The authors may be using a notion of monotonicity different from the one you expect (they never define it). The inequality $$\langle (A + G)(u_{1})-(A+G)(u_{2}),u_{1}-u_{2}\rangle \geq 0\quad \forall u_{1},u_{2} \in C^{\infty}_{c}(\Omega) \tag{1}$$ does not follow from their assumptions. Indeed, since $A$ and $G$ can be scaled independently of one another (the constants in their structural assumptions are not related), (1) can only be expected to hold if it holds for $A$ and $G$ separately. But it fails for $G$: the inequality $$ \int_\Omega \left[g(x,u_1(x))- g(x,u_2(x))\right]\left[u_1(x)-u_2(x)\right]dx\ge 0 \tag{2}$$ need not hold. For example, let $$ g(x,u) = \frac{u}{1+u^2};$$ then inequality (2) becomes $$ \int_\Omega \frac{(u_1-u_2)^2 (1-u_1u_2)}{(1+u_1^2)(1+u_2^2)}\,dx\ge 0. \tag{3}$$ Clearly, (3) fails when, say, $u_1=2$ and $u_2=3$ on most of the domain, since the integrand there equals $-1/10$. (Note that this $g$ does satisfy $g(x,y)y = y^2/(1+y^2) \geq 0$.)
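As a quick numerical sanity check of this counterexample (a sketch of my own, not from the paper), one can verify that the pointwise integrand of (2) is negative at $u_1=2$, $u_2=3$, and that it matches the closed form in (3):

```python
# g(x, u) = u / (1 + u**2); it has no explicit x-dependence here.
def g(u):
    return u / (1 + u**2)

def integrand(u1, u2):
    # Pointwise integrand of (2): [g(u1) - g(u2)] * (u1 - u2)
    return (g(u1) - g(u2)) * (u1 - u2)

def closed_form(u1, u2):
    # Pointwise integrand of (3): (u1-u2)^2 (1 - u1*u2) / ((1+u1^2)(1+u2^2))
    return (u1 - u2)**2 * (1 - u1*u2) / ((1 + u1**2) * (1 + u2**2))

print(integrand(2, 3))                     # negative, so (2) fails pointwise
print(abs(integrand(2, 3) - closed_form(2, 3)) < 1e-12)  # the two forms agree
```

On functions equal to these constants on most of $\Omega$ (and cut off near the boundary to stay in $C^{\infty}_{c}(\Omega)$), the integral in (2) is then negative.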
Inequality (2) would hold if $g$ were assumed to be nondecreasing with respect to its second argument.
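To spell out why (a routine check, not taken from the paper): if $s \mapsto g(x,s)$ is nondecreasing for a.e. $x \in \Omega$, then the two factors in the integrand of (2) always have the same sign, so pointwise $$\left[g(x,u_1(x))-g(x,u_2(x))\right]\left[u_1(x)-u_2(x)\right] \ge 0 \quad \text{for a.e. } x \in \Omega,$$ and integrating this inequality over $\Omega$ yields (2).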