I am following Morris Tenenbaum's *Ordinary Differential Equations*, which gives a proof of the sufficient condition for the exactness of a differential equation.
Given, $\frac{\partial {f(x,y)}}{\partial {x}} = P(x,y)$,
$\frac{\partial {f(x,y)}}{\partial {y}} = Q(x,y)$
So, $f(x,y)=\int_{x_0}^{x} P(x,y) dx +R(y)$
I do not get what exactly this integration would mean, since it integrates from a constant to a variable.
Meaning of $\int_{x_0}^{x} P(x,y) dx$ since $x$ is a variable
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Assume that $P$ and $Q$ are defined all over the place. We are looking for a function $$f:\>(x,y)\mapsto f(x,y)\in{\mathbb R}$$ satisfying $${\partial f\over\partial x}=P(x,y),\quad {\partial f\over\partial y}=Q(x,y)\qquad\forall\,(x,y)\ .$$ At this point the author makes the Ansatz $$f(x,y):=\int_{x_0}^xP(t,y)\>dt+g(y)\ ,\tag{1}$$ with a new unknown function $g$ which depends on only one variable.
For any given $x_0$ and $y$ the integral $\int_{x_0}^xP(t,y)\>dt$ is well defined, and depends on $x$. You can think of it in the following way: The function $P$, depending on $x$ and the "parameter" $y$, is integrated along the horizontal segment going from $(x_0,y)$ to $(x,y)$.
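To make this concrete, here is a small sympy sketch with a hypothetical choice $P(x,y)=2xy$ (my example, not the book's). The integral $\int_{x_0}^x P(t,y)\,dt$ is an ordinary one-variable integral in the dummy variable $t$; its value depends on $x$, with $y$ and $x_0$ riding along as parameters.

```python
import sympy as sp

x, y, t, x0 = sp.symbols('x y t x0')

# Hypothetical example (my choice, not from the book): P(x, y) = 2*x*y.
# Written with the dummy variable t, the integrand is P(t, y) = 2*t*y.
P = 2*t*y

# Integrate along the horizontal segment from (x0, y) to (x, y);
# y is held fixed ("a parameter") during the integration.
F = sp.integrate(P, (t, x0, x))
print(F)  # a function of x, with y and x0 appearing as parameters

# Differentiating with respect to x recovers P(x, y), as the Ansatz requires.
assert sp.simplify(sp.diff(F, x) - 2*x*y) == 0
```

Note that $x$ appears only in the upper limit: this is exactly why $\partial F/\partial x = P(x,y)$, by the fundamental theorem of calculus.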
The question now is: Can a function $f$ of the "special form" $(1)$ solve the given problem, when $g$ is chosen suitably? This question is dealt with on the succeeding lines of your book, I hope.
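Continuing the hypothetical example above (again my own choice, not the book's), one can see how $g$ gets determined: take $P = 2xy$ and $Q = x^2$, which pass the exactness test $P_y = Q_x = 2x$. Imposing $\partial f/\partial y = Q$ on the Ansatz $(1)$ leaves an equation in which the $x$-dependence cancels, so it really is an ODE for $g$ alone:

```python
import sympy as sp

x, y, t, x0 = sp.symbols('x y t x0')
g = sp.Function('g')

# Hypothetical exact pair (my choice): P = 2xy, Q = x^2, with P_y = Q_x = 2x.
P = 2*x*y
Q = x**2

# Ansatz (1): f(x, y) = integral of P(t, y) from x0 to x, plus g(y).
f = sp.integrate(P.subs(x, t), (t, x0, x)) + g(y)

# Impose df/dy = Q; the x**2 terms cancel, leaving g'(y) = x0**2.
ode = sp.Eq(sp.diff(f, y), Q)
gsol = sp.dsolve(ode, g(y))   # g(y) = x0**2*y + C1
print(gsol)
```

Substituting this $g$ back into $(1)$ gives $f(x,y) = x^2y$ up to an additive constant, and one checks directly that $f_x = P$ and $f_y = Q$.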