Problem
I'm a little confused by the arguments made in Chapter 1, Section 1.2 of R. Courant and D. Hilbert's Methods of Mathematical Physics, Vol. 2. The authors ask
"Can one construct a partial differential equation in $n$ independent variables which is satisfied by a family of functions depending on an arbitrary function of $n-1$ independent variables?"
However, they give the following family of functions as an example:
$$u = f(x, y, w(g(x,y)))$$
where $w$ is an arbitrary function and $g$ depends on the two independent variables, not one.
Taking the partial derivatives of $u$, they get
$$u_x = f_x + f_ww'g_x$$ $$u_y = f_y + f_ww'g_y$$
and, eliminating $w'$, arrive at
$$(u_x-f_x)g_y - (u_y-f_y)g_x = 0.$$
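This elimination can be spot-checked symbolically. Below is a minimal SymPy sketch; the concrete choices of $f$ and $g$ are my own, picked arbitrarily for illustration, while $w$ is left as an undefined function:

```python
import sympy as sp

x, y, s = sp.symbols('x y s')   # s is a placeholder for f's third slot
w = sp.Function('w')            # w stays an arbitrary function

# Arbitrary concrete choices of f and g (assumptions, for the check only)
g = x**2 + 3*y
f = lambda a, b, c: a*b + sp.sin(c) + b**2 * c

u = f(x, y, w(g))
ux, uy = sp.diff(u, x), sp.diff(u, y)

# Partials of f with its third argument held fixed, then evaluated at w(g)
fx = sp.diff(f(x, y, s), x).subs(s, w(g))
fy = sp.diff(f(x, y, s), y).subs(s, w(g))

gx, gy = sp.diff(g, x), sp.diff(g, y)

# The identity obtained after eliminating w':
print(sp.simplify((ux - fx)*gy - (uy - fy)*gx))  # 0
```

The residual vanishes identically because $u_x - f_x$ and $u_y - f_y$ share the common factor $f_w w'$, which is exactly what the elimination exploits.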
The authors then state that the above PDE is quasilinear, since it involves the derivatives linearly. However, rewriting, we find
$$g_yu_x-g_xu_y=g_yf_x-g_xf_y$$
and since $g$ is a function of $x$ and $y$ only (the independent variables), while $f_x$ and $f_y$ are generally functions of $x$, $y$, and $w$, it seems this equation should be linear?
Questions
- Why is $g$ a function of both independent variables?
- How is the PDE quasilinear in $u$?
Insight into this question might be had from Mathematical Methods for Physics and Engineering: A Comprehensive Guide by K. F. Riley et al., Section 20.2. However, I would appreciate comments on whether the following line of reasoning is correct.
Here the authors talk about attempting to form the general solution to a PDE by writing $u$ "as a function (however complicated) of a single variable, $p$, itself a simple function of $x$ and $y$." The purpose, as in Courant & Hilbert, is to see whether there is a PDE analogue of the general solution of an ODE. Namely, an $n$th-order ODE can always be obtained by eliminating $n$ arbitrary constants from its general solution, so the authors investigate whether the same is true for PDEs via the elimination of $n$ arbitrary functions.
They give the following example:
$$u_1(x,y) = x^4 + 4\left(x^2y+y^2+1\right),$$ $$u_2(x,y) = \sin x^2 \cos 2y + \cos x^2 \sin 2y,$$ $$u_3(x,y) = \frac{x^2+2y+2}{3x^2+6y+5}$$
where each of these can be written as functions of the variable
$$p=x^2+2y$$
such that
$$u_1(x,y) = \left( x^2+2y \right)^2+4 = f_1(p),$$ $$u_2(x,y) = \sin \left( x^2 + 2y \right) = \sin p = f_2(p)$$ $$u_3(x,y) = \frac{p+2}{3p+5} = f_3(p)$$
Hence,
$$\frac{\partial u_i}{\partial x}=\frac{df_i}{dp}\frac{\partial p}{\partial x} = 2xf'_i$$ $$\frac{\partial u_i}{\partial y}=\frac{df_i}{dp}\frac{\partial p}{\partial y} = 2f'_i$$
whereupon elimination of $f'_i$ gives
$$\frac{\partial p}{\partial y}\frac{\partial u_i}{\partial x}=\frac{\partial p}{\partial x}\frac{\partial u_i}{\partial y}.$$
(or, substituting $p=x^2+2y$,
$$\frac{\partial u_i}{\partial x}=x\frac{\partial u_i}{\partial y}.$$)
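Both claims, that each $u_i$ is a function of $p$ alone and that each satisfies the resulting PDE $u_x = x\,u_y$, are easy to spot-check symbolically. A minimal SymPy sketch (mine, not from the book):

```python
import sympy as sp

x, y = sp.symbols('x y')
p = x**2 + 2*y

u1 = x**4 + 4*(x**2*y + y**2 + 1)
u2 = sp.sin(x**2)*sp.cos(2*y) + sp.cos(x**2)*sp.sin(2*y)
u3 = (x**2 + 2*y + 2)/(3*x**2 + 6*y + 5)

# u_i = f_i(p): each difference simplifies to zero
print(sp.simplify(u1 - (p**2 + 4)))          # 0
print(sp.simplify(u2 - sp.sin(p)))           # 0
print(sp.simplify(u3 - (p + 2)/(3*p + 5)))   # 0

# and each u_i satisfies u_x = x*u_y
for u in (u1, u2, u3):
    print(sp.simplify(sp.diff(u, x) - x*sp.diff(u, y)))  # 0
```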
So not only are the particular functions $u_i(x,y)$ solutions to the PDE, but so is any arbitrary function $f(p)$ (for this problem, with $p=x^2+2y$): the derivation above never uses the specific form of $f_i$, only the chain rule, so the elimination of $f'$ goes through unchanged for any differentiable $f$.
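The same check goes through with $f$ left completely arbitrary, which is the point of the construction (again a SymPy sketch of mine):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Function('f')    # arbitrary differentiable function

u = f(x**2 + 2*y)       # u = f(p) with p = x**2 + 2*y

# residual of u_x - x*u_y: the chain-rule factors cancel for any f
residual = sp.diff(u, x) - x*sp.diff(u, y)
print(sp.simplify(residual))  # 0
```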
I believe what Courant & Hilbert were trying to show is that, even in the most general case (where $x$ and $y$ still appear explicitly in $f$), the equation is at most quasilinear, and hence this construction cannot represent all types of PDEs, since it cannot produce a genuinely nonlinear one (here $g$ plays the role of $p$). So, in general, an $n$th-order PDE cannot always be obtained by the elimination of $n$ arbitrary functions.