An old advanced calculus exam question:
Let $f$ and $g$ be differentiable functions with $f(1)=g(1)=0$. Under what circumstances can one solve the equations $$f(xy)+g(yz)=0$$ $$g(xy)+f(yz)=0$$ for $y$ and $z$ as functions of $x$, near the point $x=y=z=1$?
If the solutions have Taylor series of the form $$y(x)=1+a(x-1)+\dots$$ $$z(x)=1+b(x-1)+\dots$$ near $x=1$, determine $a$ and $b$.
I've been reviewing the Implicit Function Theorem for a while now (on the Wikipedia page) but am still not sure how to get started. I have to somehow make use of the assumption $f(1)=g(1)=0$.
Any hints or suggestions are welcome. Please hold off on offering a full solution, though: I suspect this is a short problem once one spots the trick, after which one can apply the IFT as usual.
Thanks,
Well, a hint: put $F(x,y,z) = f(xy) + g(yz)$ and $G(x,y,z) = g(xy) + f(yz)$, so the system to analyze is $$\begin{cases} F(x,y,z)=0 \\ G(x,y,z) = 0. \end{cases}$$Now look at the Jacobian determinant $$\frac{\partial(F,G)}{\partial(y,z)}(1,1,1).$$You'll need the chain rule to differentiate $F$ and $G$, and the condition $f(1) = g(1) = 0$ guarantees that the base point $(1,1,1)$ actually satisfies the system, which is the other hypothesis the IFT asks for.
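In case you want something to check your chain-rule work against, here is a sketch of where that determinant lands (stopping short of the second part, as requested). With $f'(1)$ and $g'(1)$ denoting the derivatives at $1$, the chain rule gives $$\frac{\partial(F,G)}{\partial(y,z)}(1,1,1)=\det\begin{pmatrix} f'(1)+g'(1) & g'(1) \\ g'(1)+f'(1) & f'(1) \end{pmatrix}=\bigl(f'(1)+g'(1)\bigr)\bigl(f'(1)-g'(1)\bigr),$$ so the IFT gives local solvability precisely when $f'(1)^2 \ne g'(1)^2$. For $a$ and $b$, differentiate $F(x,y(x),z(x))=0$ and $G(x,y(x),z(x))=0$ with respect to $x$ and set $x=1$: since $a=y'(1)$ and $b=z'(1)$, this produces two linear equations in $a$ and $b$ whose coefficient matrix is exactly the one above.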