Proving two equations can never be simultaneously satisfied


I have a very simple problem, the technique of which I want to apply to more difficult problems. Here is the simple example version:

Suppose we have four functions: $f_1(x)=\sin(x)$, $g_1(x)=0$, and $f_2(x)=\cos(x)$, $g_2(x)=0$. I want to show that there does not exist an $x\in\mathbb{R}$ for which $f_1(x)=g_1(x)$ and $f_2(x)=g_2(x)$ simultaneously. I proceed like this:

For a contradiction, suppose there exists an $x\in\mathbb{R}$ such that $f_1(x)=g_1(x)$ and $f_2(x)=g_2(x)$. Multiplying the first equation by $\sin(x)$ and the second by $\cos(x)$ gives $$\sin^2(x)=0$$ and $$\cos^2(x)=0.$$ Adding these two equations yields $$\sin^2(x)+\cos^2(x)=0,$$ from which we conclude that $$1=0.$$ Since this is absurd, no such $x\in\mathbb{R}$ exists and the proof is complete.
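The key simplification in the proof can be checked mechanically; here is a minimal sketch using sympy (the library choice is an assumption of this illustration, not part of the original argument):

```python
# Verify that the sum obtained in the proof simplifies to 1,
# so sin(x)^2 + cos(x)^2 = 0 has no real solution.
from sympy import symbols, sin, cos, simplify

x = symbols('x', real=True)
lhs = simplify(sin(x)**2 + cos(x)**2)  # the sum of the two squared equations
print(lhs)  # 1
```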

I want to be able to apply this method to more complicated functions $f_i\neq 0$ and $g_i\neq 0$. My question is this: (1) is the above technique correct for the simple example, and (2) what general tools might I use to solve such problems? Simultaneous equations spring to mind for linear $f$ and $g$, for example.


Accepted answer:

Your procedure was correct, and in a sense general. We have (in the reals) $f_1(x)=g_1(x)$ and $f_2(x)=g_2(x)$ for some $x$ if and only if there is an $x$ such that $H(x)=0$, where $$H(x)=(f_1(x)-g_1(x))^2+(f_2(x)-g_2(x))^2.$$ The same applies to any finite number of equations of the form $f_i(x)=g_i(x)$. This is because in general $a_1^2+a_2^2+\cdots+a_n^2=0$ iff all the $a_i$ are $0$.
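The construction of $H(x)$ described above can be sketched in a few lines of sympy (assumed here purely for illustration; any computer algebra system would do the same job):

```python
# Build H(x) = (f1 - g1)^2 + (f2 - g2)^2 for the example system
# and simplify it: if H simplifies to something never zero,
# the system has no solution.
from sympy import symbols, sin, cos, simplify, Integer

x = symbols('x', real=True)
f1, g1 = sin(x), Integer(0)
f2, g2 = cos(x), Integer(0)

H = (f1 - g1)**2 + (f2 - g2)**2
print(simplify(H))  # 1 -- H is never 0, so no common solution exists
```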

Unfortunately, that leaves the problem of determining whether a possibly messy function $H(x)$ can ever be $0$.

And in certain cases, at least, the strategy is not the best possible. For example, if you have a system of simultaneous linear equations (in several variables), the usual techniques of linear algebra can efficiently determine whether the system has a solution. Summing squares would be messy and unhelpful.
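For the linear case, one standard criterion is the Rouché–Capelli theorem: $A\mathbf{x}=\mathbf{b}$ is consistent iff $\operatorname{rank}(A)=\operatorname{rank}([A\,|\,\mathbf{b}])$. A small numpy sketch (the specific system is made up for this illustration):

```python
# Check consistency of a linear system via ranks:
# x + y = 1 and x + y = 2 is clearly inconsistent.
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
b = np.array([[1.0],
              [2.0]])

rank_A = np.linalg.matrix_rank(A)                    # 1
rank_Ab = np.linalg.matrix_rank(np.hstack([A, b]))   # 2
print(rank_A == rank_Ab)  # False -> no solution exists
```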

For a one-variable but artificial example, consider the system $x^{66}+x^2+1=0$, $x^{17}-41x^{16}-2012x^{15}-3x-1=0$. A glance shows there is no real solution: the left side of the first equation is at least $1$ for every real $x$. Summing squares and simplifying, on the other hand, would leave us with an equation $H(x)=0$ that is not pleasant to analyze.
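The "glance" argument is just that $x^{66}\ge 0$ and $x^2\ge 0$, so the first polynomial is bounded below by $1$. A quick numeric spot check (sampling is only illustrative, not a proof):

```python
# The first equation alone rules out real solutions:
# x^66 + x^2 + 1 >= 1 for all real x, with equality only at x = 0.
def p(x):
    return x**66 + x**2 + 1

samples = [i / 10 for i in range(-50, 51)]
print(min(p(t) for t in samples) >= 1)  # True
```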

Another answer:

Proving that there are no common solutions for functions of arbitrary type is probably too general a question to have a single method.

Regarding the second question:

You have presumably learned about solving linear equations, so a natural next step is learning techniques for finding the common zeros of polynomials. This can be done using some pretty interesting tools, such as Gröbner bases.
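As a hedged sketch of what Gröbner bases buy you (using sympy's `groebner`, one possible tool): for a polynomial system with rational coefficients, the reduced Gröbner basis of the generated ideal is $[1]$ exactly when the system has no common complex solutions.

```python
# The inconsistent system x + y = 0, x + y + 1 = 0 has
# Groebner basis [1], certifying that no common zero exists.
from sympy import groebner, symbols

x, y = symbols('x y')
gb = groebner([x + y, x + y + 1], x, y, order='lex')
print(list(gb))  # [1]
```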

A reasonably accessible book is "Ideals, Varieties, and Algorithms" by Cox, Little, and O'Shea.