How to prove that $f: \mathbb{N}^2 \to \mathbb{N}$ is a bijection?


I am being asked to prove that the following function $f: \mathbb{N}^2 \to \mathbb{N}$ is a bijection, where $$f(x,y) = \frac{(x+y)(x+y+1)}{2} + y.$$ So far I have had no ideas on how to prove injectivity, and for surjectivity I have tried induction.
The base step is $f(0,0)= 0$. If we assume there exist $x,y$ such that $f(x,y)=n$, we can try to find non-negative $k,m$ such that $f(x+k,y+m)=n+1$.
I tried expanding the expression to see whether I could show that $m$ and $k$ are non-negative integers, but I could extract nothing from the result.
For injectivity, I do not know whether I could derive a contradiction from $f(a,b)=f(c,d)$ with $a\ne c$, and then separately with $b \ne d$.
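As a sanity check before attempting a proof, one can tabulate $f$ for small arguments: each anti-diagonal $x+y=s$ is sent to a consecutive block of integers, so the first ten diagonals should cover $0$ through $54$ exactly once. A minimal sketch (the function name `f` is just a label for the formula above):

```python
# f(x, y) = (x+y)(x+y+1)/2 + y, the function from the question,
# using integer division since (x+y)(x+y+1) is always even.
def f(x, y):
    return (x + y) * (x + y + 1) // 2 + y

# All pairs on the first 10 anti-diagonals (x + y < 10): there are
# 1 + 2 + ... + 10 = 55 of them, and their images should be exactly 0..54.
values = sorted(f(x, y) for x in range(10) for y in range(10) if x + y < 10)
print(values == list(range(55)))  # True
```

This does not prove anything, but it makes the diagonal-enumeration structure of $f$ visible.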

Best answer:

Let $$\frac{(x+y)(x+y+1)}{2}+y=n.$$ Now $\frac{(x+y)(x+y+1)}{2}$ is a triangular number, the sum of the integers from $1$ to $x+y$. Since $y \le x+y$, it is the largest triangular number less than or equal to $n$: the next triangular number is $\frac{(x+y)(x+y+1)}{2}+x+y+1 > n$. Thus $\frac{(x+y)(x+y+1)}{2}$ (and hence $x+y$) is determined uniquely by $n$; it follows that $y = n - \frac{(x+y)(x+y+1)}{2}$ is also uniquely determined, and finally so is $x = (x+y)-y$. This proves injectivity. For surjectivity, given $n$ take the largest $s$ with $\frac{s(s+1)}{2}\le n$, and set $y = n-\frac{s(s+1)}{2}$ (so $0 \le y \le s$) and $x = s-y \ge 0$; then $f(x,y)=n$.
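The uniqueness argument above is constructive, so it can be turned into an explicit inverse and checked by a round trip. A sketch, assuming the closed form $s = \lfloor(\sqrt{8n+1}-1)/2\rfloor$ for the largest $s$ with $s(s+1)/2 \le n$ (which follows from solving $s^2+s-2n\le 0$); the names `f` and `f_inverse` are illustrative:

```python
from math import isqrt  # integer square root, exact for large n

def f(x, y):
    # The pairing function from the question.
    return (x + y) * (x + y + 1) // 2 + y

def f_inverse(n):
    # Largest s with s(s+1)/2 <= n, i.e. s = floor((sqrt(8n+1) - 1) / 2).
    s = (isqrt(8 * n + 1) - 1) // 2
    # y is the offset past the triangular number, x is the rest of the diagonal.
    y = n - s * (s + 1) // 2
    return s - y, y

# Round trip: every n in 0..999 decodes to a pair that re-encodes to n.
print(all(f(*f_inverse(n)) == n for n in range(1000)))  # True
```

The round trip in both directions (each $n$ decodes and re-encodes to itself, and each pair encodes and decodes to itself) is exactly the bijection claim.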