I'm trying to conceptually understand continuity in $\Bbb R^n$. I understand the one-dimensional case where for a function $f: \Bbb R \to \Bbb R$, sufficiently small deviations of the input (within $\delta$) keep the output within any prescribed tolerance ($\epsilon$). However, this picture is muddied when I start to consider functions of multiple variables. For example, what does a continuous function $g: \Bbb R^2 \to \Bbb R^2$ look like? Is it just continuous in each component separately? For example, if I have a function $g(x,y) = (x+y, 2xy)$, would I just need to show that an arbitrarily small change in $x$ and $y$ produces an arbitrarily small change in each component of the output individually?
More generally, is there a way that I can intuitively think about it? The picture for $\Bbb R$ is clear, but I think it is less clear for higher dimensions. Also, how should I picture any general function $f:\Bbb R^2 \to \Bbb R^2$?
It is instructive to reformulate the definition of continuity for functions $\mathbb{R} \rightarrow \mathbb{R}$. To this end, we first define some notation. Let $x \in \mathbb{R}^{n}$ be a point and let $\epsilon \in \mathbb{R}_{>0}$ be a positive real number. Then we write \begin{equation*} B(x,\epsilon) = \{ y \in \mathbb{R}^{n}: \|x-y\| < \epsilon \} \end{equation*} for the open ball around $x$ with radius $\epsilon$.
In terms of open balls, the definition for continuity reads as follows.
Let $f: \mathbb{R} \rightarrow \mathbb{R}$. Then $f$ is continuous at $x$ if for every $\epsilon > 0$, there exists a $\delta > 0$ such that for all $y \in B(x, \delta)$ it follows that $f(y) \in B(f(x), \epsilon)$.
In $\mathbb{R}$, the condition $y \in B(x, \delta)$ means precisely that $|x-y|< \delta$ and the condition $f(y) \in B(f(x), \epsilon)$ means precisely that $|f(x) - f(y)| < \epsilon$.
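As a quick sanity check of this reformulation, here is a short numerical sketch (an illustration only, not a proof; the function $f(x) = x^2$, the point $x_0 = 1$, and the choice $\delta = \epsilon/3$ are my own example): we sample points of $B(x_0, \delta)$ and confirm that their images land in $B(f(x_0), \epsilon)$.

```python
import random

# Example function and base point (my own choice for illustration).
def f(x):
    return x * x

x0 = 1.0
eps = 1e-3
# For |x - x0| < 1 we have |x + x0| < 3, hence
# |f(x) - f(x0)| = |x + x0| * |x - x0| < 3 * |x - x0|,
# so delta = eps / 3 suffices near x0 = 1.
delta = eps / 3

for _ in range(10_000):
    y = x0 + random.uniform(-delta, delta)  # sample y in B(x0, delta)
    assert abs(f(y) - f(x0)) < eps          # image lies in B(f(x0), eps)

print("all sampled points of B(x0, delta) map into B(f(x0), eps)")
```

The point of the estimate in the comment is that a single algebraic bound on $|f(x) - f(x_0)|$ hands you a valid $\delta$ for every $\epsilon$ at once.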
The definition in terms of open balls generalizes readily to functions $f: \mathbb{R}^{n} \rightarrow \mathbb{R}^{m}$.
The idea is as follows. If $f: \mathbb{R}^{n} \rightarrow \mathbb{R}^{m}$ is continuous, then it should not make any sudden jumps; that is, points $x, y \in \mathbb{R}^{n}$ that are close to each other should get sent to points $f(x), f(y) \in \mathbb{R}^{m}$ that are close to each other.
We formalize this idea by requiring that for every $\epsilon > 0$ there exists a $\delta > 0$ such that the ball $B(x,\delta)$ around $x$ gets sent to a set of points contained in the ball $B(f(x), \epsilon)$; in symbols, $f(B(x,\delta)) \subseteq B(f(x), \epsilon)$.
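To connect this back to your example $g(x,y) = (x+y, 2xy)$: rather than treating $x$ and $y$ separately, we control the whole output through the norm. Here is one possible estimate (the constants are not optimal) at a fixed point $a = (a_1, a_2)$. Writing $u = (x,y)$ with $\|u - a\| < \delta \leq 1$, we have $|x - a_1| < \delta$, $|y - a_2| < \delta$, and $|x| < |a_1| + 1$, so
\begin{align*}
|(x+y) - (a_1+a_2)| &\leq |x - a_1| + |y - a_2| < 2\delta,\\
|2xy - 2a_1 a_2| &= 2\,|x(y - a_2) + a_2(x - a_1)| \leq 2(|a_1| + 1)\delta + 2|a_2|\delta,\\
\|g(u) - g(a)\| &\leq \big(4 + 2|a_1| + 2|a_2|\big)\,\delta.
\end{align*}
Hence $\delta = \min\!\big(1,\ \epsilon/(4 + 2|a_1| + 2|a_2|)\big)$ works: every point of the ball $B(a, \delta)$ is mapped into the ball $B(g(a), \epsilon)$, which is exactly the condition above.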