My question is whether $x/x$ is always equal to $1$. I am mostly interested in the real numbers, and I particularly wonder whether $x/x$ is defined at $x=0$.
On one hand, the quotient should simplify to $1$; on the other hand, you are not allowed to divide by zero.
I have been trying to find out whether the simplification "goes first" or whether the division by zero causes trouble first, but I could not come up with useful search terms.
Note that this question arose after reading this answer and its first comment.
The function $f(x) = \frac{x}{x}$ is defined for all $x \in \mathbb{R}\setminus\{0\}$. Its limits exist from the left and from the right at $0$, but it is not defined at $0$. It doesn't matter if you "simplify" first and then "check" or the other way around, because as you pointed out you're not allowed to divide by $0$. Thus $$f(x) = \begin{cases} 1 & x \neq 0 \\ \text{undefined} & x = 0\end{cases}.$$ Consider the same question, $$g(x) = \frac{x^2 - 2x + 1}{x-1}.$$ What is this function?
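Spelling out the same reasoning for $g$ (one way to work it out, parallel to the argument for $f$ above):
$$g(x) = \frac{x^2 - 2x + 1}{x-1} = \frac{(x-1)^2}{x-1} = \begin{cases} x - 1 & x \neq 1 \\ \text{undefined} & x = 1\end{cases}.$$
Cancelling the factor $x-1$ is only valid when $x \neq 1$, so $g$ agrees with the line $y = x - 1$ everywhere except at $x = 1$, where it has a removable discontinuity: $\lim_{x \to 1} g(x) = 0$ exists, but $g(1)$ is undefined, just as $f$ is undefined at $0$ even though $\lim_{x \to 0} f(x) = 1$.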