Intuition behind Isomorphic spaces "Being the Same"


I know that isomorphic spaces are treated as the same. But why is that so?

For example, $\mathbb{R}^2$ and the set of all $(x, y, 0)$ are isomorphic, but the "same" vectors in the two spaces are actually different vectors.

Some isomorphic spaces might even have different rules for vector addition and scalar multiplication, so why should the corresponding vectors in both be considered the same?

Also, any $n$-dimensional vector space $V$ is isomorphic to $F^n$. But that $n$-dimensional vector space could be a space of matrices, of polynomials, or of any other abstract vectors. How can we say that corresponding vectors in each such $n$-dimensional vector space are "the same" as the corresponding $n$-tuples in $F^n$?

All these vectors have different rules for addition and scalar multiplication, so what is the intuitive reasoning behind treating them as the same? Will it not defeat the purpose of treating abstract objects as vectors?

Edit:

Precisely this

An $n$-dimensional polynomial space is isomorphic to $F^n$. An $n$-dimensional space of matrices (with $n = ab$) is isomorphic to $F^n$. Now, how is differentiation in the $n$-dimensional polynomial space mirrored in $F^n$ (whose $n$-tuples are constants), and how is the transpose operation in the $n$-dimensional matrix space mirrored in $F^n$? Also, since the $n$-dimensional polynomial space and the $n$-dimensional matrix space are both isomorphic to $F^n$, they should be isomorphic to each other too (is this correct?). But then how is differentiation in the $n$-dimensional polynomial space mirrored in the $n$-dimensional matrix space?

There are 6 answers below.

Accepted answer:

You are asking a good question.

Take this statement: For a field $F$, the following vector spaces are isomorphic:

  • $F^{n^2}$
  • The space $M_{n \times n}$ of $n \times n$ matrices over $F$
  • The space $P_{n^2}$ of polynomials with degree less than $n^2$ with coefficients in $F$

The isomorphisms we are talking about in this example only concern the additive structure, the "+", and the scalar multiplication (multiplication by an element of $F$). If we are only allowed to do addition and multiplication by a scalar, then isomorphic spaces behave exactly the same. But you are right that each space may allow us to do other things that cannot naturally be done in the other spaces.

But we can always define them in the other space! This is done in general as follows. Take your isomorphism $\phi$, for example $\phi \colon M_{n \times n} \to P_{n^2}$. In $P_{n^2}$ we have differentiation, given by a map $D \colon P_{n^2} \to P_{n^2}$. How can we define differentiation in our matrix space? There is only one way if we want our new definition to be isomorphic to the definition on $P_{n^2}$. We have to define our new differentiation on matrices as $D_M := \phi^{-1} \circ D \circ \phi$. In other words: $$ D_M \colon M_{n \times n} \to M_{n \times n} \\ m \mapsto \phi^{-1}(D(\phi(m))) $$

For example, let's "differentiate" the matrix $$ \pmatrix{ 1 & 2 \\ 3 & 4 } $$

As a polynomial, this is $f(x) = x^3 + 2 x^2 + 3x + 4$ (depends on your choice of $\phi$!). So the derivative is $f'(x) = 0x^3 + 3x^2 + 4x + 3$. As a matrix, this is $$ \pmatrix{ 0 & 3 \\ 4 & 3 } $$

This is your "derivative" of the matrix.
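The transported operation $D_M = \phi^{-1} \circ D \circ \phi$ can be sketched in code. This is a minimal illustration, assuming the same coefficient-reading convention as the example above; the names `phi`, `D`, and `D_M` are just illustrative:

```python
import numpy as np

# One choice of isomorphism phi: M_{2x2} -> P_4 (polynomials of degree < 4),
# reading the matrix entries row by row as coefficients of x^3, x^2, x, 1.
def phi(m):
    return m.flatten()          # [a, b, c, d] ~ a x^3 + b x^2 + c x + d

def phi_inv(coeffs):
    return coeffs.reshape(2, 2)

def D(coeffs):
    """Differentiate a x^3 + b x^2 + c x + d -> 0 x^3 + 3a x^2 + 2b x + c."""
    a, b, c, d = coeffs
    return np.array([0, 3 * a, 2 * b, c])

def D_M(m):
    """Differentiation transported to matrices: D_M = phi^{-1} o D o phi."""
    return phi_inv(D(phi(m)))

m = np.array([[1, 2], [3, 4]])
print(D_M(m))   # [[0 3]
                #  [4 3]]
```

A different choice of $\phi$ (say, reading the entries column by column) would give a different, but equally valid, "derivative" of matrices.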

Answer:

Because "isomorphic" literally means "same structure"; that is why two isomorphic spaces are treated as the same. If you think about it, an isomorphism is a bijection between the two spaces that is compatible with their operations. This means that if two spaces are isomorphic, their structure is the same, because the operations work in the same way. In other words, two isomorphic spaces are two different representations of the same structure.

Answer:

Let $V$ be the set of real polynomials of degree at most $1$. Then $V$ is isomorphic to $\mathbb R^2$ under the isomorphism $\phi:ax+b \mapsto (a,b)$.

Does this imply that the elements of $V$ are the same as the elements of $\mathbb R^2$?

Clearly not: $V$ contains functions, $\mathbb R^2$ contains points.

Does this imply that the elements of $V$ behave exactly in the same way as the elements of $\mathbb R^2$?

Yes, their linear properties are the same, in the sense that each linear operation in $V$ is mirrored in $\mathbb R^2$ via $\phi$. But not all properties are mirrored: every polynomial of degree exactly $1$ has a real zero. This sentence does not even make sense in $\mathbb R^2$. But then, this sentence is not about the linear properties of functions.
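The mirroring of linear operations can be made concrete. Below is a minimal sketch: a polynomial $ax + b$ is represented as a Python function, `phi` extracts the coefficient pair $(a, b) \in \mathbb{R}^2$, and we check that pointwise addition in $V$ corresponds to coordinatewise addition in $\mathbb{R}^2$ (the names are illustrative, not standard):

```python
# V = real polynomials of degree at most 1, represented as Python functions.
# phi extracts the coefficient pair (a, b), giving a vector in R^2.

def phi(f):
    b = f(0)               # constant term
    a = f(1) - f(0)        # slope
    return (a, b)

def add_fun(f, g):         # pointwise addition in V
    return lambda x: f(x) + g(x)

f = lambda x: 2 * x + 5    # the polynomial 2x + 5
g = lambda x: 1 * x - 3    # the polynomial x - 3

# The linear structure is mirrored: phi(f + g) = phi(f) + phi(g)
lhs = phi(add_fun(f, g))
rhs = tuple(u + v for u, v in zip(phi(f), phi(g)))
assert lhs == rhs          # (3, 2) either way
```

Note that the non-linear property above ("has a real zero") is a question about `f` as a function, and is simply not expressible in terms of the pair $(a, b)$ and the vector space operations alone.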

Answer:

Finite dimensional vector spaces are an unusually bad example of a category for learning the lesson that "isomorphic things are the same," because

1) the only isomorphism-invariant of a finite-dimensional vector space is the dimension;

2) in this category there are many examples where two objects are isomorphic, but not canonically, so that really it is not wise to think of them as "the same" without a little extra caution.

Nevertheless, one way to think about this statement is: imagine a sentence that you can write in a formal language, using only $\forall$, $\exists$, 'and', 'or', 'not', and symbols for vector spaces (scalar multiplication, addition, zero, etc.), for example $$ \exists w \in V \ \forall v \in V \ \ \exists c \in \mathbb{R} \ (v + cw = 0). $$

("There is some $w$ in $V$ such that for all $v$ in $V$ there is some scalar $c$ such that $v + cw = 0$.")

Then your sentence will be true in a vector space $V$ if and only if it is true in every vector space that is isomorphic to $V$. (For example, this sentence is true only in the zero- and one-dimensional vector spaces.) In other words, the truth of all first-order sentences is preserved under isomorphism.
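Over a finite field this first-order sentence can be checked exhaustively. Below is a small sketch that evaluates $\exists w\, \forall v\, \exists c\, (v + cw = 0)$ in the spaces $\mathrm{GF}(q)^{\dim}$ for $q = 2$ (the function name is illustrative):

```python
from itertools import product

def sentence_holds(dim, q=2):
    """Check the sentence 'there exists w such that every v is killed by
    adding some scalar multiple of w' in the vector space GF(q)^dim."""
    vectors = list(product(range(q), repeat=dim))
    scalars = range(q)
    for w in vectors:                          # exists w ...
        if all(                                # ... for all v ...
            any(                               # ... exists c ...
                all((vi + c * wi) % q == 0 for vi, wi in zip(v, w))
                for c in scalars)
            for v in vectors):
            return True
    return False

for d in range(4):
    print(d, sentence_holds(d))
# 0 True
# 1 True
# 2 False
# 3 False
```

As expected, the sentence holds exactly in dimensions $0$ and $1$, and since dimension is preserved by isomorphism, its truth value is too.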

Answer:

Consider another case: Integer arithmetic. If humans do it, they typically write the integers in the form of decimal digit strings with an optional sign in front; those digits themselves being patterns drawn on a surface like paper. When computers do it, they represent the integers in binary, and the digits are really different charge states of capacitors somewhere in the computer.

Now decimal digits are something different from binary digits, and patterns drawn on paper are certainly something very different from charge states of capacitors. And yet both the human and the computer will come to the result that multiplying $6$ by $7$ gives $42$. That is, although the differences are vast, they are not relevant to the question of arithmetic (they are of course relevant to other questions, for example whether the result will survive a power outage). That is, as far as arithmetic goes, those capacitor states are isomorphic to the patterns drawn on paper.

The same is true for isomorphic vector spaces: As long as you only care about their vector space properties, you don't need to care about whether you have pairs of real numbers, a single complex number, a real function of the form $x\mapsto ax+b$, a translation in the Euclidean plane, or whatever other isomorphic vector space you have. You will always get the very same results.

For example, you will in all cases alike find that you need exactly two basis vectors to span the whole space. And importantly, if you figure out any property in one of the spaces, and it is a property that only refers to the vector space structure, then you will immediately know that it will be exactly the same in all the other isomorphic vector spaces. Just like in the arithmetic example, knowing that in the computer's capacitor-charge representation $6\times 7=42$ means that you also know that if you use the symbols-on-paper representation to work it out, you'll come to the exact same result. Even though in the computer, the $42$ will be represented by the binary digit string $101010$ (or a corresponding pattern of three charged and three uncharged capacitors), and on your paper the same number will be represented by a pattern of lines representing the digit 4 followed by the digit 2.
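The arithmetic analogy can be demonstrated in a couple of lines: the same multiplication carried out on two different representations of the integers (decimal strings and binary strings) names the same number either way.

```python
# The same multiplication, viewed through two representations of the integers.
x, y = 6, 7
result = x * y

print(result)         # the decimal representation: 42
print(bin(result))    # the binary representation: 0b101010

# Both digit strings denote the same integer.
assert result == int("42", 10) == int("101010", 2)
```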

Answer:

Suppose that $\mathbb R^3$ throws a masquerade ball. Everyone puts on a costume, so $(2,3,5)$ looks like $2 + 3x + 5x^2$. Everyone looks different, but secretly everything is the same. It's still the same people and the same relationships. Previously we would say that $(2,3,5) + (1,2,3) = (3,5,8)$. Now, dressed up in costumes, we say that $2 + 3x + 5x^2 + 1 + 2x + 3x^2 = 3 + 5x + 8x^2$. But once you know how to take off the costumes, you see that nothing has changed.

An isomorphism tells you how to take the masks off, revealing that everything is the same.