1) Determine the matrix that rotates the points in the plane counter-clockwise through an angle of $\pi/2$ about the origin, and at the same time expands the points to four times the distance from the origin.
2) Is this equivalent to a rotation followed by an expansion? Is this equivalent to an expansion followed by a rotation? Justify this from the general (algebraic) viewpoint, and also explain it from the geometric viewpoint.
Can you please explain this to me?
My preferred trick for tackling this would be the following:
Let $T: \mathbb{R}^n \rightarrow \mathbb{R}^n$ be a linear transformation represented by the matrix $A$. Let $\{ \mathbf{e}_k \}_{k=1}^n$ be the set of standard basis vectors for $\mathbb{R}^n$, which are all zero except for a $1$ in the $k^\text{th}$ position. $A$ has as its $k^\text{th}$ column vector the image of $\mathbf{e}_k$ under $T$ (**see end of post for further discussion).
For example, consider the linear transformation $\mathbb{R}^2 \rightarrow \mathbb{R}^2$ defined by a $\pi/2$ rotation clockwise about the origin. You can check that $\langle 1,0 \rangle \mapsto \langle 0,-1 \rangle$ and that $\langle 0,1 \rangle \mapsto \langle 1,0 \rangle$. Therefore, the matrix representation for this transformation is $A = \left[ \begin{array}{cc} 0 & 1 \\ -1 & 0 \end{array} \right]$. Feel free to multiply $A$ by some "test" vectors to see that doing so really does output the $\pi/2$ clockwise rotations of them.
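If you'd like to run the "test vector" check numerically, here is a minimal sketch in plain Python (the helper name `apply` is mine, not from the post):

```python
def apply(A, v):
    """Multiply a 2x2 matrix A (list of rows) by a 2-vector v."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

# Columns are the images of e1 and e2 under the clockwise pi/2 rotation.
A_cw = [[0, 1],
        [-1, 0]]

print(apply(A_cw, [1, 0]))  # e1 -> [0, -1]
print(apply(A_cw, [0, 1]))  # e2 -> [1, 0]
print(apply(A_cw, [2, 3]))  # a "test" vector -> [3, -2]
```

Geometrically, a clockwise quarter turn sends $(x, y)$ to $(y, -x)$, which is exactly what the last line shows.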
Given this technique, what will be the matrix that represents the expansion? The rotation composed with the expansion? The expansion composed with the rotation? In particular, as a shortcut, one can show that, if $S, T: \mathbb{R}^n \rightarrow \mathbb{R}^n$ are linear transformations represented by the matrices $A$ and $B$ respectively, then the matrix that represents $T \circ S$ (first applying $S$, then applying $T$) is the product $BA$ (note the order: the matrix of the transformation applied first sits on the right).
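Here is a small sketch checking that the matrix product really does represent the composition, using the problem's transformations (plain Python; the names `R`, `E`, `matmul` are illustrative):

```python
def matmul(B, A):
    """Product BA of two 2x2 matrices (B applied after A)."""
    return [[sum(B[i][k] * A[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

R = [[0, -1],
     [1, 0]]   # counter-clockwise pi/2 rotation (the S above, matrix A)
E = [[4, 0],
     [0, 4]]   # expansion by a factor of 4 (the T above, matrix B)

v = [1, 2]
# (T o S)(v) computed two ways agrees:
print(apply(matmul(E, R), v))   # via the single product BA
print(apply(E, apply(R, v)))    # rotate first, then expand
```

Both lines print the same vector, which is the point of the shortcut: one matrix product encodes the whole two-step process.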
Indeed, you'll find that expansion composed with rotation is the same as rotation composed with expansion. This happens only because we are in a special case where one of the matrices is a scalar multiple of the identity matrix (which commutes with every matrix). But what of general linear transformations? I encourage you to find examples where $T \circ S \neq S \circ T$.
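For a concrete instance of non-commuting transformations, one assumption-free classic is a rotation against a shear; the sketch below (plain Python, illustrative names) shows the two products differ:

```python
def matmul(B, A):
    """Product BA of two 2x2 matrices (B applied after A)."""
    return [[sum(B[i][k] * A[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

R = [[0, -1],
     [1, 0]]   # counter-clockwise pi/2 rotation
H = [[1, 1],
     [0, 1]]   # horizontal shear

print(matmul(R, H))  # shear, then rotate
print(matmul(H, R))  # rotate, then shear -- a different matrix!
```

Since the two products disagree, the order in which you apply the transformations matters in general; the expansion in this problem is the exception, not the rule.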
**The reason this works is as follows:
Any vector $\mathbf{v} = \langle c_1, c_2, \cdots, c_n \rangle$ in $\mathbb{R}^n$ can be written as $\displaystyle \mathbf{v} = \sum_{k=1}^n c_k \mathbf{e}_k$. Now suppose we have a linear transformation $T: \mathbb{R}^n \rightarrow \mathbb{R}^n$ and we want to know where $T$ sends $\mathbf{v}$. By linearity, we have: $$T( \mathbf{v}) = T \left( \sum_{k=1}^n c_k \mathbf{e}_k \right) = \sum_{k=1}^n c_k T( \mathbf{e}_k)$$
If we construct the matrix $A$ so as to have $T( \mathbf{e}_k)$ as its $k^\text{th}$ column, and we have a vector $\mathbf{v} = \langle c_1, c_2, \cdots, c_n \rangle$, then one can check that $A \mathbf{v} = c_1T( \mathbf{e}_1) + c_2 T( \mathbf{e}_2) + \cdots + c_n T( \mathbf{e}_n)$, as desired.
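You can see this identity numerically for the $2 \times 2$ rotation example above: the matrix-vector product equals the same linear combination of the columns (plain Python sketch, names illustrative):

```python
A = [[0, 1],
     [-1, 0]]        # clockwise pi/2 rotation; columns are T(e1), T(e2)
v = [5, 7]           # so c1 = 5, c2 = 7

col1 = [A[0][0], A[1][0]]   # T(e1)
col2 = [A[0][1], A[1][1]]   # T(e2)

# c1*T(e1) + c2*T(e2) ...
combo = [v[0] * col1[i] + v[1] * col2[i] for i in range(2)]
# ... versus the usual row-by-column product A v
Av = [A[i][0] * v[0] + A[i][1] * v[1] for i in range(2)]

print(combo, Av)  # the two computations give the same vector
```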