Melvin Schwartz starts *Principles of Electrodynamics* with the matrix of rotations, on page 4:

Wikipedia says something similar but less thoroughly, so I will not discuss it.
Basically, the page says that we can expand the basis vectors as
$$\begin{bmatrix}i\\j\end{bmatrix} = \begin{bmatrix} a_{11} i' + a_{21} j' \\a_{12} i' + a_{22} j'\end{bmatrix}$$
and that it is obvious that
$$i\cdot i' = a_{11},\qquad i\cdot j' = a_{21},\qquad j\cdot i' = a_{12},\qquad j\cdot j' = a_{22}.$$
Why is this obvious? Where does it come from?
OK, let me exploit this fact and replace the coefficients of the matrix $A$ with these elementary dot products:
$$\begin{bmatrix}i\\j\end{bmatrix} = \begin{bmatrix} a_{11} i' + a_{21} j' \\a_{12} i' + a_{22} j'\end{bmatrix} = \begin{bmatrix} a_{11} & a_{21} \\a_{12} & a_{22} \end{bmatrix} \ \ \begin{bmatrix} i' \\ j' \end{bmatrix} = \begin{bmatrix} i\cdot i' & i\cdot j' \\j\cdot i' & j \cdot j' \end{bmatrix} \ \begin{bmatrix} i' \\ j' \end{bmatrix} = \begin{bmatrix}i\\j\end{bmatrix} \ \begin{bmatrix}i' & j'\end{bmatrix} \ \begin{bmatrix} i' \\ j' \end{bmatrix} = A \ \begin{bmatrix} i' \\ j' \end{bmatrix} $$
This looks very nice, but there is one problem when I do a check:
$$\begin{bmatrix}i\\j\end{bmatrix} = \begin{bmatrix}i\\j\end{bmatrix} \ \begin{bmatrix}i' & j'\end{bmatrix} \ \begin{bmatrix} i' \\ j' \end{bmatrix} = \begin{bmatrix}i\\j\end{bmatrix} \ \begin{bmatrix}i'\cdot i' + j' \cdot j'\end{bmatrix} = \begin{bmatrix}i\\j\end{bmatrix} \ \begin{bmatrix}1+1\end{bmatrix} = \mathbf{2} \ \begin{bmatrix}i\\j\end{bmatrix} \neq \begin{bmatrix}i\\j\end{bmatrix} $$
Do you see that a factor of two has appeared? Where is the mistake?
Your notation makes it look as if $i$, $j$ etc. were numbers. It's clear from the book's notation that they are in fact vectors. Thus your manipulations treating them as numbers are faulty. You can use the sort of notation that you're using if you know what you're doing; but then you have to keep track of which vectors are row vectors and which are column vectors. If you do that, you find that where you have $\hat i'\cdot\hat i'+\hat j'\cdot\hat j'$ you should have $\hat i'\hat i'^\top+\hat j'\hat j'^\top$, the sum of the projectors onto the two orthonormal vectors, and that sum is indeed the identity as required.
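The distinction is easy to verify numerically. Here is a minimal NumPy sketch (the 30° rotation angle is an arbitrary choice, not from the book): the sum of *outer* products of an orthonormal basis is the identity, while the sum of *inner* products is the scalar 2 — exactly the stray factor in the question.

```python
import numpy as np

# A primed orthonormal basis: the standard basis rotated by 30 degrees.
theta = np.pi / 6
ip = np.array([[np.cos(theta)], [np.sin(theta)]])   # i' as a column vector
jp = np.array([[-np.sin(theta)], [np.cos(theta)]])  # j' as a column vector

# Sum of OUTER products (projectors): i' i'^T + j' j'^T is the identity.
outer_sum = ip @ ip.T + jp @ jp.T
print(np.allclose(outer_sum, np.eye(2)))  # True

# Sum of INNER products (scalars): i'.i' + j'.j' = 1 + 1 = 2.
inner_sum = (ip.T @ ip + jp.T @ jp).item()
print(round(inner_sum, 12))  # 2.0
```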
The equations $\hat i\cdot\hat i'=a_{11}$ etc. are obtained by multiplying equations (1-2-1) through by $\hat i$, $\hat j$ and $\hat k$ and using the orthonormality of $\hat i'$, $\hat j'$ and $\hat k'$.
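As a numeric sketch of that dot-product step in 2D (again with an arbitrary rotation angle): dotting $\hat i = a_{11}\hat i' + a_{21}\hat j'$ with $\hat i'$ kills the $\hat j'$ term by orthonormality and leaves $a_{11} = \hat i\cdot\hat i'$, and likewise for the other coefficients.

```python
import numpy as np

theta = np.pi / 6  # arbitrary rotation angle
c, s = np.cos(theta), np.sin(theta)

# Primed orthonormal basis.
ip = np.array([c, s])
jp = np.array([-s, c])

# Unprimed basis vectors.
i = np.array([1.0, 0.0])
j = np.array([0.0, 1.0])

# Dotting i = a11 i' + a21 j' with i' (resp. j') isolates a11 (resp. a21),
# because i'.i' = j'.j' = 1 and i'.j' = 0.
a11, a21 = i @ ip, i @ jp
a12, a22 = j @ ip, j @ jp

# Reconstruct i and j from the coefficients to confirm the expansion.
print(np.allclose(a11 * ip + a21 * jp, i))  # True
print(np.allclose(a12 * ip + a22 * jp, j))  # True
```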
[Edit following up on the exchange in the comments:]
Here's a complete rendition of a working version of your "check":
\begin{align} \def\i{\hat i} \def\j{\hat j} \pmatrix{\i&\j} &= \pmatrix{a_{11}\i'+a_{21}\j'&a_{12}\i'+a_{22}\j'} \\ &= \pmatrix {\i'&\j'}\pmatrix{a_{11}&a_{12}\\a_{21}&a_{22}} \\ &= \pmatrix {\i'&\j'}\pmatrix{\i'^\top\i&\i'^\top\j\\\j'^\top\i&\j'^\top\j} \\ &= \pmatrix {\i'&\j'}\left[\pmatrix{\i'^\top\\\j'^\top}\pmatrix{\i&\j}\right] \\ &= \left[\pmatrix {\i'&\j'}\pmatrix{\i'^\top\\\j'^\top}\right]\pmatrix{\i&\j} \\ &= \left[\i'\i'^\top+\j'\j'^\top\right]\pmatrix{\i&\j} \\ &=\pmatrix{\i&\j}\;. \end{align}
You can also transpose the whole thing, but then you have to make sure to transpose everything on both levels, both the vectors $\i$ etc. and the vectors containing them. The problem with your version is that it's a mixture of the two, where you transposed the outer vectors but not the vectors $\i$ etc., and then the multiplications aren't well-defined.
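The whole corrected chain can be checked numerically. In this sketch (NumPy, arbitrary 30° rotation) the basis vectors are stacked as *rows* of a $2\times2$ matrix, which keeps every product well-defined and turns the relation into a genuine matrix equation:

```python
import numpy as np

theta = np.pi / 6
c, s = np.cos(theta), np.sin(theta)
ip, jp = np.array([c, s]), np.array([-s, c])   # primed basis i', j'
i, j = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# A built from the dot products, as in the question.
A = np.array([[i @ ip, i @ jp],
              [j @ ip, j @ jp]])

# Stacking the basis vectors as ROWS makes [i; j] = A [i'; j'] a valid
# matrix equation (each row reads i = a11 i' + a21 j', etc.).
B = np.vstack([i, j])      # rows are i, j
Bp = np.vstack([ip, jp])   # rows are i', j'
print(np.allclose(B, A @ Bp))  # True

# With the transposes kept straight, the middle factor of the failed
# "check" is the sum of projectors, i.e. the identity -- not 2.
P = np.outer(ip, ip) + np.outer(jp, jp)
print(np.allclose(P, np.eye(2)))  # True
```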