Can I do the following for matrices:
$$ A = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \stackrel{?}{=} 2^{1/3}\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = 2^{1/3}B $$
I know I can do the following for matrices:
$$ \text{If} \quad A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{then} \quad 2A = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix} $$
and following for determinants:
$$ |A| = \begin{vmatrix} 2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{vmatrix} = 2\begin{vmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{vmatrix} = 2|B| \quad \checkmark$$
Nope! $2^{1/3}A$ means you multiply every entry of $A$ by $2^{1/3}$. That's because $2^{1/3}$ is just a number ($\sqrt[3]{2}$, in fact), the same as $2$ is.
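A quick numpy check makes the point concrete (assuming NumPy is available): the two matrices in the question are not equal, even though their determinants happen to agree.

```python
import numpy as np

A = np.diag([2.0, 1.0, 1.0])   # the matrix in the question
B = np.eye(3)                  # the identity matrix
C = 2**(1/3) * B               # scalar multiplication scales EVERY entry

# A and 2^{1/3} B are different matrices...
print(np.allclose(A, C))   # False

# ...even though det(A) = 2 and det(C) = (2^{1/3})^3 = 2 coincide.
print(np.isclose(np.linalg.det(A), np.linalg.det(C)))   # True
```

So equality of determinants is exactly what the cube root was capturing, but it does not make the matrices themselves equal.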
Of course, you could come up with a specific system of notation indicating that you multiply a certain row or column (it should be clear which) by a certain number... I don't know, maybe $$2^{]r1[}\cdot A$$ (I'm just making something up). If you define something like that, it's unambiguous, and you state the definition clearly... well, then you can. But only because you say so.
By the way, I don't recall any standard notation like that... but who knows.
Regarding determinants... you can do it because it's a property of determinants. I mean, it's a theorem; it has been proved. To put it briefly: when you calculate the determinant of an $n\times n$ matrix, you end up adding and subtracting several terms ($n!$ terms, in fact), each one a product of $n$ elements of the matrix; but these products are always organized so that each contains exactly one element of each row and, at the same time, one element of each column.
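That sum over $n!$ signed products is the Leibniz formula, and it's short enough to implement directly and compare against numpy's determinant (the function name `leibniz_det` is just my label for this sketch):

```python
import itertools
import math
import numpy as np

def leibniz_det(M):
    """Determinant as a signed sum over all n! permutations.

    Each term is a product of n entries, taken with exactly one
    entry from each row and one from each column."""
    n = len(M)
    total = 0.0
    for perm in itertools.permutations(range(n)):
        # sign of the permutation: +1 if even, -1 if odd
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        term = sign
        for row, col in enumerate(perm):
            term *= M[row][col]   # one entry per row and per column
        total += term
    return total

M = np.random.rand(3, 3)
print(math.factorial(3))                              # 6 terms for n = 3
print(np.isclose(leibniz_det(M), np.linalg.det(M)))   # True
```

This brute-force version is only practical for tiny matrices ($n!$ grows fast), but it shows exactly where the row-scaling property comes from.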
Then, if you multiply the 2nd column of $A$ by $4$, you won't get the matrix $4A$, but it is true that the determinant of this new matrix will be $4$ times the determinant of $A$, since the effect of that multiplication is to insert a factor of $4$ into each of the $n!$ terms you add and subtract to calculate the determinant.
For instance, if $A$ is a $3\times 3$ matrix, then $$\det(A)= a_{11}a_{22}a_{33}+a_{21}a_{32}a_{13}+a_{31}a_{12}a_{23}-a_{13}a_{22}a_{31}-a_{11}a_{23}a_{32}-a_{12}a_{21}a_{33}$$ (check that there are in fact $3!=6$ terms and that in each term every row [first index] and every column [second index] appears once and only once.)
So if you now create a matrix $B$ by multiplying, say, the third row of $A$ by $\lambda$ (this affects $a_{31}$, $a_{32}$ and $a_{33}$), then $$\begin{aligned}\det(B)&= a_{11}a_{22}(\lambda a_{33})+a_{21}(\lambda a_{32})a_{13}+(\lambda a_{31})a_{12}a_{23}-a_{13}a_{22}(\lambda a_{31})-a_{11}a_{23}(\lambda a_{32})-a_{12}a_{21}(\lambda a_{33})\\&=\lambda(a_{11}a_{22}a_{33}+a_{21}a_{32}a_{13}+a_{31}a_{12}a_{23}-a_{13}a_{22}a_{31}-a_{11}a_{23}a_{32}-a_{12}a_{21}a_{33})\\&=\lambda \det(A).\end{aligned}$$
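The same row-scaling fact is easy to confirm numerically (a minimal sketch, assuming NumPy; the seed and $\lambda=4$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((3, 3))
lam = 4.0

B = A.copy()
B[2, :] *= lam   # multiply only the third row by lambda

# B is NOT lam*A entrywise...
print(np.allclose(B, lam * A))   # False

# ...but its determinant picks up exactly one factor of lambda.
print(np.isclose(np.linalg.det(B), lam * np.linalg.det(A)))   # True
```

Scaling the whole matrix instead would give $\det(\lambda A)=\lambda^n\det(A)$, one factor of $\lambda$ per row, which is precisely why the question's $2^{1/3}$ works for determinants but not for the matrices themselves.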