Let $M_3$ be the $\mathbb{C}$-vector space of $3\times 3$ matrices with complex entries. What is the dimension of a subspace $V$ of $M_3$ satisfying both of the following conditions?
(a) $AB=BA$ if $A,B\in V$.
(b) If $W$ is a subspace of $M_3$ satisfying $W\supset V$ and $W \neq V$, then there exist $A,B\in W$ such that $AB\neq BA$.
Let $V$ be a space with the given properties.
If the dimension of $V$ is zero, we immediately have a contradiction: take for $W$ the line spanned by $I$, say.
In fact, without loss of generality we may assume (and do) that $V$ contains the subspace $\Bbb C I$: since $I$ commutes with every matrix, $V+\Bbb C I$ still satisfies condition (a), so condition (b) forces $I\in V$. We always choose a basis starting with $I$.
If the dimension of $V$ is one, with basis $I$, then pass to $W=\Bbb D$, where $\Bbb D$ is here (and in the sequel) the space of diagonal matrices. Any two matrices in $\Bbb D$ commute. Contradiction.
If the dimension of $V$ is two, with basis $I,A$, then $A$ has a Jordan normal form; using the base change matrix, we may assume that $A$ is already in Jordan form. If $A$ is diagonal, take $W=\Bbb D$ again, contradiction. Else consider the space spanned by $I$, $A$, and $A^2$. Any two matrices in this space commute, and its dimension is three whenever $A^2\notin\operatorname{span}\{I,A\}$, which holds unless the Jordan form of $A$ consists of a $2\times 2$ block and a $1\times 1$ block with the same eigenvalue $\lambda$. In that remaining case $A-\lambda I=E_{12}$ (the matrix unit), and we extend with $E_{13}$ instead, which commutes with both $I$ and $E_{12}$. Contradiction.
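The independence claim can be sanity-checked numerically. A minimal sketch using numpy (the single Jordan block with eigenvalue $2$ is an illustrative choice): $I$, $A$, $A^2$ pairwise commute, and stacking their entries as rows gives a rank-$3$ matrix.

```python
import numpy as np

# Illustrative choice: a single 3x3 Jordan block with eigenvalue 2.
A = np.array([[2., 1., 0.],
              [0., 2., 1.],
              [0., 0., 2.]])
I = np.eye(3)

# Powers of one matrix always commute with each other.
assert np.allclose(A @ (A @ A), (A @ A) @ A)

# I, A, A^2 are linearly independent: flatten each to a row, check the rank.
M = np.stack([I.ravel(), A.ravel(), (A @ A).ravel()])
print(np.linalg.matrix_rank(M))  # 3
```
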
If the dimension of $V$ is three we declare ourselves happy with the situation. (Such a space exists, for instance $\Bbb D$.)
If the dimension of $V$ is four, we split into cases. If every matrix in $V$ is diagonalizable, we choose four of them forming a basis; since they commute pairwise, there exists a simultaneous base change (conjugation, a linear isomorphism from $M_3$ to $M_3$) that makes all of them diagonal, so $V$ lands isomorphically inside $\Bbb D$, a space of dimension three, contradiction. So $V$ contains a non-diagonalizable matrix, $A$, say. After a base change, we can and do assume $A$ is already in Jordan block format.
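The simultaneous diagonalization step can be illustrated concretely: conjugating two diagonal matrices by the same invertible $P$ yields a commuting pair that a single base change diagonalizes at once. A sketch (the random base change `P` and the eigenvalues are illustrative choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))   # a random (numerically invertible) base change
Pinv = np.linalg.inv(P)

# Two commuting diagonalizable matrices: conjugates of diagonals by the same P.
A = P @ np.diag([1., 2., 3.]) @ Pinv
B = P @ np.diag([4., 5., 6.]) @ Pinv
assert np.allclose(A @ B, B @ A)

# One base change diagonalizes both simultaneously.
assert np.allclose(Pinv @ A @ P, np.diag([1., 2., 3.]))
assert np.allclose(Pinv @ B @ P, np.diag([4., 5., 6.]))
```
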
We have several cases, depending on the block structure of $A$.
First, let us consider the case of a single block, $$A= \begin{bmatrix} \lambda & 1 & \\ & \lambda &1 \\ & & \lambda \end{bmatrix} \ . $$ (Missing entries are zero.) Since $I$ is in $V$, we may consider $A-\lambda I$ instead of $A$, so we can reduce to the case $$A= \begin{bmatrix} 0 & 1 & \\ & 0 &1 \\ & & 0 \end{bmatrix} \ . $$ Then $I,A,A^2$ are linearly independent and pairwise commuting, thus in $V$: otherwise we could extend $V$ by $A^2$ (any matrix commuting with $A$ also commutes with $A^2$), contradicting condition (b). Which matrices $X$ commute with $A$? Let $X$ be such a matrix; we compute: $$ \begin{aligned} \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} \begin{bmatrix} 0 & 1 & \\ & 0 &1 \\ & & 0 \end{bmatrix} &= \begin{bmatrix} 0 & x_{11} & x_{12} \\ 0 & x_{21} & x_{22} \\ 0 & x_{31} & x_{32} \end{bmatrix} \ , \\ \begin{bmatrix} 0 & 1 & \\ & 0 &1 \\ & & 0 \end{bmatrix} \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} &= \begin{bmatrix} x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \\ 0 & 0 & 0 \end{bmatrix} \ , \end{aligned} $$ (multiplying by $A$ on the right shifts the columns to the right; multiplying on the left shifts the rows upwards), so the entries below the diagonal are zero, $x_{21}=x_{31}=x_{32}=0$, the diagonal entries coincide, $x_{11}=x_{22}=x_{33}$, and so do those on the "next diagonal", $x_{12}=x_{23}$. So any such $X$ already lies in the space spanned by $I, A, A^2$, contradicting the fourth dimension supposed to be in $V$.
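This elimination can be double-checked mechanically: the matrices commuting with $A$ form the kernel of the linear map $X\mapsto XA-AX$ on $M_3\cong\Bbb C^9$, so the commutant has dimension $9$ minus the rank of that map. A minimal numerical sketch (the helper name `commutant_dim` is mine, not part of the argument):

```python
import numpy as np

def commutant_dim(A):
    """Dimension of {X : X A == A X} inside M_3, computed as the kernel
    dimension of the linear map X -> X A - A X on the 9-dimensional M_3."""
    cols = []
    for i in range(3):
        for j in range(3):
            E = np.zeros((3, 3))
            E[i, j] = 1.0                       # basis matrix E_ij of M_3
            cols.append((E @ A - A @ E).ravel())
    return 9 - np.linalg.matrix_rank(np.array(cols).T)

# The nilpotent single Jordan block from the text.
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
print(commutant_dim(A))  # 3 -- the commutant is exactly span{I, A, A^2}
```

As a cross-check, `commutant_dim(np.eye(3))` returns `9`, since everything commutes with $I$.
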
Next, let us consider the case of a $2\times 2$ block and a $1\times 1$ block with different eigenvalues,
$$A= \begin{bmatrix} \lambda & 1 & \\ & \lambda & \\ & & \lambda' \end{bmatrix} \ ,\ \lambda\ne \lambda'\ . $$ Since $I$ is in $V$, we consider instead of $A$ the matrix $A-\lambda I$, so we can reduce ourselves to the case $$A= \begin{bmatrix} 0 & 1 & \\ & 0 & \\ & & \mu \end{bmatrix} \ ,\ \mu=\lambda'-\lambda\ne 0\ . $$ Then $I,A,A^2$ are linearly independent and pairwise commuting, thus in $V$ (otherwise extend, as before). The matrix $A^2$ has only one non-zero entry, $\mu^2$ at position $(3,3)$; subtracting a suitable multiple of it from $A$ clears the $(3,3)$-entry, so $V$ contains the matrices $$ I\ ,\ B= \begin{bmatrix} 0 & 1 & \\ & 0 & \\ & & 0 \end{bmatrix} \ , C= \begin{bmatrix} 0 & & \\ & 0 & \\ & & 1 \end{bmatrix} \ . $$ A matrix $X$ commuting with $C$ is of the shape $$ \begin{bmatrix} * & * & \\ * & * & \\ & & * \end{bmatrix} $$ and the condition that the upper left $2\times 2$ block commutes with $B$ shows (by a computation similar to the one above) that $X$ is in fact a linear combination of $I, B, C$. Again there is no room for a fourth dimension.
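The same rank idea confirms the joint commutant here: the matrices commuting with both $B$ and $C$ form a $3$-dimensional space, namely $\operatorname{span}\{I,B,C\}$. A sketch (the helper name `joint_commutant_dim` is mine):

```python
import numpy as np

def joint_commutant_dim(mats):
    """Dimension of {X in M_3 : X M == M X for all M in mats}: the common
    kernel of the maps X -> X M - M X, stacked into one linear system."""
    cols = []
    for i in range(3):
        for j in range(3):
            E = np.zeros((3, 3))
            E[i, j] = 1.0
            cols.append(np.concatenate([(E @ M - M @ E).ravel() for M in mats]))
    return 9 - np.linalg.matrix_rank(np.array(cols).T)

B = np.zeros((3, 3)); B[0, 1] = 1.0   # the matrix B from the text
C = np.zeros((3, 3)); C[2, 2] = 1.0   # the matrix C from the text
print(joint_commutant_dim([B, C]))    # 3 -- exactly span{I, B, C}
```
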
Finally, the longer case, where we have less to work with than in the last one: a $2\times 2$ block and a $1\times 1$ block with equal eigenvalues,
$$A= \begin{bmatrix} \lambda & 1 & \\ & \lambda & \\ & & \lambda \end{bmatrix} \ . $$ Since $I$ is in $V$, we consider instead of $A$ the matrix $$ B= \begin{bmatrix} 0 & 1 & \\ & 0 & \\ & & 0 \end{bmatrix} \ . $$ From $$ \begin{aligned} \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} \begin{bmatrix} 0 & 1 & \\ & 0 & \\ & & 0 \end{bmatrix} &= \begin{bmatrix} 0 & x_{11} & 0 \\ 0 & x_{21} & 0 \\ 0 & x_{31} & 0 \end{bmatrix} \ , \\ \begin{bmatrix} 0 & 1 & \\ & 0 & \\ & & 0 \end{bmatrix} \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} &= \begin{bmatrix} x_{21} & x_{22} & x_{23} \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \ , \end{aligned} $$ we get, after replacing $X$ by $X-x_{11}I$, that $X$ lies in the space of matrices of the shape $$ \begin{bmatrix} 0 & * & * \\ 0 & 0 & 0 \\ 0 & * & * \end{bmatrix} \ . $$ Using $B$ to clear the $(1,2)$-entry, we find a two-dimensional subspace of $V$ (a complement of $\operatorname{span}\{I,B\}$ in $V$) inside $$ \begin{bmatrix} 0 & 0 & * \\ 0 & 0 & 0 \\ 0 & * & * \end{bmatrix} \ . $$ Then, using this two-dimensionality, there exists a matrix $F\ne 0$ in $V$ of the shape $$ \begin{bmatrix} 0 & 0 & * \\ 0 & 0 & 0 \\ 0 & \boxed{0} & * \end{bmatrix} \ . $$ If the upper right corner of $F$ can be taken to be $0$, then, after scaling, $F=C$ from the last case, and we get the contradiction from there. Else, we can take $F$ of the form $$ \begin{bmatrix} 0 & 0 & \boxed 1 \\ 0 & 0 & 0 \\ 0 & \boxed{0} & a \end{bmatrix} \ . 
$$ Again we are searching for matrices commuting with the above $F$, $$ \begin{aligned} \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} \begin{bmatrix} 0 & & 1\\ & 0 & \\ & & a \end{bmatrix} &= \begin{bmatrix} 0 & 0 & x_{11} + ax_{13} \\ 0 & 0 & x_{21} + ax_{23} \\ 0 & 0 & x_{31} + ax_{33} \\ \end{bmatrix} \ , \\ \begin{bmatrix} 0 & & 1\\ & 0 & \\ & & a \end{bmatrix} \begin{bmatrix} x_{11} & x_{12} & x_{13} \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} &= \begin{bmatrix} x_{31} & x_{32} & x_{33} \\ 0 & 0 & 0 \\ ax_{31} & ax_{32} & ax_{33} \end{bmatrix} \ , \end{aligned} $$ and to simplify things: using $I$ (already in $V$), we replace $X$ by $X-x_{11}I$, so assume $x_{11}=0$; then using $B$ (already in $V$), we replace $X$ by $X-x_{12}B$, so assume $x_{12}=0$; and then using $F$ (already in $V$), we replace $X$ by $X-x_{13}F$, so assume $x_{13}=0$. So we have instead the situation: $$ \begin{aligned} \begin{bmatrix} 0 & 0 & 0 \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} \begin{bmatrix} 0 & & 1\\ & 0 & \\ & & a \end{bmatrix} &= \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & x_{21} + ax_{23} \\ 0 & 0 & x_{31} + ax_{33} \\ \end{bmatrix} \ , \\ \begin{bmatrix} 0 & & 1\\ & 0 & \\ & & a \end{bmatrix} \begin{bmatrix} 0 & 0 & 0 \\ x_{21} & x_{22} & x_{23} \\ x_{31} & x_{32} & x_{33} \end{bmatrix} &= \begin{bmatrix} x_{31} & x_{32} & x_{33} \\ 0 & 0 & 0 \\ ax_{31} & ax_{32} & ax_{33} \end{bmatrix} \ . \end{aligned} $$ This implies that the third row is zero. The second row is zero because of the condition of commuting with $B$, see the pattern above. So once more there is no place for a fourth dimension in $V$.
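The conclusion of this last case can be checked the same way: for a fixed value of $a$ (below $a=2$, an illustrative choice), the joint commutant of $B$ and $F$ is $3$-dimensional, namely $\operatorname{span}\{I,B,F\}$. A sketch reusing the rank idea (the helper name `joint_commutant_dim` is mine):

```python
import numpy as np

def joint_commutant_dim(mats):
    """Dimension of {X in M_3 : X M == M X for all M in mats}: the common
    kernel of the maps X -> X M - M X, stacked into one linear system."""
    cols = []
    for i in range(3):
        for j in range(3):
            E = np.zeros((3, 3))
            E[i, j] = 1.0
            cols.append(np.concatenate([(E @ M - M @ E).ravel() for M in mats]))
    return 9 - np.linalg.matrix_rank(np.array(cols).T)

B = np.zeros((3, 3)); B[0, 1] = 1.0       # the matrix B from the text
F = np.array([[0., 0., 1.],               # the matrix F from the text,
              [0., 0., 0.],               # with the illustrative value a = 2
              [0., 0., 2.]])
print(joint_commutant_dim([B, F]))        # 3 -- exactly span{I, B, F}
```
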
$\blacksquare$
(Note: I tried to avoid notions like (maximal) torus, parabolic subgroup, etc.; please search the net for them if this problem marks the beginning of a study of linear algebraic groups.)