I'm a little confused about how to handle the transpose (specifically, the transpose appearing in Hermitian conjugation) of a product of matrices when one factor contains operators and pre-multiplies the other.
For example, consider $$ \left( \begin{pmatrix} \partial_x & ... \\ \vdots & \end{pmatrix} \begin{pmatrix} \psi_1 \\ \vdots \end{pmatrix}\right)^T $$ When expanding inside the brackets, we get operations like $\partial_x \psi_1$.
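For concreteness (filling out a $2 \times 2$ case with some hypothetical scalar entries $a$, $b$ and a second operator $\partial_y$), the product inside the brackets expands as

$$ \begin{pmatrix} \partial_x & a \\ b & \partial_y \end{pmatrix} \begin{pmatrix} \psi_1 \\ \psi_2 \end{pmatrix} = \begin{pmatrix} \partial_x \psi_1 + a\,\psi_2 \\ b\,\psi_1 + \partial_y \psi_2 \end{pmatrix}, $$

so the transpose of the whole product ought to be the row vector $\begin{pmatrix} \partial_x \psi_1 + a\,\psi_2 & b\,\psi_1 + \partial_y \psi_2 \end{pmatrix}$.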
How now do we express the result of the transpose?
Thinking solely of matrices, I'd expect
$$ \begin{pmatrix} \psi_1 & \dots \end{pmatrix} \begin{pmatrix} \partial_x & ... \\ \vdots & \end{pmatrix}^T $$
but now, upon expansion, it appears as if nothing is being operated on (we find terms like $\psi_1 \partial_x$).
To be consistent with the result of expanding inside the transpose first, we know these operators must still act on the elements of the $\vec{\psi}$ vector. I've heard that in this scenario you can indicate the direction of operation with an arrow above the operator: $$ \begin{pmatrix} \psi_1 & \dots \end{pmatrix} \begin{pmatrix} \overleftarrow{\partial_x} & ... \\ \vdots & \end{pmatrix}^T $$ but that notation sits really poorly with me.
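(If I understand the arrow convention correctly, it is simply defined entrywise so that the two expansion orders agree, e.g.

$$ \psi_1\,\overleftarrow{\partial_x} \equiv \partial_x \psi_1, $$

which reproduces the right result, but only by fiat rather than by any matrix-algebraic reasoning.)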
I feel as if there's some more elegant way to express this which doesn't require introducing extra notation, to be gleaned by thinking about matrices of operators themselves and how they act on matrices of scalars.
So: can the "operator matrix" somehow be restored to pre-multiplication with the $\vec{\psi}$ vector after the transpose is applied?