Let \begin{equation} \mathbf{y}=\mathbf{X}\boldsymbol{\beta}+\mathbf{u} \end{equation} where
$\mathbf{y}=\begin{bmatrix}y_1 \\ \vdots \\ y_n\end{bmatrix}$,
$\mathbf{X}=\begin{bmatrix}X_{11} & \cdots & X_{1k} \\ \vdots & \ddots & \vdots \\ X_{n1} & \cdots & X_{nk}\end{bmatrix}$, also written $\mathbf{X}=\begin{bmatrix}\mathbf{x}_1^{\top} \\ \vdots \\ \mathbf{x}_n^{\top}\end{bmatrix}$ (where each $\mathbf{x}_i$ is a $k \times 1$ vector, so each row of $\mathbf{X}$ is $\mathbf{x}_i^{\top}$),
$\boldsymbol{\beta}=\begin{bmatrix}\beta_1 \\ \vdots \\ \beta_k\end{bmatrix}$, and
$\mathbf{u}=\begin{bmatrix}u_1 \\ \vdots \\ u_n\end{bmatrix}$.
Suppose that $E(\mathbf{u}|\mathbf{X})=\mathbf{0}$. It is clear this implies that $E(\mathbf{u})=\mathbf{0}$ since by the Law of Iterated Expectations, \begin{equation} E\left(\mathbf{u}\right)=E\left[E\left(\mathbf{u}\middle\vert\mathbf{X}\right)\right]=E\left(\mathbf{0}\right)=\mathbf{0}. \end{equation}
However, I want to show a similar result for the covariances, i.e. show that $E(\mathbf{u}|\mathbf{X})=\mathbf{0}$ implies $Cov(\mathbf{u},\mathbf{x})=\mathbf{0}$ for every column $\mathbf{x}$ of the matrix $\mathbf{X}$. My attempt:
Let $\mathbf{x}$ denote a column of the matrix $\mathbf{X}$. Then,
\begin{align} Cov\left(\mathbf{u},\mathbf{x}\right)&=E\left[\left(\mathbf{u}-E(\mathbf{u})\right)\left(\mathbf{x}-E(\mathbf{x})\right)^{\top}\right] \\ &=E\left(\mathbf{u}\mathbf{x}^{\top}\right)-E\left(\mathbf{u}\right)E\left(\mathbf{x}^{\top}\right) \\ &=E\left(\mathbf{u}\mathbf{x}^{\top}\right) \\ &=E\left[E\left(\mathbf{u}\mathbf{x}^{\top}\middle\vert\mathbf{X}\right)\right] \\ &=E\left[E\left(\mathbf{u}\middle\vert\mathbf{X}\right)\mathbf{x}^{\top}\right] \\ &=E\left[\mathbf{0}\cdot\mathbf{x}^{\top}\right] \\ &=\mathbf{0}_{n\times n} \end{align}
where the fifth line uses the fact that $\mathbf{x}$, being a column of $\mathbf{X}$, is a function of $\mathbf{X}$ and so can be taken outside the conditional expectation.
Is this argument correct?
Since $E(\mathbf{u}) = \mathbf{0}$, we have $Cov(\mathbf{x},\mathbf{u}) = E(\mathbf{x}\mathbf{u}^{\top})=E\left[E(\mathbf{x}\mathbf{u}^{\top}|\mathbf{X})\right]$. Now, $E(x_i u_j|\mathbf{X})=x_i\,E(u_j|\mathbf{X})=0$ for every $i,j$, since each $x_i$ is a function of $\mathbf{X}$. Hence the matrix $E(\mathbf{x}\mathbf{u}^{\top}|\mathbf{X}) = \mathbf{0}$, so $Cov(\mathbf{x},\mathbf{u})=\mathbf{0}$. Hence, each column of $\mathbf{X}$ is uncorrelated with $\mathbf{u}$.
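As a quick numerical sanity check (a sketch, assuming NumPy; the variable names are mine), the claim can be simulated: draw $\mathbf{X}$, generate errors whose conditional mean given $\mathbf{X}$ is zero (here heteroskedastic, to show mean independence is all that is used), and verify the sample covariance between each column of $\mathbf{X}$ and $\mathbf{u}$ is near zero.

```python
# Monte Carlo check of: E(u | X) = 0 implies Cov(x, u) = 0 for each column x of X.
import numpy as np

rng = np.random.default_rng(0)
n, k = 100_000, 3

# Design matrix X (n x k) with independent standard normal entries.
X = rng.normal(size=(n, k))

# Errors with E(u | X) = 0 but variance depending on X (heteroskedastic),
# so only mean independence drives the result.
u = rng.normal(size=n) * (1.0 + X[:, 0] ** 2)

# Sample covariance between each column of X and u.
covs = np.array([np.cov(X[:, j], u)[0, 1] for j in range(k)])
print(covs)  # each entry should be close to 0
```

With $n = 100{,}000$ draws, each sample covariance should be within a few hundredths of zero; the heteroskedasticity does not matter because the derivation above only uses $E(\mathbf{u}|\mathbf{X})=\mathbf{0}$.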