Given a set of vectors $ {\left\{ \boldsymbol{y}_{j} \right\}}_{j = 1}^{n} $, where $ \boldsymbol{y}_{j} \in \mathbb{R}^{m} $, find the minimizer $ \boldsymbol{x} \in \mathbb{R}^{m} $ of the following convex optimization problem:
$$ \arg \min_{\boldsymbol{x}} \sum_{j = 1}^{n} {\left\| \boldsymbol{x} - \boldsymbol{y}_{j} \right\|}_{1} $$
If one concatenates all the vectors $ {\left\{ \boldsymbol{y}_{j} \right\}}_{j = 1}^{n} $ as the columns of the matrix $ Y \in \mathbb{R}^{m \times n} $, one can write the equivalent problem:
$$ \arg \min_{\boldsymbol{x}} {\left\| \boldsymbol{x} \boldsymbol{1}_{n}^{T} - Y \right\|}_{1, 1} $$
Where $ {\left\| \cdot \right\|}_{1, 1} $ is the Entry Wise Matrix Norm, namely the sum of the absolute values of all entries.
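As a quick sanity check of the equivalence, the following NumPy sketch (variable names are mine) evaluates both forms of the objective on random data and confirms they agree:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 7
Y = rng.normal(size=(m, n))      # columns are the vectors y_j
x = rng.normal(size=m)

# Original objective: sum of L1 distances to each column
obj_sum = sum(np.abs(x - Y[:, j]).sum() for j in range(n))

# Matrix form: entry-wise (1,1) norm of x * 1^T - Y
obj_mat = np.abs(np.outer(x, np.ones(n)) - Y).sum()

assert np.isclose(obj_sum, obj_mat)
```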
Taking the derivative (Sub Gradient) of $$ {\left\| \boldsymbol{x} \boldsymbol{1}_{n}^{T} - Y \right\|}_{1, 1} $$ will yield:
$$ \frac{\partial }{\partial \boldsymbol{x}} {\left\| \boldsymbol{x} \boldsymbol{1}_{n}^{T} - Y \right\|}_{1, 1} = \operatorname{sign} \left( \boldsymbol{x} \boldsymbol{1}_{n}^{T} - Y \right) \boldsymbol{1}_{n} $$
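One can verify numerically that this Sub Gradient vanishes at the row-wise median. The sketch below (my own check, not from the linked repository) uses an odd $ n $ so that each row has $ \left( n - 1 \right) / 2 $ entries below the median, $ \left( n - 1 \right) / 2 $ above, and one exactly at it, making the sum of signs exactly zero:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 7                       # odd n: the median is a data point in each row
Y = rng.normal(size=(m, n))
x = np.median(Y, axis=1)

# Sub Gradient formula from above: sign(x 1^T - Y) 1
g = np.sign(x[:, None] - Y) @ np.ones(n)

# Equal counts of -1 and +1 per row, plus one 0, sum to zero
assert np.allclose(g, 0.0)
```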
Where $ \operatorname{sign} \left( \cdot \right) $ is the element-wise Sign Function.
Since the problem is convex, the minimum must be at a stationary point, namely a point where the zero vector belongs to the Sub Gradient. Moreover, the objective is separable across coordinates, $ \sum_{j = 1}^{n} {\left\| \boldsymbol{x} - \boldsymbol{y}_{j} \right\|}_{1} = \sum_{i = 1}^{m} \sum_{j = 1}^{n} \left| {x}_{i} - {y}_{i, j} \right| $, so each coordinate can be optimized independently. Hence (in a similar manner to The Median Minimizes the Sum of Absolute Deviations (The L1 Norm)) the value of $ {x}_{i} $ must be the median of the values $ \left\{ {y}_{i, 1}, {y}_{i, 2}, \ldots, {y}_{i, n} \right\} $.
It is nice to see that in this case taking the median of each row of $ Y $ yields an optimal solution (which is not necessarily unique, for instance when $ n $ is even).
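The whole result can be checked empirically. The following sketch (names mine) computes the row-wise median and confirms that random perturbations of it never decrease the objective:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 9
Y = rng.normal(size=(m, n))

def objective(x, Y):
    """Sum of L1 distances from x to the columns of Y."""
    return np.abs(x[:, None] - Y).sum()

x_opt = np.median(Y, axis=1)      # row-wise median

# The median should never do worse than perturbed candidates
for _ in range(1000):
    x_pert = x_opt + rng.normal(scale=0.5, size=m)
    assert objective(x_opt, Y) <= objective(x_pert, Y) + 1e-12
```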
A MATLAB code, including validation using CVX, can be found in my StackExchange Mathematics Q3566493 GitHub Repository.