Given an $m \times n$ matrix $M$ ($m \geq n$), the nearest semi-orthogonal matrix problem in the $m \times n$ matrix variable $R$ is
$$\begin{array}{ll} \text{minimize} & \| M - R \|_F\\ \text{subject to} & R^T R = I_n\end{array}$$
A solution can be found via Lagrange multipliers or the polar decomposition and, for full column-rank $M$, is known to be
$$\hat{R} := M(M^TM)^{-1/2}$$
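For concreteness, this closed form is easy to verify numerically. The NumPy sketch below (variable names are my own) computes $\hat{R}$ via an eigendecomposition of $M^TM$ and confirms both that $\hat{R}^T\hat{R}=I_n$ and that $\hat{R}$ coincides with the orthogonal polar factor $U_sV^T$ obtained from the thin SVD $M=U_s\Sigma V^T$:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))            # random full-rank 5x3 matrix

# Closed-form Frobenius solution: R_hat = M (M^T M)^{-1/2}
w, V = np.linalg.eigh(M.T @ M)             # eigendecomposition of the SPD matrix M^T M
inv_sqrt = V @ np.diag(w ** -0.5) @ V.T    # (M^T M)^{-1/2}
R_hat = M @ inv_sqrt

# R_hat is semi-orthogonal ...
assert np.allclose(R_hat.T @ R_hat, np.eye(3))

# ... and equals the orthogonal polar factor U_s V^T from the thin SVD
Us, s, Vt = np.linalg.svd(M, full_matrices=False)
assert np.allclose(R_hat, Us @ Vt)
```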
If $\|\cdot\|_F$ is replaced by the entry-wise $1$-norm
$$\|A\|_1 := \|\operatorname{vec}(A)\|_1 = \sum_{i,j} |A_{i,j}|$$
the problem becomes
$$\boxed{\begin{array}{ll} \text{minimize} & \| M - R \|_1\\ \text{subject to} & R^T R = I_n\end{array}}$$
What do we know about the solutions in this case? Is $\hat{R}$ still a solution? If the solution is something else, do analytic forms or approximations exist? Any insight or direction to literature is appreciated.
Reformulate the problem by constructing the $R$ matrix in terms of an unconstrained matrix $U$ such that the constraint is always satisfied: $$R = U(U^TU)^{-1/2}\quad\implies\quad R^TR=I$$ Then optimize with respect to the new unconstrained variable $U$. Since $R$ is independent of $\|U\|$, controlling the growth of $\|U\|$ (via normalization) will be important for numerical algorithms.
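Both properties of this parametrization (feasibility for any full-rank $U$, and invariance under rescaling of $U$) can be checked directly; here is a small NumPy sketch, with `retract` as my name for the map $U\mapsto U(U^TU)^{-1/2}$:

```python
import numpy as np

def retract(U):
    """Map an unconstrained (full column-rank) U to R = U (U^T U)^{-1/2}."""
    w, V = np.linalg.eigh(U.T @ U)
    return U @ (V @ np.diag(w ** -0.5) @ V.T)

rng = np.random.default_rng(1)
U = rng.standard_normal((6, 2))
R = retract(U)

assert np.allclose(R.T @ R, np.eye(2))     # constraint R^T R = I always satisfied
assert np.allclose(retract(3.7 * U), R)    # R is independent of ||U||
```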
The (sub)gradient of the Manhattan norm of $X$ is given by $$\eqalign{ \mu &= \|X\|_1 \\ d\mu &= {\rm sign}(X):dX \;\;\doteq\;\; S:dX \\ }$$ where the colon denotes the trace/Frobenius product, i.e. $$A:B = {\rm Tr}(A^TB)$$ and the sign function $${\rm sign}(z) = \begin{cases} +{\tt1}\quad{\rm if}\;z\ge 0 \\ -{\tt1}\quad{\rm otherwise} \\ \end{cases}$$ is applied elementwise.
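A quick numerical sanity check of the identity $d\mu = S:dX$ at a generic point (no zero entries of $X$, so the subgradient is an ordinary gradient), assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((4, 3))
dX = 1e-6 * rng.standard_normal((4, 3))    # small perturbation, no sign changes

S = np.sign(X)                             # elementwise sign
lhs = np.abs(X + dX).sum() - np.abs(X).sum()   # d(mu) = ||X+dX||_1 - ||X||_1
rhs = np.sum(S * dX)                           # S : dX  (Frobenius product)
assert abs(lhs - rhs) < 1e-12
```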
Now calculate the differential of $R$ $$\eqalign{ B &= (U^TU)^{1/2} = B^T \qquad&\big\{{\rm square\,root}\big\} \\ B^2 &= U^TU \\ B\,dB+dB\,B &= U^TdU+dU^TU \qquad&\big\{{\rm differentials}\big\} \\ (I_n\otimes B+B\otimes I_n)\,db &= \Big(I_n\otimes U^T+(U^T\otimes I_n)K\Big)\,du \qquad&\big\{{\rm vectorized}\big\} \\ db &= P\,du \\ \\ R &= UB^{-1} \\ dR &= dU\,B^{-1} - UB^{-1}dB\,B^{-1} \\ &= dU\,B^{-1} - R\,dB\,B^{-1} \\ dr &= (B^{-1}\otimes I_m)\,du - (B^{-1}\otimes R)\,db \\ &= \Big((B^{-1}\otimes I_m) - (B^{-1}\otimes R)P\Big)\,du \\ &= Q\,du \\ }$$ where $K$ is the Commutation Matrix associated with the vectorization of a matrix transpose via the Kronecker product.
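The matrices $P$ and $Q$ above can be assembled explicitly and the relation $dr = Q\,du$ verified against a central finite difference of the map $U\mapsto U(U^TU)^{-1/2}$. The following NumPy sketch (helper names `vec`, `commutation`, `retract` are mine) uses column-major vectorization, which is what the Kronecker identities assume:

```python
import numpy as np

def vec(A):
    """Column-major vectorization, matching vec() in the Kronecker identities."""
    return A.reshape(-1, order="F")

def commutation(m, n):
    """Commutation matrix K with K @ vec(A) = vec(A.T) for any m x n matrix A."""
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[j + i * n, i + j * m] = 1.0
    return K

def retract(U):
    w, V = np.linalg.eigh(U.T @ U)
    return U @ (V @ np.diag(w ** -0.5) @ V.T)

rng = np.random.default_rng(3)
m, n = 5, 3
U = rng.standard_normal((m, n))

w, V = np.linalg.eigh(U.T @ U)
B = V @ np.diag(np.sqrt(w)) @ V.T          # B = (U^T U)^{1/2}
Binv = np.linalg.inv(B)
R = U @ Binv
In, Im = np.eye(n), np.eye(m)
K = commutation(m, n)

# P solves (I (x) B + B (x) I) db = (I (x) U^T + (U^T (x) I) K) du
P = np.linalg.solve(np.kron(In, B) + np.kron(B, In),
                    np.kron(In, U.T) + np.kron(U.T, In) @ K)
Q = np.kron(Binv, Im) - np.kron(Binv, R) @ P

# finite-difference check of dr = Q du
dU = rng.standard_normal((m, n))
h = 1e-6
dr_fd = vec(retract(U + h * dU) - retract(U - h * dU)) / (2 * h)
assert np.allclose(dr_fd, Q @ vec(dU), atol=1e-6)
```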
Finally, let $\,X=(R-M)\,$ and calculate the gradient of the objective function $$\eqalign{ d\mu &= S:dX \\ &= S:dR \\ &= s:dr &\qquad\big\{{\rm vectorized}\big\} \\ &= s:Q\,du \\ &= Q^Ts:du \\ \frac{\partial\mu}{\partial u} &= Q^Ts &\qquad\big\{{\rm gradient}\big\} \\ \\ }$$ Now use a sub-gradient method initialized with the Frobenius solution $$\eqalign{ U_0 &= M \\ }$$ and iterated as
$$\eqalign{ B_k &= (U_k^TU_k)^{1/2} \\ R_k &= U_kB_k^{-1} \qquad\qquad\qquad\big\{{\rm current/best\,solution}\big\} \\ P_k &= (I_n\otimes B_k+B_k\otimes I_n)^{-1}\Big(I_n\otimes U_k^T+(U_k^T\otimes I_n)K\Big) \\ Q_k &= \Big((B_k^{-1}\otimes I_m) - (B_k^{-1}\otimes R_k)P_k\Big) \\ g_k &= Q_k^T\,{\rm vec}\Big({\rm sign}(R_k-M)\Big) \\ w_k &= \operatorname{vec}(U_k) - \lambda_kg_k \qquad\;\big\{{\rm subgradient\,step}\big\} \\ u_{k+1} &= \frac{w_k}{\|w_k\|_2} \qquad\qquad\qquad\big\{{\rm normalize}\big\} \\ U_{k+1} &= {\rm reshape}\big(\,u_{k+1},\, {\rm shape}(U_k)\,\big) \\ }$$
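The full iteration above can be sketched in NumPy as follows. This is only an illustration of the scheme, not a tuned implementation: the step size $\lambda_k = \lambda_0/\sqrt{k+1}$ is a standard diminishing-step choice I have assumed (the text does not fix one), and the best iterate seen so far is tracked, as is usual for subgradient methods. Since $U_0=M$, the first iterate is exactly the Frobenius solution $\hat R$, so the returned objective can never be worse than that of $\hat R$:

```python
import numpy as np

def vec(A):
    return A.reshape(-1, order="F")

def commutation(m, n):
    """Commutation matrix K with K @ vec(A) = vec(A.T) for m x n matrices A."""
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            K[j + i * n, i + j * m] = 1.0
    return K

def sqrtm_spd(A):
    """Symmetric square root of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.sqrt(w)) @ V.T

def l1_nearest_semiorthogonal(M, iters=500, lam0=0.1):
    m, n = M.shape
    K = commutation(m, n)
    In, Im = np.eye(n), np.eye(m)
    U = M.copy()                              # U_0 = M (Frobenius initializer)
    best_R, best_f = None, np.inf
    for k in range(iters):
        B = sqrtm_spd(U.T @ U)                # B_k
        Binv = np.linalg.inv(B)
        R = U @ Binv                          # R_k (always feasible)
        f = np.abs(R - M).sum()
        if f < best_f:                        # track best iterate so far
            best_R, best_f = R, f
        P = np.linalg.solve(np.kron(In, B) + np.kron(B, In),
                            np.kron(In, U.T) + np.kron(U.T, In) @ K)
        Q = np.kron(Binv, Im) - np.kron(Binv, R) @ P
        g = Q.T @ vec(np.sign(R - M))         # subgradient g_k
        w = vec(U) - lam0 / np.sqrt(k + 1) * g    # subgradient step
        w /= np.linalg.norm(w)                    # normalize
        U = w.reshape((m, n), order="F")
    return best_R, best_f

rng = np.random.default_rng(4)
M = rng.standard_normal((5, 3))
R_star, f_star = l1_nearest_semiorthogonal(M)

# feasibility, and no worse than the Frobenius solution it started from
assert np.allclose(R_star.T @ R_star, np.eye(3), atol=1e-8)
w0, V0 = np.linalg.eigh(M.T @ M)
R0 = M @ (V0 @ np.diag(w0 ** -0.5) @ V0.T)
assert f_star <= np.abs(R0 - M).sum() + 1e-12
```

Forming $P$ and $Q$ explicitly costs $O(m^2n^2)$ storage, so for large problems one would apply the linear maps to vectors instead of materializing the Kronecker products.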