I want to maximize the smallest nonzero singular value of a (non-square) matrix $X$. Assuming $X$ has full column rank, this is equivalent to maximizing $\lambda_{\min}(X^\top X)$, which can be reformulated as follows:
$$\begin{array}{ll} \underset{t, X}{\text{maximize}} & t\\ \text{subject to} & t \, \mathbb{I} - X^\top X \preceq 0\end{array}$$
One can reformulate the last constraint via the Schur complement only if $\mathbb{I}$ is negative semidefinite, which is absurd. So the claim is that the last constraint is non-convex. Is there any other tool for reformulating it as a convex constraint?
I'm not totally sure this answers your question, but let $\boldsymbol{\lambda}$ be a vector of decision variables that you can think of as the eigenvalues of $\mathbf{X}^T\mathbf{X}$. I'm assuming the Frobenius norm constraint you mentioned in your comment is of the form $||X||_{F}^2 \le c$; since $||X||_{F}^2 = \operatorname{tr}(\mathbf{X}^T\mathbf{X})$ is the sum of those eigenvalues, it becomes a linear constraint on $\boldsymbol{\lambda}$. Let $\mathbf{1}$ be the vector of all ones, and assume $\mathbf{X} \in \mathbb{R}^{M\times N}$ with $M \ge N$. I think you should start with the program:
$$\max_{\lambda,t} t$$ $$\text{st. } \mathbf{1}t \preccurlyeq \boldsymbol{\lambda}$$ $$\boldsymbol{\lambda} \succcurlyeq 0$$ $$\mathbf{1}^T\boldsymbol{\lambda} \le c$$
If, by chance, you wanted a constraint of the form $||X||_{F}^2 = c$ you should start with:
$$\max_{\lambda,t} t$$ $$\text{st. } \mathbf{1}t \preccurlyeq \boldsymbol{\lambda}$$ $$\boldsymbol{\lambda} \succcurlyeq 0$$ $$\mathbf{1}^T\boldsymbol{\lambda} = c$$
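As a sanity check, here is a short numerical sketch of the first (inequality-constrained) program using `scipy.optimize.linprog`. The sizes $N = 3$ and budget $c = 6$ are made up for illustration; the variable vector stacks $t$ in front of $\boldsymbol{\lambda}$, and since `linprog` minimizes, the objective is $-t$.

```python
import numpy as np
from scipy.optimize import linprog

# Solve: max t  s.t.  t <= lambda_i for all i,  lambda >= 0,  sum(lambda) <= c.
N, c = 3, 6.0                                         # illustrative sizes
obj = np.concatenate(([-1.0], np.zeros(N)))           # minimize -t

# Row block 1: t - lambda_i <= 0 for each i.
A_rows = np.hstack([np.ones((N, 1)), -np.eye(N)])
# Row block 2: sum(lambda) <= c.
A_sum = np.concatenate(([0.0], np.ones(N)))[None, :]

res = linprog(obj,
              A_ub=np.vstack([A_rows, A_sum]),
              b_ub=np.concatenate([np.zeros(N), [c]]),
              bounds=[(None, None)] + [(0, None)] * N)  # t free, lambda >= 0

t_opt, lam_opt = res.x[0], res.x[1:]
print(t_opt, lam_opt)   # budget spreads evenly: t = c/N = 2, lambda = [2, 2, 2]
```

The optimum confirms the intuition: maximizing the smallest eigenvalue under a trace budget forces all eigenvalues to be equal, $\lambda_i = c/N$.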
Both of these are clearly linear programs, and you can find an $\mathbf{X}$ that achieves the solution easily enough. First put $\boldsymbol{\lambda}$ on the diagonal of a matrix, say $\mathbf{S} = \operatorname{diag}(\boldsymbol{\lambda})$, and pad it with $M - N$ rows of zeros so it has the same dimensions as $\mathbf{X}$, i.e.
$$ \begin{bmatrix}\lambda_1 &0 &0 \\ 0& \ddots &0 \\ 0& 0 & \lambda_N \\ 0 & 0 & 0\\ \vdots & \vdots & \vdots \end{bmatrix}.$$
Pick your favorite orthogonal matrix $\mathbf{V}\in\mathbb{R}^{M\times M}$ and your second favorite orthogonal matrix $\mathbf{U}\in\mathbb{R}^{N\times N}$. Let $\mathbf{R}$ be the padded matrix above with each $\lambda_i$ replaced by $\sqrt{\lambda_i}$ (for a diagonal block, the elementwise square root is the matrix square root, so no square root of a non-square matrix is needed). Letting $\mathbf{X} = \mathbf{V}\mathbf{R}\mathbf{U}^T$, we then clearly have $$\mathbf{X}^T\mathbf{X} = \mathbf{U}\mathbf{R}^T\mathbf{R}\,\mathbf{U}^T = \mathbf{U}\operatorname{diag}(\boldsymbol{\lambda})\,\mathbf{U}^T.$$ This matrix has exactly the eigenvalues you want.
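The construction above can be sketched numerically. The sizes $M = 5$, $N = 3$ and the eigenvalue vector are illustrative; the orthogonal factors come from QR decompositions of Gaussian matrices (a standard way to sample orthogonal matrices, standing in for "your favorite" $\mathbf{V}$ and $\mathbf{U}$).

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 5, 3
lam = np.array([2.0, 2.0, 2.0])        # e.g. the LP solution for c = 6

# Build the M x N matrix R: sqrt(lambda_i) on the diagonal, zero rows below.
R = np.zeros((M, N))
R[:N, :N] = np.diag(np.sqrt(lam))

# Orthogonal V (M x M) and U (N x N) via QR of random Gaussian matrices.
V, _ = np.linalg.qr(rng.standard_normal((M, M)))
U, _ = np.linalg.qr(rng.standard_normal((N, N)))

X = V @ R @ U.T                        # M x N
# X^T X = U diag(lam) U^T, so its eigenvalues are exactly lam.
eigs = np.sort(np.linalg.eigvalsh(X.T @ X))
print(eigs)
```

Any choice of orthogonal $\mathbf{V}$, $\mathbf{U}$ works, since they cancel in $\mathbf{R}^T\mathbf{V}^T\mathbf{V}\mathbf{R}$ and conjugation by $\mathbf{U}$ preserves eigenvalues.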