I understand that a Low-Rank Matrix Completion (LRMC) problem of the form:
$$ \min_{\mathbf{X}} \mathrm{rank}(\mathbf{X}) \\ \text{s.t.:} \quad \mathbf{X}_{i,j} = \mathbf{M}_{i,j} \quad \forall (i,j) \in \Omega, $$
where $\Omega$ is the set of indices of observed entries, can be posed as a Semidefinite Program (SDP): $$ \min_{\mathbf{Y}} \mathrm{tr}(\mathbf{Y}) \\ \text{s.t.:} \quad \langle \mathbf{Y}, \mathbf{A}_k \rangle = b_k, \quad \forall k \in [1, |\Omega|], \\ \mathbf{Y} \succeq 0 $$
where $\mathbf{Y} = \begin{bmatrix} \mathbf{S}_1 & \mathbf{X} \\ \mathbf{X}^{\top} & \mathbf{S}_2 \end{bmatrix} \in \mathbb{R}^{(n_1 + n_2) \times (n_1 + n_2)}$ and $\mathbf{A}_k$ and $b_k$ are suitable sampling matrices and observed entries respectively.
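For concreteness, one common choice (an assumption on my part, but I believe it matches the standard lifting) takes $\mathbf{A}_k$ to be the symmetrized indicator of the $(i,j)$ entry of the off-diagonal block, so that $\langle \mathbf{Y}, \mathbf{A}_k \rangle = \mathbf{X}_{i,j}$. A small numerical sketch with hypothetical sizes:

```python
import numpy as np

# Hypothetical small example: X is 2x3, one observed entry M[0,2] = 5.
n1, n2 = 2, 3
i, j, Mij = 0, 2, 5.0
n = n1 + n2

# Sampling matrix A_k: 1/2 at the two symmetric positions of the
# off-diagonal block of Y = [[S1, X], [X^T, S2]], so <Y, A_k> = X[i, j].
A = np.zeros((n, n))
A[i, n1 + j] = 0.5
A[n1 + j, i] = 0.5

# Build a Y with a known X block and check the constraint value.
X = np.arange(6, dtype=float).reshape(n1, n2)
X[i, j] = Mij
Y = np.block([[np.eye(n1), X], [X.T, np.eye(n2)]])

# <Y, A_k> = trace(Y^T A_k) = trace(Y A_k) since both are symmetric.
inner = np.trace(Y @ A)
print(inner)  # equals X[i, j] = 5.0
```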
However, if the problem formulation guarantees that the matrix we are searching for ($\mathbf{X}$) necessarily:

- admits a solution such that $\mathrm{rank}(\mathbf{X}) = 1$,
- is a square, symmetric matrix,
- can be expressed as $\mathbf{X} = \mathbf{v}\mathbf{v}^{\top}$ for some $\mathbf{v} \in \mathbb{R}_{+}^n$, and is consequently
- itself positive semidefinite,
can we not simplify the SDP for LRMC as: $$ \min_{\mathbf{X}} \mathrm{tr}(\mathbf{X}) \\ \text{s.t.:} \quad \mathbf{X}_{i,j} = \mathbf{M}_{i,j} \quad \forall (i,j) \in \Omega, \\ \mathbf{X} \succeq 0 $$ given sufficient entries in $\Omega$ and assuming noise-free data?
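As a sanity check on the rank-1 assumption itself (not the SDP): if $\mathbf{X} = \mathbf{v}\mathbf{v}^{\top}$ with $\mathbf{v}$ entrywise positive, then any missing entry is already determined by observed entries sharing a row and column, since $\mathbf{X}_{i,j} = \mathbf{X}_{i,k}\mathbf{X}_{k,j} / \mathbf{X}_{k,k}$. A toy sketch with a hypothetical $\mathbf{v}$ and an $\Omega$ containing the first row:

```python
import numpy as np

# Hypothetical rank-1 PSD ground truth X = v v^T with v > 0.
v = np.array([1.0, 2.0, 3.0])
X_true = np.outer(v, v)

# Suppose Omega contains the full first row (which includes X[0, 0]).
# For a rank-1 matrix this determines every entry:
#   X[i, j] = X[i, 0] * X[0, j] / X[0, 0].
row0 = X_true[0, :]
X_hat = np.outer(row0, row0) / X_true[0, 0]

# The completion matches the truth, is symmetric, and is PSD (rank 1).
print(np.allclose(X_hat, X_true))                   # True
print(np.all(np.linalg.eigvalsh(X_hat) >= -1e-9))   # True
```

This only illustrates that "sufficient entries" can pin $\mathbf{X}$ down uniquely under the rank-1 structure; whether the trace-minimization SDP above actually recovers that $\mathbf{X}$ is the question being asked.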
Please help. Thanks in advance.