I have two unknown matrices $\mathbf{X}$ and $\mathbf{Y}$, and a given matrix $\mathbf{A}$. I performed gradient descent for
$$\text{minimize} \quad \| \mathbf{X} \mathbf{Y} - \mathbf{A}\|_2^2$$
but it doesn't converge to any minimum. Is this because the objective function is non-convex when $\mathbf{X}$ and $\mathbf{Y}$ are both unknown, and convex otherwise? Is that right to say?
As Robert explained in the comments, your function is not convex. (It still surprises me that gradient descent doesn't work, i.e. that your function does not attain its infimum along the iterates, but off the top of my head I cannot rule out that this can happen.)
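To make the non-convexity concrete, here is a minimal numerical sketch in the scalar case $f(x, y) = (xy - a)^2$ (my own illustrative example, not from the question): the two global minimizers $(1, 1)$ and $(-1, -1)$ have midpoint $(0, 0)$, where $f$ is strictly larger than the average of the endpoint values, violating convexity.

```python
# Scalar instance of ||XY - A||^2: f(x, y) = (x*y - a)^2 with a = 1.
a = 1.0
f = lambda x, y: (x * y - a) ** 2

p = (1.0, 1.0)    # global minimizer, f(p) = 0
q = (-1.0, -1.0)  # another global minimizer, f(q) = 0
mid = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)  # midpoint (0, 0)

# Convexity would require f(mid) <= (f(p) + f(q)) / 2 = 0,
# but f(0, 0) = 1, so f is not convex jointly in (x, y).
print(f(*p), f(*q), f(*mid))  # 0.0 0.0 1.0
```

Note that $f$ *is* convex in $x$ for fixed $y$ and vice versa; it is only the joint minimization over both variables that is non-convex.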
Presumably your $X$ and $Y$ are rectangular, so that $XY$ is constrained to low rank (otherwise the minimization is trivial: take $X = A$ and $Y = I$). If you are OK with using the Frobenius norm, you can solve the problem directly, without gradient descent, via the truncated SVD (the Eckart–Young theorem): see for instance the explanation on Wikipedia.
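A short sketch of the truncated-SVD solution (the dimensions and the split of the singular values between the factors are my own choices; any split $X = U_k S_k^{1/2}$, $Y = S_k^{1/2} V_k^T$ works equally well):

```python
import numpy as np

# Best rank-k approximation of A in Frobenius norm via truncated SVD
# (Eckart-Young theorem). One valid factorization: X = U_k S_k, Y = V_k^T.
rng = np.random.default_rng(0)
m, n, k = 8, 6, 2
A = rng.standard_normal((m, n))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
X = U[:, :k] * s[:k]  # m x k, columns scaled by the top-k singular values
Y = Vt[:k, :]         # k x n

# The optimal objective value is the sum of the squared discarded
# singular values, which the factorization above attains exactly.
err = np.linalg.norm(X @ Y - A, "fro") ** 2
opt = np.sum(s[k:] ** 2)
print(np.isclose(err, opt))  # True
```

This gives the global minimum in one shot, so there is no convergence issue to worry about at all.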