I am looking to solve the following optimization problem:
$\underset{{\bf x} \in \mathcal{X}}{\text{min}} \; f({\bf x})$
where ${\bf x} \in \mathbb{C}^N$, $f: \mathbb{C}^N \to \mathbb{R}$ is a convex function, and $\mathcal{X}$ is a compact but nonconvex set.
I am solving this problem using projected gradient descent, where each iteration consists of the following step:
${\bf x}^{(t)} = proj_{\mathcal{X}}\left({\bf x}^{(t-1)} - \alpha \nabla f({\bf x}^{(t-1)})\right)$,
where $\alpha$ is the step size, $\nabla f({\bf x})$ is the gradient, and $\mathrm{proj}_{\mathcal{X}}(\cdot)$ denotes the projection onto the set $\mathcal{X}$.
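For concreteness, here is a minimal sketch of this iteration on an illustrative instance of my own choosing (not part of the question itself): $f({\bf x}) = \|A{\bf x} - {\bf b}\|^2$, which is convex, over the compact nonconvex set $\mathcal{X} = \{{\bf x} \in \mathbb{C}^N : |x_n| = 1 \;\forall n\}$ of unit-modulus vectors, whose projection is entrywise normalization:

```python
import numpy as np

# Illustrative instance (assumed, not from the question):
# f(x) = ||A x - b||^2 over X = {x in C^N : |x_n| = 1 for all n}.
rng = np.random.default_rng(0)
N, M = 8, 16
A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
b = rng.standard_normal(M) + 1j * rng.standard_normal(M)

def grad_f(x):
    # Wirtinger gradient of ||A x - b||^2 with respect to conj(x)
    return A.conj().T @ (A @ x - b)

def proj_X(x):
    # Projection onto unit-modulus entries: x_n / |x_n|.
    # (At x_n = 0 the projection is not unique; pick 1 arbitrarily.)
    mag = np.abs(x)
    return np.where(mag > 0, x / np.maximum(mag, 1e-12), 1.0 + 0j)

alpha = 1e-2  # step size
x = proj_X(rng.standard_normal(N) + 1j * rng.standard_normal(N))
for t in range(500):
    x = proj_X(x - alpha * grad_f(x))

# Every iterate lies in the compact set X, so Bolzano-Weierstrass
# guarantees at least one convergent subsequence of the iterates.
print(np.allclose(np.abs(x), 1.0))  # True
```

The point of the sketch is only that the iterates remain in the compact set $\mathcal{X}$ by construction, which is what makes the subsequence argument below available.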
Now suppose I prove that every convergent subsequence ${\bf x}^{(t_i)}$ of the iterate sequence ${\bf x}^{(t)}$, i.e., every subsequence with ${\bf x}^{(t_i)} \to \bar{\bf x}^{*}$ as $i \to \infty$, converges to a KKT point of the above optimization problem. Then, based on the fact that $\mathcal{X}$ is a compact set, and using the Bolzano–Weierstrass theorem (Proof of complex Bolzano–Weierstrass Theorem), can I claim that the whole iterate sequence converges to a KKT point of the original problem?