What is the simplest algorithm to implement for imposing semidefinite constraints?
$\min_{X\succeq 0} f(X) $, where $X$ is an $n \times n$ symmetric matrix, and $f$ is a general smooth convex nonlinear function.
Suppose the size of $X$ is small, e.g. $n = 20$.
I can name some methods myself (interior-point methods, etc.), but I have no experience implementing them, so I do not know which is the simplest to start with.
I'd say a projected gradient method is likely going to work well for a simple problem like that. That is, alternate between gradient steps $$X_+ = X - \alpha \nabla f(X)$$ and projection steps: $$X_{++} = \mathop{\text{arg}\,\text{min}}_{X\succeq 0} \|X-X_+\|_F$$ The projection is relatively simple: given a Schur decomposition $X_+=U\Sigma U^T$, then $X_{++}=U\Sigma_+ U^T$, where $\Sigma_+$ is formed by replacing any negative (diagonal) elements of $\Sigma$ with zero: $(\Sigma_+)_{ii}=\max\{\Sigma_{ii},0\}$.
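A minimal NumPy sketch of this scheme might look as follows. The projection is exactly the eigenvalue clipping described above; the test objective $f(X) = \tfrac12\|X-A\|_F^2$ (with gradient $X - A$) and all function names are my own illustration, not something prescribed by the problem.

```python
import numpy as np

def project_psd(X):
    # Projection onto the PSD cone: eigendecompose the symmetric
    # matrix and replace negative eigenvalues with zero.
    w, U = np.linalg.eigh(X)
    return (U * np.maximum(w, 0.0)) @ U.T

def projected_gradient(grad_f, X0, alpha=0.1, iters=500):
    # Alternate a gradient step with projection onto the PSD cone.
    X = X0.copy()
    for _ in range(iters):
        X = project_psd(X - alpha * grad_f(X))
    return X

# Hypothetical test problem: f(X) = 0.5*||X - A||_F^2, gradient X - A.
# Its unique PSD minimizer is exactly project_psd(A).
n = 20
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
A = 0.5 * (A + A.T)                      # symmetrize
X = projected_gradient(lambda X: X - A, np.zeros((n, n)))
```

For a general $f$ you would supply its gradient in place of the closure above, and pick $\alpha$ on the order of $1/L$, where $L$ is a Lipschitz constant of $\nabla f$.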
If $f(X)$ is not strongly convex, you may want to consider an accelerated first-order method; search for "Nesterov's optimal gradient" or "Nesterov's accelerated gradient" in your favorite search engine for a start. My Matlab toolbox TFOCS might be helpful for you here, but it is by no means necessary.
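An accelerated variant only changes a few lines: take the gradient step at an extrapolated point and update a momentum coefficient. The sketch below is a FISTA-style version under my own naming; the test objective $f(X) = \tfrac12\|X-A\|_F^2$ is again an illustrative choice (its gradient has Lipschitz constant $L = 1$, so $\alpha = 1/L = 1$).

```python
import numpy as np

def project_psd(X):
    # Projection onto the PSD cone via eigenvalue clipping.
    w, U = np.linalg.eigh(X)
    return (U * np.maximum(w, 0.0)) @ U.T

def fista_psd(grad_f, X0, alpha, iters=200):
    # Accelerated (FISTA-style) projected gradient: gradient step at
    # an extrapolated point Y, then projection onto the PSD cone.
    X = X0.copy()
    Y = X0.copy()
    t = 1.0
    for _ in range(iters):
        X_new = project_psd(Y - alpha * grad_f(Y))
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)
        X, t = X_new, t_new
    return X

# Hypothetical test problem: f(X) = 0.5*||X - A||_F^2, whose PSD
# minimizer is project_psd(A); here alpha = 1/L = 1.
n = 20
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
A = 0.5 * (A + A.T)                      # symmetrize
X = fista_psd(lambda X: X - A, np.zeros((n, n)), alpha=1.0)
```

TFOCS implements this and several related first-order schemes, so you would not need to hand-roll the momentum bookkeeping in Matlab.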
An interior-point method would also work, and is a good choice if you need high accuracy, but assembling the Newton system can be a bit involved. At your problem size it will certainly be tractable on a PC.
Do you have a specific $f$ in mind?