I have a data set (24 data records) in $\mathbb{R}^{13}$ and I need to project it down to a lower dimension (at least down to $\mathbb{R}^{3}$).
My objective in the dimensionality reduction is to create a starting configuration for optimizing Sammon's error.
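For reference, the quantity being optimized is Sammon's stress, where $d^{*}_{ij}$ are the pairwise distances in the original $\mathbb{R}^{13}$ space and $d_{ij}$ are the distances in the low-dimensional configuration:

$$E = \frac{1}{\sum_{i<j} d^{*}_{ij}} \sum_{i<j} \frac{\left(d^{*}_{ij} - d_{ij}\right)^{2}}{d^{*}_{ij}}$$

The weighting by $1/d^{*}_{ij}$ is what makes a good starting configuration matter: small original distances are emphasized, so a poor initialization can trap the optimizer in a local minimum.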
I initially used orthogonal projection (onto the basis vectors $(1,0,0)$, $(0,1,0)$, $(0,0,1)$), but as suggested in my earlier question it seems a bad idea to simply drop 10 coordinates.
I have tried Principal Component Analysis (PCA) and found that the first three principal components cover only 50% of the variance, and that eight components are needed to cover 90% of the variance. Given these variance percentages, I think a PCA-based reduction is not good enough.
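The variance check described above can be reproduced as follows. This is a minimal sketch: the 24×13 array `X` here is random placeholder data standing in for the actual data set, so the printed percentages will differ from the 50%/90% figures mentioned.

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder for the 24-record, 13-dimensional data set described above.
rng = np.random.default_rng(0)
X = rng.normal(size=(24, 13))

# Fit PCA with all components and accumulate the explained variance.
pca = PCA().fit(X)
cumvar = np.cumsum(pca.explained_variance_ratio_)

# Variance covered by the first three components, and the number of
# components needed to reach 90% of the total variance.
var_first_three = cumvar[2]
n_90 = int(np.searchsorted(cumvar, 0.90) + 1)
print(var_first_three, n_90)
```

If `var_first_three` is well below your tolerance (as in the 50% case), that is the signal that a linear projection is discarding too much structure.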
I would like to know of any other methods with which I could achieve a better dimensionality reduction, or any way of transforming the data set so that PCA would yield a better result.
Thank you
I would look at kernel PCA (http://en.wikipedia.org/wiki/Kernel_principal_component_analysis) with various kernels.
A Gaussian kernel, for example, might transform your data into a set better captured by a small number of PCA components.