I am performing Naive Bayes classification on the spam/ham dataset. I understand how Naive Bayes works, and I have it implemented in a few lines of Matlab code. I was told that cross-validation can be used to learn the classifier parameters.
What I don't understand is this: I assume that $P(x_i \mid C_k)$ follows a Gaussian distribution. In that case, how exactly do I perform cross-validation? I know how to do k-fold cross-validation, but what is it that we actually learn during the folds? How would this differ from simply estimating the prior $P(C_k)$ and the Gaussian parameters $\mu_k$, $\sigma_k$ from the entire training data, and then applying the fitted Gaussian model to the test data?
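For concreteness, here is a minimal sketch of the direct estimation I mean (in Python/NumPy rather than Matlab, and with function names of my own choosing): fit $P(C_k)$, $\mu_k$, $\sigma_k$ per class and per feature on the training data, then classify by the largest log-posterior.

```python
import numpy as np

def fit_gaussian_nb(X, y):
    """Estimate Gaussian Naive Bayes parameters from training data.
    X: (n_samples, n_features) array, y: (n_samples,) class labels."""
    classes = np.unique(y)
    priors, means, stds = {}, {}, {}
    for c in classes:
        Xc = X[y == c]
        priors[c] = len(Xc) / len(X)      # P(C_k)
        means[c] = Xc.mean(axis=0)        # mu_k, one per feature
        stds[c] = Xc.std(axis=0) + 1e-9   # sigma_k, small floor avoids /0
    return classes, priors, means, stds

def predict_gaussian_nb(X, classes, priors, means, stds):
    """Pick the class maximising log P(C_k) + sum_i log N(x_i; mu_k, sigma_k)."""
    scores = []
    for c in classes:
        log_lik = -0.5 * np.sum(
            np.log(2 * np.pi * stds[c] ** 2)
            + ((X - means[c]) ** 2) / (stds[c] ** 2),
            axis=1)
        scores.append(np.log(priors[c]) + log_lik)
    return classes[np.argmax(np.stack(scores), axis=0)]
```

With this setup there are no obvious hyperparameters left to tune, which is exactly why I am unsure what the folds would be selecting.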
Thanks!