I've been trying to learn more about probability, and the websites I have visited don't describe how these concepts relate to each other very well (when to use what, for what purpose, and in conjunction with what). I come from a programming background and need to see clear connections and definitions; I am lost in the world of complicated mathematical language at the moment. Could anyone help me understand the following terms in a simpler way?
**Bayes Decision Theory** This gives a formula (Bayes' rule) that can be used to calculate the posterior probability from the prior probability and the likelihood.
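As a programmer, this is how I currently picture Bayes' rule; a toy sketch with made-up numbers for a test of a rare condition (please correct me if the mapping is wrong):

```python
# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)
# Made-up numbers for a test of a rare condition.
p_disease = 0.01                # prior P(disease)
p_pos_given_disease = 0.95      # likelihood P(positive | disease)
p_pos_given_healthy = 0.05      # false-positive rate P(positive | healthy)

# Total probability of a positive test (the denominator P(B)).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior P(disease | positive).
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # roughly 0.16
```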
**Prior Probability** This is what we expect to happen before seeing new evidence, often estimated from sample data sets.
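My guess is that, in code, estimating a prior from a sample data set is just counting relative frequencies (made-up labels):

```python
# Estimate a class prior as the fraction of that class in the sample.
labels = ["spam", "ham", "ham", "spam", "ham", "ham", "ham", "spam"]
prior_spam = labels.count("spam") / len(labels)
print(prior_spam)  # 3/8 = 0.375
```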
**Conditional Probability** When and where is this needed (in calculations, that is)?
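Is this the right way to think of it from counts? A hypothetical click-log example, using $P(B \mid A) = P(A \text{ and } B) / P(A)$:

```python
# Conditional probability from counts:
# P(click | mobile) = count(mobile and click) / count(mobile)
events = [("mobile", True), ("mobile", False), ("desktop", True),
          ("mobile", True), ("desktop", False), ("mobile", False)]

mobile_events = [e for e in events if e[0] == "mobile"]
p_click_given_mobile = (sum(1 for _, clicked in mobile_events if clicked)
                        / len(mobile_events))
print(p_click_given_mobile)  # 2 clicks out of 4 mobile events = 0.5
```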
**Posterior Probability** After observing the events of "today", we can make a better guess about tomorrow.
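The way I imagine the "today informs tomorrow" part: yesterday's posterior becomes today's prior, updated again each day. A sketch with invented likelihoods:

```python
def update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """One Bayes-rule update of P(hypothesis) after seeing one observation."""
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1 - prior))

belief = 0.5  # start undecided
for _ in range(3):  # see the same kind of evidence three days in a row
    belief = update(belief, 0.8, 0.3)  # made-up likelihoods
print(belief)  # belief grows with each consistent observation
```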
**Discriminant Function** The function that separates two classes, for example a straight line on a $2D$ graph.
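For the straight-line case, is this the idea? The sign of $g(x) = w \cdot x + b$ says which side of the line a point is on (weights made up):

```python
# A linear discriminant in 2D: g(x, y) = w1*x + w2*y + b.
w = (1.0, -1.0)  # made-up weights: the boundary is the line y = x
b = 0.0

def classify(x, y):
    g = w[0] * x + w[1] * y + b
    return "class A" if g > 0 else "class B"

print(classify(2.0, 1.0))  # below the line y = x -> "class A"
print(classify(1.0, 2.0))  # above the line y = x -> "class B"
```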
**Gaussian Distribution** This describes a bell-shaped curve on a $2D$ graph, for example over a given dataset. How does one confirm whether a class of samples follows a Gaussian distribution?
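The only check I could think of myself is the rule of thumb that about 68% of Gaussian samples fall within one standard deviation of the mean; I assume proper tests exist (e.g. Shapiro-Wilk in `scipy.stats`), but here is my crude version:

```python
import random
import statistics

# Crude sanity check: for Gaussian data, ~68% of samples lie within
# one standard deviation of the mean.
random.seed(0)
samples = [random.gauss(10.0, 2.0) for _ in range(10_000)]

mu = statistics.mean(samples)
sd = statistics.stdev(samples)
frac = sum(1 for s in samples if abs(s - mu) <= sd) / len(samples)
print(round(frac, 2))  # should be close to 0.68 for Gaussian data
```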
I will update the question with relevant information as needed. Any help clarifying these points would be great!