I have never understood the application of permutations and combinations (P&C), even in software programming. I know the principle behind P&C is counting; basically, it makes counting quicker. However, I could never apply it while programming, in real life, etc. The only application I could find was very trivial: how many 2-digit binary numbers are possible? They are 00, 01, 10, 11. Since the digits can repeat, the answer is 2*2 = 4. Apart from this, I couldn't find any applications. Can someone please help me with this? That way, I can appreciate this topic in mathematics.
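The binary example above does generalize: counting strings of length n over an alphabet of size k gives k**n arrangements (the product rule behind P&C). A minimal Python sketch, using `itertools.product` to enumerate what the formula counts:

```python
from itertools import product

# All 2-digit binary strings: 2 choices per position, 2 positions -> 2**2 = 4
strings = ["".join(bits) for bits in product("01", repeat=2)]
print(strings)       # ['00', '01', '10', '11']
print(len(strings))  # 4, which equals 2**2
```

The same one-liner with `repeat=32` would (in principle) enumerate all 32-bit values, which is why you usually want the count 2**32 from the formula rather than the enumeration.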
As per the comments, this question is quite vague, so I am rewording it: what needs does P&C serve in the programming world - embedded, graphics, business, etc.?
At least in my experience, the usual reason for wanting to count something is to compute a probability. If, for example, you are playing a card game (I'm thinking of the standard 52-card deck here) in which the cards are shuffled, and you want to know the probability of some event, then you need to know the total number of possible orderings of the cards, which is a (fairly straightforward!) permutation problem.
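As a sketch of that permutation count (assuming the standard 52-card deck): there are 52! possible orderings, so the probability of any one specific shuffle is 1/52!:

```python
from math import factorial

orderings = factorial(52)          # number of distinct shuffles of 52 distinct cards
prob_one_shuffle = 1 / orderings   # probability of any single specific ordering
print(len(str(orderings)))         # 52! has 68 digits - far too many shuffles to enumerate
```

This is the typical pattern: the event's probability is (count of favorable orderings) / 52!, and both counts come from P&C rather than brute-force enumeration.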
In that example, all the cards are distinct, but maybe you don't care about suits (or maybe you're playing an entirely different card game in which there really are multiple copies of some cards), so there are effectively four copies of each card. Now counting the distinguishable shuffles is a permutations-with-repetition problem: you divide out the rearrangements of identical cards, which is where combinations and multinomial coefficients come in.
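A sketch of that case (assuming only the 13 ranks matter, so the 4 suits of each rank are interchangeable): the count of distinguishable shuffles drops from 52! to 52!/(4!)^13, since each of the 13 ranks contributes 4! indistinguishable rearrangements:

```python
from math import factorial

total = factorial(52)                      # all orderings of distinct cards
distinguishable = total // factorial(4) ** 13  # divide out 4! rearrangements per rank, for 13 ranks
print(distinguishable < total)             # True: many orderings collapse into one
```

The integer division is exact here, which is itself a small sanity check that the multinomial reasoning is right.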
I'm sure there are many more interesting examples of counting problems that reduce to counting permutations or combinations of some set of objects - maybe people who spend more time computing probabilities than I do have some examples.