In the case of finite sets, e.g. $A=\{1,2,3,4\}$, if someone asked me, "What percentage of the numbers are even in $A$?", I would respond with $50 \%$. If asked for a proof, I'd demonstrate that two of the four elements in the set are even.
Now, consider the set $B=\mathbb N$, which is countably infinite. If someone asked me, "What percentage of the numbers are even in $B$?", intuitively, I would think the answer is $50 \%$. However, I'm not sure how I would go about proving this.
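Numerically, at least, the fraction of even numbers among the first $N$ naturals does seem to approach $1/2$ (this is just a sanity check of the intuition, not a proof):

```python
# Fraction of even numbers among {1, ..., N} for growing N.
# Using odd values of N so the fraction is not trivially exact.
for N in [9, 99, 999, 999_999]:
    evens = N // 2  # count of even numbers in {1, ..., N}
    print(N, evens / N)
```

The printed fractions creep toward $0.5$ as $N$ grows, which is the kind of limiting statement I suspect a proof would need to make precise.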
Finally, consider the set $C = \mathbb R \setminus \{0\}$, which is uncountably infinite. If someone asked me, "What percentage of numbers are greater than $0$ in $C$?", I would think the answer is $50 \%$. Once again, I'm unsure of how I could demonstrate this must be true.
So as the title of this post states, does the notion of "percentages" extend to countably infinite and uncountably infinite sets?
It does, although the word "percentage" is not usually used in those cases; instead the word is "probability" (if you are doing statistics) or "measure" (if you are doing mathematics).
For a countable set like $\mathbb N$ you need to assign a probability $p_n \in [0,1]$ to each $n \in \mathbb N$, subject to the rule that $\sum_{n \in \mathbb N} p_n = 1$ (which is therefore a convergent infinite series); that rule is the analogue in probability theory of the rule that $100\%$ means "everything". Examples of this include the Poisson distribution, which is used to model the probability that $n$ people will walk through that door in the next minute.
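As a concrete sketch of that rule, here is the Poisson distribution $p_n = e^{-\lambda}\lambda^n/n!$ for an arbitrary example rate $\lambda = 3$; summing the $p_n$ over a long initial segment of $\mathbb N$ gives essentially $1$, since the tail of the convergent series is negligible:

```python
import math

# Poisson distribution: p_n = e^(-lam) * lam^n / n!
# lam = 3 is just an example rate, chosen arbitrarily.
lam = 3.0
p = [math.exp(-lam) * lam**n / math.factorial(n) for n in range(100)]

total = sum(p)  # partial sum of the convergent series; tail beyond n=99 is tiny
print(total)
```

Every $p_n$ lies in $[0,1]$ and the series sums to $1$, which is exactly the "$100\%$ means everything" rule from the answer.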
For an uncountable set like $\mathbb R$ you can generally use integral calculus to define probabilities: instead of convergent infinite series one uses convergent improper integrals. Examples include Gaussian distributions, also known as "normal distributions", which are used to model and estimate the proportion of heads to expect when you flip a zillion coins.
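To connect this back to the question about $C$: under the standard normal distribution (an example choice on my part; the helper functions `density` and `phi` below are my own, not a standard API), the improper integral of the density over all of $\mathbb R$ is $1$, and by symmetry exactly half the mass sits above $0$, i.e. $P(X > 0) = 1/2$:

```python
import math

def density(x):
    """Standard normal density: e^(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def phi(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# The improper integral of the density over R, truncated at +-10
# (the tails beyond are negligible), comes out to ~1.
h = 0.001
area = sum(density(i * h) for i in range(-10_000, 10_001)) * h
print(area)          # approximately 1

# "Percentage" of the real line above 0, under this distribution:
print(1.0 - phi(0.0))  # 0.5
```

Note the caveat this illustrates: the $50\%$ answer for $C$ depends on the measure you choose. Under a distribution that is not symmetric about $0$, "the percentage of numbers greater than $0$" would come out different.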