Why is the expected average of a random subset the same as the average of the initial set?


This feels like a really simple question, and it seems intuitively true, but I can't find a formal proof of it anywhere.

I have an initial set X with n values. From it I draw Y, a uniformly random subset of X of size l.

Why is the expected value of the average of Y equal to the average of X?
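As a sanity check of the claim (not a proof), here is a small simulation sketch: it repeatedly draws a uniform random size-l subset of a fixed set X and averages the sample means. The set X, the subset size, and the trial count are arbitrary choices for illustration; by symmetry each element of X is equally likely to appear in the sample, so the running estimate should hover near the population average.

```python
import random

def mean_of_random_subsets(X, l, trials=200_000, seed=0):
    """Estimate E[average of a uniform random size-l subset of X] by simulation."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = rng.sample(X, l)  # size-l subset drawn without replacement
        total += sum(sample) / l   # accumulate this subset's average
    return total / trials          # average of the subset averages

X = [3, 1, 4, 1, 5, 9, 2, 6]     # arbitrary example set (n = 8)
print(sum(X) / len(X))           # population average: 31 / 8 = 3.875
print(mean_of_random_subsets(X, 3))  # should be close to 3.875
```

The formal argument is linearity of expectation: each element of X lands in Y with the same probability, so each of the l slots of Y has expected value equal to the average of X, and dividing the sum by l leaves that value unchanged.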