You can have more than one 1 or 5: the five dice can be all 1s, all 5s, just one 1 or just one 5, or anything in between.
This confuses me because naively adding the probability $2/6$ five times (once per die) gives a number greater than 1, which obviously can't be right: the five dice clearly have a very real chance of showing nothing but 2s, 3s, 4s, and 6s.
What is the appropriate process?
Let $A_i$ be the event "die $i$ shows neither a 1 nor a 5". $$\forall i \in \{1, \cdots ,5\}, \ P(A_i)=\frac 2 3$$ The events are independent, therefore $P(A_1\wedge A_2\wedge A_3 \wedge A_4 \wedge A_5)=\left(\frac 2 3\right) ^5$
In general, by the chain rule,
$P(A_1\wedge A_2\wedge A_3 \wedge A_4 \wedge A_5)=P(A_1\mid A_2\wedge A_3 \wedge A_4 \wedge A_5)P(A_2\mid A_3 \wedge A_4 \wedge A_5)P(A_3\mid A_4\wedge A_5)P(A_4\mid A_5)P(A_5)$
Let $B$ be the event "obtaining at least one 1 or 5" $$P(B)=1-P(A_1\wedge A_2\wedge A_3 \wedge A_4 \wedge A_5)=1-\left(\frac 2 3\right) ^5$$
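The complement calculation above is easy to sanity-check numerically. Here is a small sketch (the function name and trial count are my own choices) that computes the exact value $1-(2/3)^5 = 211/243$ with fractions and compares it against a Monte Carlo simulation of five dice:

```python
import random
from fractions import Fraction

# Exact answer: P(at least one 1 or 5) = 1 - (2/3)^5
exact = 1 - Fraction(2, 3) ** 5  # Fraction(211, 243) ≈ 0.8683

def prob_at_least_one_1_or_5(trials=100_000, seed=0):
    """Estimate P(at least one die shows 1 or 5) by simulating 5 dice."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        roll = [rng.randint(1, 6) for _ in range(5)]
        if any(d in (1, 5) for d in roll):
            hits += 1
    return hits / trials

print(exact, float(exact), prob_at_least_one_1_or_5())
```

The simulated frequency should land within about $\pm 0.01$ of $211/243 \approx 0.868$, while adding $2/6$ five times would give $5/3 > 1$, confirming why that approach fails.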