I have asked about this theorem before, but I recently found that I still don't fully understand its proof. Here are the rules of the game. A closed interval $I_0 \subset \mathbb{R}$ is given, together with a subset $A \subset I_0$. Player (A) is "dealt" the set $A$, and player (B) is "dealt" the set $B = I_0 \setminus A$.
Step 1: the first player, (A), chooses a closed subinterval $I_1 \subset I_0$.
Step 2: the second player, (B), chooses a closed subinterval $I_2 \subset I_1 \subset I_0$, and so on.
Step $2n-1$: the first player (A) chooses a closed subinterval $I_{2n-1} \subset \dots \subset I_2 \subset I_1 \subset I_0$.
Step $2n$: the second player (B) chooses a closed subinterval $I_{2n} \subset I_{2n-1} \subset \dots \subset I_2 \subset I_1 \subset I_0$.
The process continues for all $n \in \mathbb{N}$, so we get a decreasing nested sequence of closed intervals. If the intersection of all the intervals contains a point of $A$, then (A) wins; otherwise (B) wins.
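(For later reference, the standard nested-interval fact the proof leans on: a decreasing sequence of closed bounded intervals has nonempty intersection, and if the lengths tend to zero the intersection is a single point. In symbols:)

$$I_0 \supset I_1 \supset I_2 \supset \cdots \;\Longrightarrow\; \bigcap_{n=0}^{\infty} I_n \neq \emptyset, \qquad \text{and if } |I_n| \to 0 \text{ then } \bigcap_{n=0}^{\infty} I_n = \{x\} \text{ for some } x \in I_0.$$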
The theorem states that:
(A) has a winning strategy if and only if $I_1 \cap B$ is of first category (for some choice of the first interval $I_1$).
The proof is given below. My problem is this: I don't understand the part where it is said that, if (A) has a winning strategy, he can always modify it so as to ensure that the intersection of the intervals $I_n$ consists of just one point of $A$. What does it mean that "he always chooses $I_{2n+1}$ as if $I_{2n}$ had been a subinterval half as long", and how does that help?
Thank you!
Shir

I hope I'm not bothering you by posting again. The idea is that if the lengths of the intervals shrink by a factor of $1/2$ each time, their intersection is a single point. But B might be picking fairly big intervals each time. How can A modify his strategy? He pretends that B picked a smaller interval than B really did, and applies his original strategy to that smaller interval.
A can pretend that B picked a smaller interval, because all A has to do is produce another interval inside B's actual choice. It's like pretending a bullseye is smaller than it really is: if you hit the small bullseye, you hit the bigger one too.
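A toy simulation may make this concrete (all the strategy names here are made up for illustration, not taken from the proof). Suppose A's original strategy is "take the middle third," and B is as uncooperative as possible and shrinks nothing. If A first replaces B's interval by the concentric subinterval half as long, and only then applies his original strategy, the lengths are forced down by a fixed factor every round, so the intersection is pinned to a single point:

```python
def half(interval):
    # the concentric closed subinterval half as long:
    # this is the interval A "pretends" B actually chose
    a, b = interval
    q = (b - a) / 4
    return (a + q, b - q)

def a_base_strategy(interval):
    # hypothetical original strategy for A: take the middle third
    a, b = interval
    t = (b - a) / 3
    return (a + t, b - t)

def b_strategy(interval):
    # worst case for shrinking: B hands the interval back unchanged
    return interval

I = (0.0, 1.0)
for n in range(30):
    I = a_base_strategy(half(I))  # A's modified move: pretend, then play
    I = b_strategy(I)             # B's move
print(I[1] - I[0])  # length shrinks like (1/6)^n, forcing a one-point intersection
```

The point of the trick: A's modified move is still a legal move (it lands inside B's real interval), it still follows the original winning strategy (just applied to the pretended smaller interval), and it guarantees $|I_n| \to 0$ no matter what B does.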