Consider the following experiment.
We throw a fair three-sided die with sides $1$, $2$ and $3$ infinitely many times. Let $T_i$ denote the outcome of the $i$'th throw. Define $N:=\min\{i:T_i\neq1\}$. Let $X$ be the event that $T_N=2$ and let $Y$ be the event that $T_N=3$.
Some calculation (*) leads to the result that $\mathbb{E}(N)=\mathbb{E}(N|X)=\mathbb{E}(N|Y)=3/2$.
Let $Z$ be the event that $T_i\neq3$ for all $i$. Some calculation (**) leads to the result that $\mathbb{E}(N|Z)=2$.
I find it very unintuitive that $\mathbb{E}(N|X)\neq\mathbb{E}(N|Z)$. Obviously we have $Z\subsetneq X$. However, the extra information that $Z$ gives beyond $X$ intuitively only affects what comes after the $N$'th throw. So how can the probability distribution of $N$ differ depending on whether we condition on $X$ or on $Z$?
(*) We have $\mathbb{P}(X)=\mathbb{P}(Y)$ and $\mathbb{E}(N|X)=\mathbb{E}(N|Y)$ by the symmetry between the sides $2$ and $3$. Also notice that $X$ and $Y$ are disjoint and, since $N$ is almost surely finite, their union has probability $1$, so $\mathbb{P}(X)=\mathbb{P}(Y)=\frac12$. Since $\mathbb{P}(T_i\neq1)=2/3$, the variable $N$ is geometric with success probability $2/3$, so $\mathbb{E}(N)=\frac{1}{2/3}=3/2$. By the law of total expectation, $\mathbb{E}(N)=\mathbb{P}(X)\mathbb{E}(N|X)+\mathbb{P}(Y)\mathbb{E}(N|Y)$, so we find $\mathbb{E}(N)=\mathbb{E}(N|X)=\mathbb{E}(N|Y)=3/2$.
(**) Conditioned on $Z$, each throw is $1$ or $2$ with probability $\frac12$ each, so $\mathbb{P}(T_i\neq1|Z)=1/2$ and $N$ is geometric with success probability $1/2$; hence $\mathbb{E}(N|Z)=\frac{1}{1/2}=2$.
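In case anyone wants to sanity-check these numbers, here is a small Monte Carlo simulation (Python; the variable names and the horizon `M` are my own choices — since $Z$ has probability zero over infinitely many throws, it is approximated here by "no $3$ among the first `M` throws"):

```python
import random

random.seed(0)

TRIALS = 200_000
M = 10   # Z (no 3 ever) is approximated by "no 3 among the first M throws"

n_all = []       # samples of N
n_given_x = []   # samples of N on the event X = {T_N = 2}
n_given_z = []   # samples of N on the event Z_M = {T_i != 3 for i <= M}

for _ in range(TRIALS):
    throws = []
    while True:
        t = random.randint(1, 3)
        throws.append(t)
        if t != 1:
            break
    n = len(throws)                  # this is N for this run
    n_all.append(n)
    if throws[-1] == 2:              # T_N = 2, i.e. the event X occurred
        n_given_x.append(n)
    while len(throws) < M:           # extend the run so Z_M can be checked
        throws.append(random.randint(1, 3))
    if all(t != 3 for t in throws[:M]):
        n_given_z.append(n)

print(sum(n_all) / len(n_all))              # close to E(N) = 3/2
print(sum(n_given_x) / len(n_given_x))      # close to E(N|X) = 3/2
print(sum(n_given_z) / len(n_given_z))      # close to E(N|Z) = 2
```

The three printed averages land near $3/2$, $3/2$ and $2$ respectively, matching (*) and (**).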
By the way, if you have a suggestion for a better, more specific title, be my guest. I could not come up with a good descriptive title for this very specific question.
In all honesty, I am confused why you are bringing $X$ into the situation. Why aren't you just saying "The information $Z$ gives only affects what comes after the $N$'th throw, so why isn't $E(N) = E(N | Z)$?" The issue here (as well as in your actual question) is that you have the quantifiers/chronology backwards; $N$ isn't determined yet.
You are of course correct that once we know $N$, then $X$ and $Z$ provide the same information up to the $N^{th}$ throw. However, being in the world of $Z$ significantly affects the (expected) value of $N$. View $E(N | Z)$ as saying "I guarantee you that $3$ won't be rolled; how long will you have to wait on average to see a $2$?", the (partial) answer to which is obviously "longer than I would have to wait to see a $2$ or a $3$ with no other conditions". I.e., rolling a non-$1$ on a two-sided die takes longer on average than rolling a non-$1$ on a three-sided die.
To see it more clearly, start from scratch. Consider only a two-sided die with sides $1$ and $2$. Let $T_i$ be the outcome of the $i^{th}$ throw and $N = \min\{i : T_i \neq 1\}$. Let $Z$ be the event that $T_i \neq 2$ for all $i$ (or, if you want $Z$ to have positive measure, define $Z$ to be the event that $T_i \neq 2$ for $i = 1,2,\dots,10^{100}$). Then $E(N) = 2$, while $E(N | Z) = \infty$ (or some very, very large number).
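This two-sided version is easy to simulate as well (Python; here `M` is my stand-in for the $10^{100}$ horizon, scaled down so that rejection sampling is feasible — conditioned on no $2$ in the first `M` throws, one expects $E(N|Z_M) = M + 2$):

```python
import random

random.seed(1)

M = 10             # stand-in for the 10**100 horizon (scaled way down)
TRIALS = 400_000

# Unconditional waiting time for a non-1 on a fair two-sided die.
plain = []
for _ in range(50_000):
    n = 1
    while random.randint(1, 2) == 1:
        n += 1
    plain.append(n)

# Rejection sampling: keep only runs with no 2 among the first M throws.
conditioned = []
for _ in range(TRIALS):
    if any(random.randint(1, 2) == 2 for _ in range(M)):
        continue                    # Z_M fails; reject this run
    # All of the first M throws were 1, so N > M; keep rolling freely.
    n = M + 1
    while random.randint(1, 2) == 1:
        n += 1
    conditioned.append(n)

print(sum(plain) / len(plain))              # close to E(N) = 2
print(sum(conditioned) / len(conditioned))  # close to M + 2, far above 2
```

Even with this tiny horizon the conditioned average is several times the unconditional one, which is exactly the effect described above.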
The moral: you shouldn't interpret the information provided by the conditioning event as if the random variable had already been realized; you must first look at how the conditioning influences the random variable itself.