I'm having trouble working out an example on conditional expectation presented in Rice's Mathematical Statistics, 3rd ed. The following is the excerpt:
The problem pertains to the Law of Total Expectation, but to get there I have to use conditional expectations. In particular, the claim that $E(T|N = n) = nE(X)$ is what I'm trying to work out.
First the conditional expectation of $Y$ given $X = x$ in the discrete case is defined as:
$$E(Y|X = x) = \sum_{y}y \ p_{Y|X}(y|x)$$
We are given $$T = \sum_{i = 1}^{N}X_{i}$$
So I interpreted a fixed realization as $$t = \sum_{i = 1}^{n}X_{i}$$
So using the definition of a conditional expectation and the above
$$E(T|N = n) = \sum_{t}t \ p_{T|N}(t|n) \\ = \sum_{t}\bigg(\sum_{i = 1}^{n}X_{i}\ p_{T|N}(t|n)\bigg)$$
But I'm stuck trying to interpret $p_{T|N}(t|n)$ correctly. I can see $E(X)$ forming, but there are some small steps in between that I can't seem to see. What am I missing from my derivation?

$\mathsf E(T\mid N=n)$ is the conditional expectation of the sum of $N$ iid random variables $\{X_k\}$, given that the random variable $N$ happens to equal $n$.
That is just the expectation of the sum of $n$ iid random variables: conditioning on $N=n$ replaces the random upper limit with the constant $n$, and because $N$ is independent of the $X_k$ (the standard assumption in this setup), the conditioning can then be dropped entirely.
The Linearity of Expectation then applies: the expectation of a sum of random variables is the sum of their expectations.
Therefore we have just $n$ times the expectation of any one of those random variables, since they are each identically distributed.
$$\begin{align}\mathsf E(T\mid N{=}n)&=\mathsf E(\sum_{k=1}^N X_k\mid N{=}n)\\[1ex]&=\mathsf E(\sum_{k=1}^n X_k)\\[1ex]&=\sum_{k=1}^n\mathsf E(X_k)\\[1ex]&= n\mathsf E(X_1)\end{align}$$
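The derivation above can be checked numerically with a quick Monte Carlo sketch. The particular distribution is my own illustrative choice (Rice's example doesn't fix one): I take $X \sim$ Exponential with mean $2$ and condition on $N = n = 5$, so the simulated average of $T$ should land near $n\,\mathsf E(X) = 10$.

```python
import random

random.seed(0)

MEAN_X = 2.0   # E(X) for X ~ Exponential(rate = 1 / MEAN_X) -- assumed for illustration
n = 5          # the fixed value we condition on, N = n
trials = 100_000

# Conditioning on N = n means: simulate the sum of exactly n iid draws of X.
total = 0.0
for _ in range(trials):
    total += sum(random.expovariate(1 / MEAN_X) for _ in range(n))

estimate = total / trials
print(estimate)  # close to n * E(X) = 10
```

With $10^5$ trials the Monte Carlo error is well under $0.1$, so the estimate sits visibly near $10$ rather than near, say, $\mathsf E(X) = 2$.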
The rest is the Law of Total Expectation, and Linearity again (since $\mathsf E(X_1)$ is a constant).
$$\begin{align}\mathsf E(T)&=\mathsf E(\mathsf E(T\mid N))\\[1ex]&=\mathsf E(N\mathsf E(X_1))\\[1ex]&=\mathsf E(N)~\mathsf E(X_1)\end{align}$$