If $\mathbf{X}_1 = (X_{11}, \ldots, X_{1k})$ and $\mathbf{X}_2 = (X_{21}, \ldots, X_{2k})$ are i.i.d. random vectors, how do we know that $\mathbf{a}\cdot \mathbf{X}_1$ and $\mathbf{a}\cdot \mathbf{X}_2$ are i.i.d. (for $\mathbf a \in \mathbb R^k$)? The proof below seems to use this fact, and I don't know how to prove it, because the components $X_{11}, \ldots, X_{1k}$ are not necessarily independent; only the two vectors are.
This source: https://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter7.pdf says that if $X$ and $Y$ are independent, the distribution of their sum depends only on the distributions of $X$ and $Y$ (i.e. in this case, if $X =_d X'$, $Y =_d Y'$, $X$ and $Y$ are independent, and $X'$ and $Y'$ are independent, then $X+Y =_d X'+Y'$). But I don't know what to do in the case where the summands might not be independent.
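
For what it's worth, here is a quick numerical sanity check of the claim (a minimal sketch, assuming NumPy; the dependent-coordinates construction $\mathbf{X} = (Z, Z^2)$ and all variable names are just my own illustration, not from the proof):

```python
# Simulate two i.i.d. random vectors whose *coordinates* are dependent,
# and check empirically that a.X1 and a.X2 look i.i.d. anyway.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Each vector is X = (Z, Z**2) for Z ~ N(0, 1), so its coordinates are dependent.
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
X1 = np.column_stack([z1, z1**2])  # coordinates of X1 are dependent
X2 = np.column_stack([z2, z2**2])  # X2 is an independent copy of X1

a = np.array([1.0, -2.0])
s1, s2 = X1 @ a, X2 @ a  # samples of a.X1 and a.X2

# Identically distributed: empirical quantiles should agree closely.
print(np.quantile(s1, [0.25, 0.5, 0.75]))
print(np.quantile(s2, [0.25, 0.5, 0.75]))

# Independent: the sample correlation should be near 0.
print(np.corrcoef(s1, s2)[0, 1])
```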

We have that two random vectors $\mathbf{X},\mathbf{Y}$ are independent if the cumulative distribution functions factor,
$$F_{\mathbf{X},\mathbf{Y}}(\mathbf{x},\mathbf{y}) = F_{\mathbf{X}}(\mathbf{x})\,F_{\mathbf{Y}}(\mathbf{y}),$$
or equivalently $P(\mathbf{X}\in A,\ \mathbf{Y}\in B) = P(\mathbf{X}\in A)\,P(\mathbf{Y}\in B)$ for all Borel sets $A,B \subseteq \mathbb{R}^k$. Therefore, if $\mathbf{X},\mathbf{Y}$ are independent, then writing $A_s = \{\mathbf{x}\in\mathbb{R}^k : \mathbf{a}\cdot\mathbf{x}\le s\}$ we have that
$$P(\mathbf{a}\cdot\mathbf{X}\le s,\ \mathbf{a}\cdot\mathbf{Y}\le t) = P(\mathbf{X}\in A_s,\ \mathbf{Y}\in A_t) = P(\mathbf{X}\in A_s)\,P(\mathbf{Y}\in A_t) = P(\mathbf{a}\cdot\mathbf{X}\le s)\,P(\mathbf{a}\cdot\mathbf{Y}\le t),$$
so $\mathbf{a}\cdot\mathbf{X}$ and $\mathbf{a}\cdot\mathbf{Y}$ are independent. Note that only the independence of the two vectors is used here; the coordinates within each vector may be dependent. Similarly, for the identical part: if
$$\mathbf{X} =_d \mathbf{Y},$$
then we must have that
$$P(\mathbf{a}\cdot\mathbf{X}\le s) = P(\mathbf{X}\in A_s) = P(\mathbf{Y}\in A_s) = P(\mathbf{a}\cdot\mathbf{Y}\le s),$$
i.e. $\mathbf{a}\cdot\mathbf{X} =_d \mathbf{a}\cdot\mathbf{Y}$, by a similar argument.
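
As a minimal numerical illustration of the factorization step above (a sketch only, reusing the hypothetical $\mathbf{X} = (Z, Z^2)$ setup from the question's check; Monte Carlo, so agreement is only approximate):

```python
# Check that the joint CDF of (a.X1, a.X2) factors into the product
# of the marginal CDFs at one sample point (s, t).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
a = np.array([1.0, -2.0])
s1 = np.column_stack([z1, z1**2]) @ a  # samples of a.X1
s2 = np.column_stack([z2, z2**2]) @ a  # samples of a.X2

s, t = 0.5, -1.0
joint = np.mean((s1 <= s) & (s2 <= t))         # P(a.X1 <= s, a.X2 <= t)
product = np.mean(s1 <= s) * np.mean(s2 <= t)  # P(a.X1 <= s) P(a.X2 <= t)
print(joint, product)  # agree up to Monte Carlo error
```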