I am looking at the Schaum's Outlines "Tensor Calculus" by David C. Kay, and on page 3, the following non-identity and identity are presented:
$$ \begin{align} a_{ij}(x_i + y_j) &\neq a_{ij} x_i + a_{ij} y_j\tag{1} \\ a_{ij}(x_j + y_j) &= a_{ij} x_j + a_{ij} y_j \tag{2} \end{align} $$
Let's assume that $i,j$ both run over $\{1,2\}$ for simplicity.
For (2), I write:
$$\begin{align} a_{ij}(x_j + y_j) &= a_{i1}(x_1+y_1)+a_{i2}(x_2+y_2) \\ &= a_{i1}x_1+a_{i2}x_2+a_{i1}y_1+a_{i2}y_2 \\ &= a_{ij}x_j+a_{ij}y_j \end{align}$$ where $i$ is a free index, and thus only $j$ is summed over.
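The expansion above is easy to check numerically. Here is a minimal sketch using NumPy's `einsum` (the index strings mirror the summation convention; the $2\times 2$ arrays are just arbitrary test data):

```python
import numpy as np

# Arbitrary test data with i, j running over {1, 2}
rng = np.random.default_rng(0)
a = rng.standard_normal((2, 2))
x = rng.standard_normal(2)
y = rng.standard_normal(2)

# Identity (2): j is summed, i stays free, so both sides are i-indexed vectors.
lhs = np.einsum("ij,j->i", a, x + y)                            # a_ij (x_j + y_j)
rhs = np.einsum("ij,j->i", a, x) + np.einsum("ij,j->i", a, y)   # a_ij x_j + a_ij y_j

print(np.allclose(lhs, rhs))
```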
For (1), I write: $$\begin{align} a_{ij}(x_i + y_j) &= a_{11}(x_1+y_1)+a_{12}(x_1+y_2)+a_{21}(x_2+y_1)+a_{22}(x_2+y_2) \\ &= (a_{11}+a_{12})x_1+(a_{21}+a_{22})x_2+(a_{11}+a_{21})y_1+(a_{12}+a_{22})y_2\\ &=\ (a_{i1}+a_{i2})x_i+(a_{1j}+a_{2j})y_j \end{align}$$ Is that correct? Can we go further, or is that as much as we can derive?
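If one insists on reading (1) as a full double sum over both $i$ and $j$ (which, as noted below, is not a legal index expression), the grouped form in the last line can at least be checked numerically. A sketch, with arbitrary $2\times 2$ test data:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal((2, 2))
x = rng.standard_normal(2)
y = rng.standard_normal(2)

# Brute-force double sum over i and j of a_ij (x_i + y_j)
full = sum(a[i, j] * (x[i] + y[j]) for i in range(2) for j in range(2))

# Grouped form: (a_i1 + a_i2) x_i + (a_1j + a_2j) y_j, summing the remaining index
grouped = np.einsum("ij,i->", a, x) + np.einsum("ij,j->", a, y)

print(np.allclose(full, grouped))
```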
EDIT: It seems that in (1) the inequality results from the RHS being illegal: in one term on the RHS, $i$ is free, whereas in the other, $j$ is free. The LHS, however, is legitimate, and the derivation I propose seems to hold, but it cannot proceed any further. In (2), the situation is much better, because the index $i$ is free in both the LHS and the RHS.
EDIT 2: The LHS of (1) is actually illegal and meaningless as well, because $x_i + y_j$ is not a proper vector expression; that combination of the indices $i$ and $j$ does not make sense. So for (1) both the LHS and the RHS are illegal, and rather than an inequality, it should have been pointed out that neither side can be written as such.
Your Eq. (1) is wrong because it is not a well-formed tensor expression. The quantity $x_i + y_j$ doesn't have any meaning.
Remember that index notation is used in this manner to simplify the writing of tensors. So, $a_i$ is a shorthand notation to say "the components of the tensor $\mathbf{a} = \sum_i a_i \hat{e}^i$ (written in the basis $\hat{e}^i$)".
When you write $x_i + y_j$, this is not the shorthand for any tensor.
It may help for you to understand tensor math in terms of the underlying tensors, and recognize that index expressions are simply a shorthand to express how the components work in a given (possibly abstract) basis.
For example: if you have tensors $\mathbf{x} = \sum_i x_i \hat{e}^i$ and $\mathbf{y} = \sum_i y_i \hat{e}^i$, then summing the two together, you get $\mathbf{x} + \mathbf{y} = \sum_i (x_i + y_i) \hat{e}^i$. If you then want to contract the resulting sum with the first slot of a two-tensor $\mathbf{a} = \sum_{ij} a_{ij} \hat{e}^i \otimes \hat{e}^j$, then perform the contraction. This leads to the tensor $\sum_{ij} a_{ij} (x_i + y_i) \hat{e}^j$, whose $j$-th component is $\sum_i a_{ij}(x_i + y_i)$.
(Note: In this post, all sums can be neglected in favor of Einstein summation convention if desired.)
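The contraction described above can be written out as a short sketch in NumPy (the arrays are arbitrary test data; `einsum`'s index string plays the role of the summation convention):

```python
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal((2, 2))
x = rng.standard_normal(2)
y = rng.standard_normal(2)

# Form x + y first, then contract it into the first slot of the two-tensor a.
# Only j remains free, so the result is a legitimate j-indexed vector.
result = np.einsum("ij,i->j", a, x + y)   # components a_ij (x_i + y_i)
print(result.shape)
```

By contrast, there is no index string one could hand to `einsum` that corresponds to $x_i + y_j$ on its own, which is another way of seeing why (1) is not a well-formed expression.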