The chain rule for mutual information:
$$I(X_1, X_2, \dots, X_n; Y) = \sum_{i=1}^{n} I(X_i; Y \mid X_{i-1}, X_{i-2}, \dots, X_1)$$
If we rewrite this in terms of entropy, using the identities $I(X;Y \mid Z) = H(X \mid Z) - H(X \mid Y,Z)$ and $I(X;Y) = I(Y;X)$, we seem to get:
$$I(X_1, X_2, \dots, X_n; Y) = \sum_{i=1}^{n} \left[ H(Y \mid X_{i-1}, X_{i-2}, \dots, X_1) - H(Y \mid X_i, X_{i-1}, \dots, X_1) \right]$$
The sum telescopes (for $i = 1$ the conditioning set is empty, so the leading term is just $H(Y)$), and all but two terms cancel, leaving:
$$H(Y) - H(Y \mid X_n, \dots, X_1)$$
This seems obviously wrong to me, but I can't see where the derivation breaks down.
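For what it's worth, here is a quick numerical sanity check I sketched with NumPy for the $n = 2$ case, using an arbitrary made-up joint distribution $p(x_1, x_2, y)$. It computes the chain-rule side $I(X_1;Y) + I(X_2;Y \mid X_1)$ directly from the definitions and compares it against $I(X_1, X_2; Y)$ and against $H(Y) - H(Y \mid X_1, X_2)$:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array (any shape)."""
    p = np.asarray(p).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mi(joint):
    """I(A;B) for a 2-D joint distribution p(a, b)."""
    return H(joint.sum(axis=1)) + H(joint.sum(axis=0)) - H(joint)

# Made-up joint distribution p(x1, x2, y), shape (2, 2, 2).
p = np.array([[[0.10, 0.05],
               [0.15, 0.10]],
              [[0.05, 0.20],
               [0.10, 0.25]]])
assert np.isclose(p.sum(), 1.0)

p_x1 = p.sum(axis=(1, 2))

# Chain-rule side: I(X1;Y) + I(X2;Y|X1), each from the definition.
I_x1_y = mi(p.sum(axis=1))  # marginal joint p(x1, y)
I_x2_y_given_x1 = sum(p_x1[a] * mi(p[a] / p[a].sum()) for a in range(2))
chain_sum = I_x1_y + I_x2_y_given_x1

# Direct side: I(X1,X2;Y), merging (x1, x2) into a single 4-valued variable.
I_joint = mi(p.reshape(4, 2))

# Entropy form: H(Y) - H(Y|X1,X2), with H(Y|X1,X2) = H(X1,X2,Y) - H(X1,X2).
H_y_minus_cond = H(p.sum(axis=(0, 1))) - (H(p) - H(p.sum(axis=2)))

print(np.isclose(chain_sum, I_joint))       # True
print(np.isclose(I_joint, H_y_minus_cond))  # True
```

All three quantities agree, at least for this distribution.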