I have been looking for a method for evaluating the mutual information between a combination of source variables, $X_0, X_1$ and a target variable, $Y$.
$$I(Y;X_0,X_1)$$
When I look at Wikipedia's article on multivariate mutual information, it seems to address a more complicated problem than mine: it keeps the variables separate and examines the various relationships among $X_0$, $X_1$, and $Y$.
I just want to think of $X_0,X_1$ as a single variable, call it $A$, and calculate:
$$I(Y;A)$$
...but I don't know how to do this. Any help would be much appreciated!
You are right: you don't need to bother with the "multivariate mutual information" you link to; that's a rather esoteric and dubious extension of the ordinary mutual information.
You need nothing special, just the usual definition of $I(Y;X)$. That $X=(X_0,X_1)$ is a multivariate variable does not really change anything:
$$I(Y;X_0,X_1) = \sum_{X_0,X_1,Y} p(X_0,X_1,Y) \log \frac{p(X_0,X_1,Y)}{p(Y)\,p(X_0,X_1)}$$ or any of the equivalent definitions using conditional entropies.
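In the discrete case this is easy to compute directly: group $(X_0,X_1)$ into a single key and apply the usual formula. A minimal sketch, using a made-up joint pmf over binary variables (the numbers are purely illustrative):

```python
import math

# Hypothetical joint distribution p(x0, x1, y) over binary variables,
# stored as {(x0, x1, y): probability}. Values are made up for illustration.
p = {
    (0, 0, 0): 0.125, (0, 0, 1): 0.125,
    (0, 1, 0): 0.125, (0, 1, 1): 0.125,
    (1, 0, 0): 0.250, (1, 0, 1): 0.000,
    (1, 1, 0): 0.000, (1, 1, 1): 0.250,
}

def mutual_information(p):
    """I(Y; X0, X1) in bits, treating A = (X0, X1) as one variable."""
    # Marginals p(A) and p(Y), obtained by summing out the other variable.
    p_a, p_y = {}, {}
    for (x0, x1, y), pr in p.items():
        p_a[(x0, x1)] = p_a.get((x0, x1), 0.0) + pr
        p_y[y] = p_y.get(y, 0.0) + pr
    # Sum p * log2( p / (p(A) p(Y)) ) over outcomes with nonzero probability.
    mi = 0.0
    for (x0, x1, y), pr in p.items():
        if pr > 0:
            mi += pr * math.log2(pr / (p_a[(x0, x1)] * p_y[y]))
    return mi

print(mutual_information(p))  # → 0.5 bits for this toy distribution
```

In practice you would estimate the joint pmf from data (e.g. by counting co-occurrences), which is exactly the same computation once the counts are normalized.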
For example, among many possible expressions, using $I(Y;X)=H(X)-H(X|Y)$:
$$\begin{aligned} I(Y;X_0,X_1) &= H(X_0,X_1) - H(X_0,X_1|Y)\\ &= H(X_0)+H(X_1|X_0)-H(X_0|Y) -H(X_1|X_0,Y)\end{aligned}$$
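The entropy identity $I(Y;X_0,X_1) = H(X_0,X_1) - H(X_0,X_1|Y)$ can be checked numerically. A self-contained sketch on a made-up binary joint pmf (the numbers are purely illustrative), using the chain rule $H(X|Y) = H(X,Y) - H(Y)$:

```python
import math

# Hypothetical joint distribution p(x0, x1, y); values are made up.
p = {
    (0, 0, 0): 0.125, (0, 0, 1): 0.125,
    (0, 1, 0): 0.125, (0, 1, 1): 0.125,
    (1, 0, 0): 0.250, (1, 0, 1): 0.000,
    (1, 1, 0): 0.000, (1, 1, 1): 0.250,
}

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(pr * math.log2(pr) for pr in pmf.values() if pr > 0)

def marginal(p, keep):
    """Marginalize the joint onto the index positions listed in `keep`."""
    out = {}
    for outcome, pr in p.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + pr
    return out

h_x = entropy(marginal(p, (0, 1)))   # H(X0, X1)
h_xy = entropy(p)                    # H(X0, X1, Y)
h_y = entropy(marginal(p, (2,)))     # H(Y)
h_x_given_y = h_xy - h_y             # chain rule: H(X0,X1|Y) = H(X0,X1,Y) - H(Y)

mi = h_x - h_x_given_y               # I(Y; X0, X1)
print(mi)                            # → 0.5 bits for this toy distribution
```

Any of the equivalent decompositions above gives the same number; this one only needs entropies of marginals, which is often the easiest route when working from empirical counts.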