I have two random vectors $X$ and $Y$, and I know their mutual information $I(X;Y)$. Can I draw any general conclusion about $I(X;Y_i)$, where $Y_i$ is a single coordinate of $Y$?
That is to say, is there any general relationship between the mutual information of the whole vector and that of its individual coordinates?
The only general relationship you can state is that $I(X;Y) \ge I(X;Y_i)$. This follows from the data processing inequality: $Y_i$ is a deterministic function of $Y$, so $X \to Y \to Y_i$ forms a Markov chain.
The gap in the inequality can be arbitrarily large, however. For example, take $Y=(Y_1,Y_2)$ with $Y_1,Y_2\in \{0,1\}$, let $Y$ be uniform on $\{0,1\}^2$, and take $X = Y_1 \oplus Y_2$ (XOR). Then $I(X;Y)=1$ bit, but $I(X;Y_1)=I(X;Y_2)=0$, since $X$ is independent of each coordinate on its own. Taking $n$ independent copies of this construction makes the gap $n$ bits while every single-coordinate mutual information stays zero.
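The XOR example is small enough to verify by direct enumeration. Here is a minimal sketch (the helper `mutual_information` is my own, not from any library) that computes mutual information in bits from an explicit list of equally likely outcomes:

```python
import math
from collections import Counter
from itertools import product

def mutual_information(pairs):
    """I(A;B) in bits, where `pairs` lists equally likely joint outcomes (a, b)."""
    n = len(pairs)
    joint = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in joint.items():
        p_ab = count / n
        mi += p_ab * math.log2(p_ab / ((pa[a] / n) * (pb[b] / n)))
    return mi

# Uniform distribution on Y = (Y1, Y2); X is the XOR of the two bits.
outcomes = list(product([0, 1], repeat=2))   # (y1, y2) pairs
xs = [y1 ^ y2 for y1, y2 in outcomes]

i_xy  = mutual_information(list(zip(xs, outcomes)))               # I(X;Y)
i_xy1 = mutual_information(list(zip(xs, [y1 for y1, _ in outcomes])))  # I(X;Y1)
i_xy2 = mutual_information(list(zip(xs, [y2 for _, y2 in outcomes])))  # I(X;Y2)

print(i_xy, i_xy1, i_xy2)  # 1.0 0.0 0.0
```

Enumerating the four equally likely outcomes confirms the claim: the full vector $Y$ determines $X$ (one bit of information), while each coordinate alone is independent of $X$.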