A 6-sided die is tossed once. Two events X and Y are defined: X is the event that an even number comes up, and Y is the event that the number is a multiple of 3. The value of H(X|Y) needs to be computed in two different ways.
This is how I tried to solve it.
$$P(X) = \tfrac{3}{6} = \tfrac12, \quad P(Y) = \tfrac{2}{6} = \tfrac13, \quad P(X \cap Y) = P(\{6\}) = \tfrac16 = P(X)P(Y)$$

This means that X and Y are independent. So H(X|Y) = H(X), and I figured I only need to compute H(X).
To compute H(X|Y) in a different way, I tried to use this formula:

$$H(X\mid Y) = -\sum_{x}\sum_{y} p(x,y)\log p(x\mid y)$$
I thought that in all cases except {6}, p(x, y) is zero, because the die is tossed only once and we cannot have two different values; p(6,6) is 1/6. Based on this reasoning, H(X|Y) = 1/6 · 2.58 ≈ 0.43.
I was also confused about computing p(x|y). I took p(6|6) to be 1/6, but I doubt that this is correct, because if one interprets p(6|6) as the probability that {6} shows up for X given that {6} has already shown up for Y, then p(6|6) is 1. But log(1) is zero, and that doesn't make anything better.
What goes wrong in my reasoning and why do the two computed values for H(X|Y) turn out to be so different?
You've got the wrong support for the random variables that describe the events.
An even number either comes up or it does not. This event can be measured by a random variable which takes the value $1$ when it happens, or $0$ when it does not. Likewise for the event that the number is divisible by 3.
Thus $X\in\{0,1\}$ and $Y\in\{0,1\}$ are indicator random variables that depend on the value of $Z\in\{1,2,3,4,5,6\}$, which is the random variable of the die's result. Specifically: $$X=\operatorname {\bf 1}_{\{2,4,6\}}(Z) , \; Y=\operatorname {\bf 1}_{\{3,6\}}(Z)\\p_X(0)=\tfrac 1 2, p_X(1)=\tfrac 1 2\\ p_Y(0)=\tfrac 2 3, p_Y(1)=\tfrac 1 3\\p_{X,Y}(0,0)=\tfrac 1 3, p_{X,Y}(1,0)=\tfrac 1 3, p_{X,Y}(0,1)=\tfrac 1 6, p_{X,Y}(1,1)=\tfrac 1 6$$
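You can sanity-check this joint distribution by enumerating the six equally likely faces (a quick Python sketch; the variable names are mine):

```python
from collections import Counter
from fractions import Fraction

# Enumerate the six equally likely die faces and tally the (X, Y) indicator pairs.
joint = Counter()
for z in range(1, 7):
    x = 1 if z % 2 == 0 else 0   # X: indicator that z is even
    y = 1 if z % 3 == 0 else 0   # Y: indicator that z is a multiple of 3
    joint[(x, y)] += Fraction(1, 6)

# Recovers p(0,0) = p(1,0) = 1/3 and p(0,1) = p(1,1) = 1/6.
print(dict(joint))
```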
So indeed they are independent as: $p_{X,Y}(x,y) = p_X(x)p_Y(y)$ for all $(x,y)\in\{0,1\}^2$.
Thus $p_{X\mid Y}(x\mid y)=p_X(x)$.
The entropy of $X$ is $$\begin{align} H(X) &= - \sum_{x\in \{0,1\}} p_X(x)\log p_X(x) \\ & = - (\frac 1 2 \log \frac 1 2+\frac 1 2 \log \frac 1 2) \\ & = \log 2 \end{align}$$
The conditional entropy of $X$ given $Y$ is: $$\begin{align} H(X\mid Y) &= -\sum_{y\in\{0,1\}} \sum_{x\in\{0,1\}} p_{X,Y}(x,y)\log p_{X\mid Y}(x\mid y) \\ & = -\sum_{y\in\{0,1\}} \sum_{x\in\{0,1\}} p_{X,Y}(x,y)\log p_{X}(x) \\ & = -\left(\tfrac 1 3\log \tfrac 1 2+\tfrac 1 3\log \tfrac 1 2 +\tfrac 16\log \tfrac 1 2 + \tfrac 1 6\log \tfrac 1 2\right) \\ & = \log 2 \end{align}$$
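Both computations can be confirmed numerically (a sketch, using base-2 logarithms so that $\log 2 = 1$ bit):

```python
import math
from fractions import Fraction

# Marginal of X and joint of (X, Y) from the indicator construction above.
p_x = {0: Fraction(1, 2), 1: Fraction(1, 2)}
p_xy = {(0, 0): Fraction(1, 3), (1, 0): Fraction(1, 3),
        (0, 1): Fraction(1, 6), (1, 1): Fraction(1, 6)}

# H(X) = -sum_x p(x) log2 p(x)
H_X = -sum(p * math.log2(p) for p in p_x.values())

# H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y); independence gives p(x|y) = p(x)
H_X_given_Y = -sum(p * math.log2(p_x[x]) for (x, y), p in p_xy.items())

print(H_X, H_X_given_Y)  # both equal 1.0 bit, i.e. log 2
```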