To check completeness I computed $E_p[g(X)]$ for the first distribution and obtained it as a polynomial in $p$:
$p\,(g(0) + 3g(1) - 4g(2)) + g(2)$.
For distribution $2$ I get $p^2(g(1) - g(2)) + p(g(0) - g(2)) + g(2)$.
In every completeness question we check whether $E_p[g(X)] = 0$. What I thought here is that in both cases, if $g(0) = g(1) = g(2) = 0$, then $E_p[g(X)] = 0$, which would imply that both distributions are complete. But I am getting the wrong answer; could someone please correct me?

You need to check whether "$E_p[g(X)]=0$ for all $p$" implies "$P_p(g(X)=0)=1$ for all $p$."
In the first case, show that "$E_p[g(X)]=0$ for all $p$" is equivalent to $g(0)+3g(1)-4g(2)=0$ and $g(2)=0$, i.e. $g(0)+3g(1)=0$ and $g(2)=0$ (a polynomial in $p$ vanishes for all $p$ iff every coefficient is zero). However, this does not imply "$g(X)=0$ almost surely for any $p$"; take $g(0)=3$, $g(1)=-1$, and $g(2)=0$ for example: then $E_p[g(X)]=0$ for every $p$, yet $g$ is not identically zero on the support, so the first family is not complete.
In the second case, show that "$E_p[g(X)]=0$ for all $p$" is equivalent to $g(0)=g(1)=g(2)=0$, which does lead to "$g(X)=0$ almost surely for all $p$."
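A quick symbolic check of both cases with sympy (a sketch, using only the two expectation polynomials stated above; the variable names `g0, g1, g2` stand for $g(0), g(1), g(2)$):

```python
import sympy as sp

p, g0, g1, g2 = sp.symbols('p g0 g1 g2')

# Expected values as polynomials in p, copied from the two cases above
E1 = p*(g0 + 3*g1 - 4*g2) + g2
E2 = p**2*(g1 - g2) + p*(g0 - g2) + g2

# A polynomial in p is identically zero iff every coefficient vanishes
coeffs2 = sp.Poly(E2, p).coeffs()

# Case 1 counterexample: g(0)=3, g(1)=-1, g(2)=0 makes E1 vanish for
# every p, yet g is not zero on the support -> family 1 is NOT complete
counter = {g0: 3, g1: -1, g2: 0}
print(sp.simplify(E1.subs(counter)))  # 0

# Case 2: the coefficient equations force g(0)=g(1)=g(2)=0,
# so family 2 IS complete
sol = sp.solve([sp.Eq(c, 0) for c in coeffs2], [g0, g1, g2])
print(sol)  # {g0: 0, g1: 0, g2: 0}
```

This mirrors the argument in the answer: setting each coefficient of $p$ to zero is exactly the "for all $p$" condition.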