Let $A,B,X$ be discrete random variables taking values on finite spaces, and where $A$ and $B$ are statistically independent.
By the law of total probability one has $$P(x_i|A=a)=\sum_b P(x_i | A=a, B=b)P(B=b|A=a)=\sum_b P(x_i | A=a, B=b)P(B=b)$$ where the second equality uses the independence of $A$ and $B$. At the same time, interchanging the roles of $A$ and $B$, $$P(x_i|B=b)=\sum_a P(x_i | A=a, B=b)P(A=a)$$ Furthermore, suppose that $P(A=a)=1/|A|$ for all $a\in A$ and $P(B=b)=1/|B|$ for all $b\in B$. I thought one should then expect $$\sum_{a,b} P(x_i | A=a, B=b)=|A|\sum_b P(x_i|B=b)=|B|\sum_a P(x_i|A=a)$$ However, with certain given conditional probabilities $P(x|A)$ and $P(x|B)$ I am getting $$|A|\sum_b P(x_i|B=b) \neq |B|\sum_a P(x_i|A=a)$$ Why is this happening, and is there any way to calculate $P(x|A,B)$ given only $P(x|A)$, $P(x|B)$, $P(A)$ and $P(B)$?
The equations $$P(x_i|B=b)=\sum_a P(x_i | A=a, B=b)P(A=a),\quad P(x_i|A=a)=\sum_b P(x_i | A=a, B=b)P(B=b)$$ together with $P(A=a)=1/|A|$ and $P(B=b)=1/|B|$ certainly imply $$\sum_{a,b} P(x_i | A=a, B=b)=|A|\sum_b P(x_i|B=b)=|B|\sum_a P(x_i|A=a)\tag1$$ Therefore, you must have made some mistake in your supposed counterexample to $(1)$. I cannot say more unless you share what your counterexample was.
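As a sanity check, identity $(1)$ can be verified numerically. Here is a small Python sketch; the sizes $|A|=3$, $|B|=4$ and the values of $P(x_i|A=a,B=b)$ are arbitrary illustrative choices, not taken from your problem.

```python
import random

random.seed(0)
nA, nB = 3, 4  # arbitrary sizes |A| and |B|

# Arbitrary values P(x_i | A=a, B=b) for one fixed outcome x_i.
p_x_given_ab = [[random.random() for _ in range(nB)] for _ in range(nA)]

# P(x_i | A=a) = sum_b P(x_i | a, b) P(B=b), with P(B=b) = 1/|B|.
p_x_given_a = [sum(row) / nB for row in p_x_given_ab]

# P(x_i | B=b) = sum_a P(x_i | a, b) P(A=a), with P(A=a) = 1/|A|.
p_x_given_b = [sum(p_x_given_ab[a][b] for a in range(nA)) / nA
               for b in range(nB)]

total = sum(map(sum, p_x_given_ab))   # sum_{a,b} P(x_i | a, b)
lhs = nA * sum(p_x_given_b)           # |A| * sum_b P(x_i | B=b)
rhs = nB * sum(p_x_given_a)           # |B| * sum_a P(x_i | A=a)

# All three quantities in (1) agree up to floating-point error.
assert abs(total - lhs) < 1e-12 and abs(total - rhs) < 1e-12
```

Any table of conditional probabilities will pass this check, which is why a genuine counterexample to $(1)$ is impossible.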
In general, you cannot compute $P(X|A,B)$ given only $P(X|A),P(X|B),P(A)$ and $P(B)$. Consider this situation. Suppose $P(A)=P(B)=1/2$, and that $$\begin{array}{r|cc} & B & \overline B\\ \hline A & p+\epsilon & p-\epsilon\\ \overline A & p-\epsilon & p+\epsilon\\ \end{array}$$ Each entry in the table gives the conditional probability of $X$ given the events in its row and column headers; for example, $P(X|A,\overline B)=p-\epsilon$. Note that $P(X|A)=P(X|\overline A)=P(X|B)=P(X|\overline B)=p$, so these conditional probabilities carry no information about $\epsilon$, and you cannot compute $P(X|A,B)$ from them alone.
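The non-identifiability above is easy to see numerically. The sketch below (values $p=0.5$ and $\epsilon\in\{0,0.25\}$ are arbitrary choices with $0\le p\pm\epsilon\le 1$) shows two different joint conditionals producing identical one-variable conditionals:

```python
def marginals(p, eps):
    """Given the 2x2 table P(X | row, col) from the answer
    (rows A, not-A; cols B, not-B), return (P(X|A), P(X|B))
    assuming P(A) = P(B) = 1/2 and A, B independent."""
    t = [[p + eps, p - eps],
         [p - eps, p + eps]]
    # Condition on A: average the A-row over B (each with weight 1/2).
    p_x_given_A = (t[0][0] + t[0][1]) / 2
    # Condition on B: average the B-column over A (each with weight 1/2).
    p_x_given_B = (t[0][0] + t[1][0]) / 2
    return p_x_given_A, p_x_given_B

# Different epsilon, hence different P(X|A,B), yet identical marginals:
print(marginals(0.5, 0.0))    # → (0.5, 0.5)
print(marginals(0.5, 0.25))   # → (0.5, 0.5)
```

Since both tables yield exactly the same $P(X|A)$ and $P(X|B)$ (and the same $P(A)$, $P(B)$), no formula taking only those inputs can distinguish $P(X|A,B)=p$ from $P(X|A,B)=p+\epsilon$.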