Say I have observed data and two parameters $A$ and $B$:
Parameter $A$ can take the values $a_1,a_2,a_3$
Parameter $B$ can take the values $b_1,b_2,b_3$
Now, assuming I know the likelihood of each of $a_1,a_2,a_3$ and of $b_1,b_2,b_3$, and I want the likelihood of a combined parameter pair $(A,B)$, do I have to normalize the likelihoods of $a_1,a_2,a_3$ (or of $b_1,b_2,b_3$) first?
Assume $A$ and $B$ are independent, so that $\mathbb{P}(A \cap B)=\mathbb{P}(A)\cdot\mathbb{P}(B)$.
Is it true that:
$\text{Likelihood}(a_1 \text{ and } B)=\text{Likelihood}(a_1)\cdot\text{Likelihood}(B)$
or should I normalize $a_1$ as follows:
$\text{Likelihood}(a_1 \text{ and } B)=\dfrac{\text{Likelihood}(a_1)}{\text{Likelihood}(a_1)+\text{Likelihood}(a_2)+\text{Likelihood}(a_3)}\cdot\text{Likelihood}(B)$
I have also seen operations like this in other people's code:
$\text{Likelihood}(a_1 \text{ and } B)=\bigl(\text{Likelihood}(a_1)-\max(\text{Likelihood}(a_1),\text{Likelihood}(a_2),\text{Likelihood}(a_3))\bigr)\cdot\text{Likelihood}(B)$
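For concreteness, here is a small Python sketch of the three expressions, using made-up likelihood values (the numbers are hypothetical, chosen only for illustration). One caveat: subtracting the max is usually done on *log*-likelihoods, not raw likelihoods, where it is just a numerical-stability shift before exponentiating; that may be what the code you saw is doing.

```python
import math

# Hypothetical likelihood values, for illustration only.
lik_a = {"a1": 0.2, "a2": 0.5, "a3": 0.1}
lik_B = 0.4

# Option 1: plain product, justified by independence.
plain = lik_a["a1"] * lik_B

# Option 2: normalize over the a-values first.
norm = lik_a["a1"] / sum(lik_a.values()) * lik_B

# Option 3: subtract-the-max. Applied to raw likelihoods it gives 0 for the
# best a-value, which is suspicious. Applied to LOG-likelihoods it is the
# standard log-sum-exp stabilization, and normalizing afterwards recovers
# exactly option 2:
log_lik_a = {k: math.log(v) for k, v in lik_a.items()}
m = max(log_lik_a.values())
shifted = {k: v - m for k, v in log_lik_a.items()}  # max shifted to 0
z = sum(math.exp(v) for v in shifted.values())
norm_via_logs = math.exp(shifted["a1"]) / z * lik_B

print(plain, norm, norm_via_logs)
```

With these numbers, option 2 and the log-space version of option 3 agree (both give $0.2/0.8 \cdot 0.4 = 0.1$), while option 1 gives $0.08$; the normalization only rescales by a constant across $a_1,a_2,a_3$.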
Which one is the right way to deal with this problem?
Notation: $\text{Likelihood}(a_1 \text{ and } B)$ is short for $\mathbb{P}(\text{data} \mid a_1, B)$.