Consider the following.
A^B^C^D and A + B + C + D
(^ represents the bitwise XOR operation)
You are given A + B + C + D, and you have to find A^B^C^D. I know there is a relation between XOR and addition, but I am unable to figure out how to utilise this fact. I am not even sure it's the right direction to think in. So, my question is: how do I do this?
Also, I don't get how XOR is used to perform addition. I tried searching on Google, but the explanations are too sophisticated to understand; I couldn't find a layman's approach. Could you explain it in layman's terms?
Unfortunately, what you're looking for is impossible. Look at the two-number case: given $A+B$ alone, you cannot recover $A \oplus B$. For example, suppose that $A + B = 5$. We could have $$ A = 0, B = 5 \text{ or } A = 1, B = 4 \implies A \oplus B = 5,\\ A = 2, B = 3 \implies A \oplus B = 1. $$ There is no way to tell which value of $A \oplus B$ is correct using only $A+B$.
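A quick sketch in Python verifies the counterexample, and also checks the standard identity connecting addition and XOR, namely $A + B = (A \oplus B) + 2(A \wedge B)$, where the AND term captures the carries (this identity is why the sum alone cannot pin down the XOR: the carry information $A \wedge B$ is lost):

```python
# Three (A, B) pairs with the same sum but different XORs.
pairs = [(0, 5), (1, 4), (2, 3)]

sums = {a + b for a, b in pairs}
xors = {a ^ b for a, b in pairs}
print(sums)  # every pair sums to 5
print(xors)  # yet the XORs are 5 and 1

# The identity A + B == (A ^ B) + 2 * (A & B):
# XOR is addition without carries, and (A & B) << 1 supplies the carries.
for a, b in pairs:
    assert a + b == (a ^ b) + 2 * (a & b)
```

Since both $(0, 5)$ and $(2, 3)$ produce the sum $5$ but different XORs, no function of the sum alone can return the XOR.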