I've been racking my brain trying to figure this out. I think I came up with a solution, but it's not elegant at all, so I was wondering if any of y'all could think of anything else. This is the problem:
In the 2-Way-2-Sum decision problem, we are given arrays A and B of length n containing (not necessarily distinct) integers, and we must determine whether there are two pairs of indices i1, j1, i2, j2 ∈ {1, 2, . . . , n}, i1 ≠ i2, j1 ≠ j2, for which
A[i1] + A[i2] = B[j1] + B[j2].
Design an efficient algorithm that solves the 2-Way-2-Sum problem and has time complexity O(n² log n) in the setting where operations on individual integers take constant time.
The approach I came up with is to sort the arrays first, then compute every pairwise sum of one array and store them. After that I iterate over pairs from the second array, checking whether each pair sum appears among the stored sums. This takes O(n² log n) time because I'm binary-searching the sum array inside two nested loops. However, is there a more efficient and elegant approach that doesn't require separate memory? I was thinking of something like this, but I'm not sure if it makes sense (A and B are the sorted arrays, accessible globally):
bool solve(int i1, int i2, int j1, int j2) {
    // Out of indices on either side: no matching pair of sums in this range.
    if (i1 >= i2 || j1 >= j2)
        return false;
    int si = A[i1] + A[i2];
    int sj = B[j1] + B[j2];
    if (si == sj)
        return true;
    else if (sj < si)
        // B's sum is too small: either shrink A's sum or grow B's.
        return solve(i1, i2 - 1, j1, j2) || solve(i1, i2, j1 + 1, j2);
    else
        // B's sum is too large: either grow A's sum or shrink B's.
        return solve(i1 + 1, i2, j1, j2) || solve(i1, i2, j1, j2 - 1);
}
Does the code above make sense to anyone else? Thanks.
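In case my description of the first approach is unclear, here's a rough sketch of the sort-and-binary-search version (twoWayTwoSum is just a name I made up; only the sum array actually needs to be sorted, and I use long long so the pair sums can't overflow):

```cpp
#include <algorithm>
#include <vector>

// Returns true iff A[i1] + A[i2] == B[j1] + B[j2] for some i1 != i2, j1 != j2.
bool twoWayTwoSum(const std::vector<int>& A, const std::vector<int>& B) {
    // Collect all n(n-1)/2 pair sums of A; since addition is symmetric,
    // pairs with i < k cover every choice of distinct indices.
    std::vector<long long> sums;
    for (std::size_t i = 0; i + 1 < A.size(); ++i)
        for (std::size_t k = i + 1; k < A.size(); ++k)
            sums.push_back(static_cast<long long>(A[i]) + A[k]);

    std::sort(sums.begin(), sums.end());  // O(n^2 log n)

    // Binary-search each pair sum of B in the sorted sum array:
    // O(n^2) searches at O(log n^2) = O(log n) each.
    for (std::size_t j = 0; j + 1 < B.size(); ++j)
        for (std::size_t k = j + 1; k < B.size(); ++k)
            if (std::binary_search(sums.begin(), sums.end(),
                                   static_cast<long long>(B[j]) + B[k]))
                return true;
    return false;
}
```

This is what I meant by "separate memory use": the sums vector holds Θ(n²) extra entries, which is what I was hoping the recursive idea might avoid.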