I'm preparing for technical interviews, and I stumbled upon this question while googling. It's a bit too long, so I'll paraphrase. There are two inputs: two n-element integer arrays X and Y, and a value a. The arrays are unsorted, and the algorithm should return 1 if there exist indices l and r with X[l] + Y[r] = a, and 0 if no such pair exists. What is the worst-case behaviour?
What I think should be correct is:
1) sort both arrays in non-decreasing order
2) let l = 0 and r = arraysize - 1
3) while l < arraysize and r >= 0
       if (X[l] + Y[r] == a) return 1
       else if (X[l] + Y[r] < a) l++
       else r--
4) if no such elements, return 0
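The steps above can be sketched in Python like this (note the loop condition: since l indexes X and r indexes Y independently, it has to be `l < n and r >= 0` rather than `l < r`; the function name is just for illustration):

```python
def pair_sum_exists(X, Y, a):
    """Return 1 if some X[l] + Y[r] == a, else 0."""
    X = sorted(X)  # O(n log n)
    Y = sorted(Y)  # O(n log n)
    l, r = 0, len(Y) - 1
    # Each iteration advances exactly one pointer, so the loop runs O(n) times.
    while l < len(X) and r >= 0:
        s = X[l] + Y[r]
        if s == a:
            return 1
        elif s < a:
            l += 1   # sum too small: move to a larger X element
        else:
            r -= 1   # sum too big: move to a smaller Y element
    return 0
```

For example, `pair_sum_exists([3, 1, 7], [9, 2, 5], 8)` finds 1 + 7 = 8 and returns 1, while `pair_sum_exists([1, 2], [3, 4], 100)` returns 0.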
I'm not really sure about big O (I'm self-taught), but if we use mergesort for the sorting step, that part would be O(n log n).
Can anyone please clarify this?
Yes, your sorts are $O(n \log n)$, and we don't care about constant factors like the $2$ from sorting two arrays. The two-pointer scan through the sorted arrays is $O(n)$, so the overall worst-case running time is $O(n \log n)$.
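Spelling out the accounting (a sketch, not a formal proof):

```latex
T(n) = \underbrace{2 \cdot O(n \log n)}_{\text{sort } X \text{ and } Y}
     + \underbrace{O(n)}_{\text{two-pointer scan}}
     = O(n \log n)
```

The scan is $O(n)$ because each iteration either increments l or decrements r, so after at most $2n$ iterations one of the pointers falls off its array.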