I'm trying to create an algorithm that merges two sorted arrays of integers of lengths $m$ and $n$ with an overall time complexity of $T(n) = O\left(\log_2(n + m)\right)$, where $T(n)$ represents the algorithm's runtime.
I think I have an idea for a possible solution, but I evaluated it to run in $T(n) = O(\log_2(n) + \log_2(m))$ time. To determine whether this rules my idea out against the goal runtime above, I considered a logarithmic analogue of the triangle inequality: for any $a \geq 2$ and $b \geq 2$,
$$\log_2(a + b) \leq \log_2(a) + \log_2(b).$$
I proved this was the case as follows (again, assuming $a \geq 2, b \geq 2$):
$$a + b \leq ab\\ \Leftrightarrow \log_2(a + b) \leq \log_2(ab)\\ \Leftrightarrow \log_2(a + b) \leq \log_2(a) + \log_2(b),$$
where the first line holds because $a \geq 2$ and $b \geq 2$ imply $(a - 1)(b - 1) \geq 1$, i.e. $ab \geq a + b$, and the second step is valid since $\log_2$ is monotonically increasing.
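As a quick numerical sanity check (not a substitute for the proof), the inequality can be verified over a small range of values; the bound of 100 below is an arbitrary choice for illustration:

```python
import math

# Check log2(a + b) <= log2(a) + log2(b) for all 2 <= a, b <= 100.
for a in range(2, 101):
    for b in range(2, 101):
        assert math.log2(a + b) <= math.log2(a) + math.log2(b), (a, b)
print("inequality holds for all 2 <= a, b <= 100")
```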
I believe this proof is correct, and that it shows a solution running in $T(n) = O(\log_2(n) + \log_2(m))$ time would not conform to an upper-bound runtime goal of $T(n) = O(\log_2(n + m))$. However, I wanted to be sure before discarding the idea from my work. Is this proof accurate? And am I correct in interpreting its implications for my specific circumstances?
Thank you for your time, attention, and consideration!