Suppose $f(x,y) = \prod_{i=1}^n (a_ix+b_iy)$
where $n$ is a constant larger than 500, and $a_i>0$, $b_i>0$ are known coefficients. There is only one global maximum.
What's the most efficient method to find a positive $(x,y)$ that maximizes $f(x,y)$ under the constraint $c_1x+c_2y=1$, where both $c_1$ and $c_2$ are larger than $0$?
Let $$ \begin{align*} p_i &= a_i/c_1 > 0,\\ q_i &= b_i/c_2 > 0,\\ u &= c_1x,\\ v &= c_2y. \end{align*} $$ The constraint becomes $u+v=1$, so $v=1-u$ and $a_ix+b_iy=p_iu+q_i(1-u)=(p_i-q_i)u+q_i$. Your maximization problem is therefore equivalent to minimizing $$ g(u)=-\log f(x,y)=\sum_{i=1}^n -\log [(p_i-q_i)u + q_i];\quad u\in(0,1). $$ One may verify that $g''(u)=\sum_{i=1}^n \frac{(p_i-q_i)^2}{[(p_i-q_i)u + q_i]^2}$, which is positive if $p_i\neq q_i$ for some $i$. Therefore, on $(0,1)$, $g$ is either a strictly convex function when $p_i\neq q_i$ for some $i$, or a constant function otherwise.
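As a quick numerical sanity check of this reduction (the coefficients below are arbitrary made-up values, not from the question):

```python
import numpy as np

# Arbitrary small instance: f(x, y) = prod_i (a_i x + b_i y), c1 x + c2 y = 1.
rng = np.random.default_rng(0)
n = 5
a = rng.uniform(0.5, 2.0, n)
b = rng.uniform(0.5, 2.0, n)
c1, c2 = 2.0, 3.0

p, q = a / c1, b / c2

def g(u):
    # g(u) = -sum_i log((p_i - q_i) u + q_i)
    return -np.sum(np.log((p - q) * u + q))

u = 0.4                        # any point in (0, 1)
x, y = u / c1, (1.0 - u) / c2  # map back through u = c1 x, v = c2 y = 1 - u
log_f = np.sum(np.log(a * x + b * y))
print(np.isclose(g(u), -log_f))  # g(u) = -log f(x, y)
```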
Assume that $g$ is strictly convex. Differentiating once gives $g'(u)=\sum_i \frac{q_i-p_i}{(p_i-q_i)u+q_i}$, so $g'(0)=\sum_i (1-\frac{p_i}{q_i})$ and $g'(1)=\sum_i (\frac{q_i}{p_i}-1)$. If $g'(0)<0$ and $g'(1)>0$, then $g$ has a unique global minimum inside $(0,1)$; otherwise $g$ is strictly increasing/decreasing on $(0,1)$ and hence has no extremum on the open interval.
When $g$ does have a unique global minimum, as it is univariate, twice differentiable and its derivatives are easy to compute, I think Newton's method is very hard to beat. However, since this is actually constrained optimization, care must be taken so that the iterates do not fall outside $[0,1]$. You may insert a bisection step or a golden-section search step (see hardmath's answer) whenever the Newton iterate overshoots.
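A minimal sketch of such a safeguarded Newton iteration, assuming an interior minimum exists (the function and variable names here are my own, and the tolerance handling is deliberately simple):

```python
import numpy as np

def maximize_product(a, b, c1, c2, tol=1e-12, max_iter=100):
    """Maximize prod_i (a_i x + b_i y) subject to c1 x + c2 y = 1, x, y > 0,
    by minimizing g(u) on (0, 1) with a bisection-safeguarded Newton method."""
    p, q = a / c1, b / c2
    d = p - q                         # per-factor slope of (p_i - q_i) u + q_i

    def g1(u):                        # g'(u) = sum_i -(p_i - q_i) / ((p_i - q_i) u + q_i)
        return np.sum(-d / (d * u + q))

    def g2(u):                        # g''(u) = sum_i (p_i - q_i)^2 / ((p_i - q_i) u + q_i)^2
        return np.sum(d**2 / (d * u + q)**2)

    # Endpoint derivative test: g'(0) < 0 < g'(1) is required for an interior minimum.
    if g1(0.0) >= 0.0 or g1(1.0) <= 0.0:
        raise ValueError("g is monotone on (0, 1); no interior optimum with x, y > 0")

    lo, hi = 0.0, 1.0                 # bracket always containing the root of g'
    u = 0.5
    for _ in range(max_iter):
        d1 = g1(u)
        if abs(d1) < tol:
            break
        if d1 > 0.0:                  # g increasing here => root lies to the left
            hi = u
        else:
            lo = u
        u_new = u - d1 / g2(u)        # Newton step on g'
        if not (lo < u_new < hi):     # overshoot: fall back to bisection
            u_new = 0.5 * (lo + hi)
        u = u_new

    return u / c1, (1.0 - u) / c2     # map back: x = u / c1, y = v / c2
```

For example, with $n=2$, $a=(1,2)$, $b=(2,1)$ and $c_1=c_2=1$, symmetry gives the optimum $x=y=\tfrac12$, which the iteration recovers immediately. For the question's $n>500$, each iteration is a cheap $O(n)$ vector reduction, and the quadratic convergence of Newton's method typically needs only a handful of iterations.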