Max Sum Algorithm vs Simplex Method


I have a set of functions, each taking some variables, and I want to maximize the combined return values of all functions.
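For concreteness, a toy version of this setup might look like the following sketch (the component functions here are invented purely for illustration):

```python
# Hypothetical objective: several functions of the same variables,
# whose combined return value we want to maximize.
def f1(x, y):
    return -(x - 1.0) ** 2

def f2(x, y):
    return -(y + 2.0) ** 2

def combined(x, y):
    # The quantity to maximize is the sum of the individual functions.
    return f1(x, y) + f2(x, y)

print(combined(1.0, -2.0))  # 0.0 at the joint maximizer
```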

I've come across two different algorithms that might solve this, but I'm not sure which one is more appropriate under what conditions, or how they differ.

One is called the Max-Sum algorithm, described here.

The other is the Simplex method, described here.

Before I spend a lot of time studying both of them to work out what they do and how they differ, I thought I'd ask here. Any thoughts?

1 Answer


You linked a 20-minute video for the "Max Sum" algorithm, which I didn't have time to watch, but I can say this: any function you want to maximize can trivially be expressed as a sum of arbitrarily many individual functions, so the "sum of functions" framing doesn't by itself narrow down the choice of method. A general derivative-free method such as the Nelder-Mead (simplex) algorithm is probably your best bet, provided your functions don't change too abruptly over small changes in the variables and you can't express the derivatives of the sum you want to optimize. The one caveat: Nelder-Mead converges, at best, to a local optimum, so you probably want to run it from many initial starting points and take the best solution found.
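The multi-start strategy above can be sketched with SciPy's built-in Nelder-Mead implementation. The two component functions are hypothetical stand-ins for your real objective; note that `minimize` minimizes, so the sum is negated:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical component functions; replace with your own.
def f1(x):
    return -(x[0] - 1.0) ** 2

def f2(x):
    return -(x[1] + 2.0) ** 2

def negated_total(x):
    # Nelder-Mead minimizes, so negate the sum we want to maximize.
    return -(f1(x) + f2(x))

rng = np.random.default_rng(0)
best = None
for _ in range(10):  # several random restarts to escape local optima
    x0 = rng.uniform(-5.0, 5.0, size=2)
    res = minimize(negated_total, x0, method="Nelder-Mead")
    if best is None or res.fun < best.fun:
        best = res

print(best.x)  # should land close to [1, -2], the true maximizer
```

For this smooth quadratic toy objective a single start would suffice; the restarts matter when the real objective has multiple local optima.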