I am solving a complex scheduling problem by formulating it as a mixed-integer linear program (MILP) and solving it with the branch-and-cut algorithm. I have several input parameters that can be changed, for instance the number of demand orders and the number of workers and machines available.
Is there a way to derive a big-O expression for the algorithm's time complexity? Should I evaluate each input parameter separately (keeping the others fixed), check how the solution time changes as that parameter varies, and try to fit a trendline?
Big-O complexity is not something that can be estimated empirically from test runs; it is a theoretical worst-case asymptotic bound. Many MIP models are NP-hard (so no polynomial-time algorithm is known, and solution times can grow exponentially). If you want to prove yours is too, you need a polynomial-time reduction from a known NP-hard problem to your problem.
If you are just interested in "average" run times (which are software- and hardware-dependent), you need a sample of "representative" problem sizes. I don't think varying one dimension at a time is a safe bet, since there can be interactions among the dimensions.
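As a rough sketch of that idea, you could time the solver over a full factorial grid of parameter combinations (so interactions are represented in the sample) and then fit a trendline to log-runtime. The parameter ranges, the `solve_time` stand-in, and the semi-log fit below are all hypothetical placeholders, not part of the original question:

```python
import itertools
import numpy as np

# Hypothetical parameter ranges; vary all dimensions jointly
# (full factorial) rather than one at a time, so interaction
# effects show up in the sample.
orders = [10, 20, 40]
workers = [5, 10]
machines = [3, 6]

def solve_time(n_orders, n_workers, n_machines):
    # Placeholder for a real timed solver call (e.g. wrapping
    # model.optimize() with time.perf_counter()). Here: a synthetic
    # runtime with an interaction term, purely for illustration.
    return 0.01 * n_orders * n_workers + 0.5 * 1.05 ** n_orders

runs = [((o, w, m), solve_time(o, w, m))
        for o, w, m in itertools.product(orders, workers, machines)]

# Fit log(runtime) against one size measure: roughly, a straight
# line in semi-log space suggests exponential growth, while a
# straight line in log-log space suggests polynomial growth.
sizes = np.array([o for (o, _, _), _ in runs], dtype=float)
times = np.array([t for _, t in runs])
slope, intercept = np.polyfit(sizes, np.log(times), 1)
print(f"semi-log slope: {slope:.4f}")
```

Keep in mind such a fit only describes average behavior on the instances you sampled; it says nothing about the worst-case bound discussed above.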