I am new to optimization problems and would appreciate some insight into solving a fairly simple one.
Referring to the diagram below, suppose we have two functions, $f(x)$ and $g(x)$. We can assume $f(x)$ is decreasing (linear or polynomial) and $g(x)$ is increasing (linear or polynomial). We likewise have two constraints: $y_f$, which denotes the maximum valid output of $f(x)$, and $y_g$, which denotes the maximum valid output of $g(x)$.
Given $f(x)$, $g(x)$, $y_f$, and $y_g$, we are tasked with finding the $x$ values within the valid range that minimize $f(x)$, minimize $g(x)$, and minimize both $f(x)$ and $g(x)$.

Minimizing each function individually is fairly straightforward using the inverse functions: $x_1 = f^{-1}(y_f)$ and $x_2 = g^{-1}(y_g)$.
The input $x_1$ gives the minimum of $g(x)$ and the maximum of $f(x)$ within the valid range; conversely, the input $x_2$ gives the minimum of $f(x)$ and the maximum of $g(x)$ within the valid range.
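To make this concrete, here is a small sketch with hypothetical example functions (a decreasing $f$ and an increasing $g$, with made-up caps $y_f$ and $y_g$; none of these specific numbers come from the problem statement):

```python
# Illustrative assumptions: f decreasing, g increasing, both linear.
def f(x):
    return 10.0 - 2.0 * x   # decreasing

def g(x):
    return 3.0 * x          # increasing

def f_inv(y):
    return (10.0 - y) / 2.0  # inverse of f

def g_inv(y):
    return y / 3.0           # inverse of g

yf, yg = 8.0, 9.0  # maximum valid outputs (assumed values)

x1 = f_inv(yf)  # f(x1) = yf: f at its max valid value, g at its min
x2 = g_inv(yg)  # g(x2) = yg: g at its max valid value, f at its min
print(x1, x2)   # the valid range is [x1, x2]
```

With these choices the valid range comes out as $[1, 3]$: $x_1 = 1$ minimizes $g$ and $x_2 = 3$ minimizes $f$.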
So the next question is: what method can be used to minimize both, for arbitrary choices of the functions $f$ and $g$?
For optimization to be well-defined, your target needs to be a partially ordered set, and more typically a totally ordered set. When you ask to minimize both $f$ and $g$, you are considering the map $x \mapsto (f(x), g(x))$, which means your target is $\mathbb{R}^2$. While you can put orderings on this set, a much more natural way to handle this is to define a map $\mathbb{R}^2 \to \mathbb{R}$ and compose your original map with it.
For example, a natural choice is to minimize $f(x) + g(x)$; or, if you care about $f$ more than $g$, you could choose to minimize $3f(x) + g(x)$.
However, you must always make such a choice, as there is no canonical ordering on $\mathbb{R}^2$ suitable for optimization.
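As a sketch of this scalarization idea, the following uses the same hypothetical $f$ and $g$ as above (illustrative assumptions, not from the question) and minimizes the weighted sum over the valid range with a simple grid search; a real solver such as `scipy.optimize.minimize_scalar` would do the same job more accurately:

```python
# Illustrative assumptions carried over: f decreasing, g increasing.
def f(x):
    return 10.0 - 2.0 * x

def g(x):
    return 3.0 * x

def scalarized(x, w_f=1.0, w_g=1.0):
    # Compose x -> (f(x), g(x)) with the map (a, b) -> w_f*a + w_g*b,
    # turning the R^2-valued objective into a single real number.
    return w_f * f(x) + w_g * g(x)

x1, x2 = 1.0, 3.0  # valid range from the inverse-function step above

# Simple grid search over [x1, x2].
xs = [x1 + i * (x2 - x1) / 1000 for i in range(1001)]
best = min(xs, key=scalarized)
print(best, scalarized(best))
```

With equal weights, $f(x) + g(x) = 10 + x$ here, so the minimum sits at the left endpoint $x_1$; changing the weights shifts which endpoint (or interior point, for nonlinear $f$, $g$) wins, which is exactly the "choice" the answer refers to.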