How to define costs for optimal factorization of functions (in some sense)?


Say I have a function $t\to f(t)$ that, for some unknown functions $t\to g(t)$ and $t\to h(t)$, can be written $$f(t) = g(t)h(t).$$ What conditions can I put on $g$ and $h$ to get a unique, meaningful solution, given some constraints on how $g$ and $h$ may be constructed?

Example: $f$ is a polynomial. A reasonable choice would be to let $g$ and $h$ be products of the polynomial factors of $f$. This would be "least complicated" in some sense, at least in that $g$ and $h$ remain polynomials. For example $$\cases{f(t) = (t+3)(t-2)(t-1)\\ g(t) = t+3 \\h(t) = (t-2)(t-1)}$$
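To make this concrete, here is a toy sketch of my own (not claiming this is *the* right cost): split the roots of $f$ between $g$ and $h$, and score each split by the degree imbalance $|\deg g - \deg h|$. The roots and the cost function are my assumptions, just for illustration.

```python
from itertools import combinations

roots = [-3, 2, 1]  # f(t) = (t+3)(t-2)(t-1)

def poly_from_roots(rs):
    """Coefficients (highest degree first) of prod_{r in rs} (t - r)."""
    coeffs = [1.0]
    for r in rs:
        # multiply the current polynomial by (t - r)
        new = [0.0] * (len(coeffs) + 1)
        for i, c in enumerate(coeffs):
            new[i] += c           # contribution of c * t
            new[i + 1] += -r * c  # contribution of c * (-r)
        coeffs = new
    return coeffs

# enumerate all proper splits of the root set into (g, h)
best = None
for k in range(1, len(roots)):
    for g_roots in combinations(roots, k):
        h_roots = tuple(r for r in roots if r not in g_roots)
        cost = abs(len(g_roots) - len(h_roots))  # degree imbalance
        if best is None or cost < best[0]:
            best = (cost, g_roots, h_roots)

print(poly_from_roots(roots))  # expanded coefficients of f
print(best)                    # cheapest split under this toy cost
```

For a cubic the imbalance can never be zero, so even this crude cost already fails to single out a unique answer, which is exactly the difficulty I am asking about.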

But this might not always be unique. Another example:

$$ \cases{f(t) = \sin(t)^2\\g(t) = \sin(t)\\h(t) = \sin(t)} \text{ as well as } \cases{f(t) = \sin(t)^2\\g(t) = 1-\cos(t)\\h(t) = 1+\cos(t)}$$
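A quick numeric sanity check (my own addition) that both factorizations above really agree with $\sin(t)^2$ at some sample points:

```python
import math

# both candidate factorizations of f(t) = sin(t)^2
for t in [0.3, 1.0, 2.5, -1.7]:
    f = math.sin(t) ** 2
    a = math.sin(t) * math.sin(t)                      # g = h = sin
    b = (1 - math.cos(t)) * (1 + math.cos(t))          # g = 1-cos, h = 1+cos
    assert abs(f - a) < 1e-12 and abs(f - b) < 1e-12
```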

The exponential family provides uncountably many other factorizations, of course.

Another famous example, in the realm of rational functions:

$$\sum_{k=0}^{N}t^k = \frac{t^{N+1}-1}{t-1}$$

One could argue that a polynomial with many terms is more complicated than the rational function on the right.
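A small check of this geometric-series identity (my own sketch, for $t \neq 1$):

```python
def geom_sum(t, N):
    """Left-hand side: sum_{k=0}^{N} t^k."""
    return sum(t ** k for k in range(N + 1))

t, N = 3.0, 5
lhs = geom_sum(t, N)
rhs = (t ** (N + 1) - 1) / (t - 1)  # closed rational form
```

Here the "complicated" side has $N+1$ terms while the rational side has a fixed number of symbols regardless of $N$, which is the intuition behind counting terms as a cost.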

Is there some systematic way to put a cost on the choice of $g,h$ so as to get a unique factorization?
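One crude candidate I have toyed with (entirely my own assumption, offered only to make the question concrete): count the non-whitespace symbols in a written form of each factor and prefer the factorization minimizing the total. Applied to the $\sin(t)^2$ example above:

```python
def cost(expr: str) -> int:
    """Toy complexity measure: number of non-whitespace characters."""
    return sum(1 for ch in expr if not ch.isspace())

# the two candidate factorizations (g, h) of f = sin(t)^2 from the question
candidates = [
    ("sin(t)", "sin(t)"),
    ("1 - cos(t)", "1 + cos(t)"),
]
best = min(candidates, key=lambda gh: cost(gh[0]) + cost(gh[1]))
print(best)
```

This picks the $\sin \cdot \sin$ split here, but symbol counting is obviously representation-dependent, so I doubt it is the "right" cost; I am asking whether something principled exists.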