I would like to see if someone can show me that a function exists such that: $$ (f(x)-1)^k = \sum_{j=0}^k {k \choose j}(-1)^{k-j} g(j,k,x) $$ Basically, what I'm trying to say is that I'm looking for non-trivial solutions that are interesting. This means the following solutions are trivial: \begin{align*} g(j,k,x) = f(x)^{j} \\ g(j,k,x) = (-1)^k f(x)^{k-j} \\ g(j,k,x) = 0^{k-j} (f(x)-1)^j \\ g(j,k,x) = c(k,j)(f(x)\pm m(k,j))^{b(k,j)} \end{align*} Where $c(k,j),m(k,j),b(k,j)$ are some nonpathological functions of the variables $j,k$. Any solutions similar to those listed above are considered trivial in this instance. Also, we are dealing with real numbers here. I have already come up with my own solution to this problem, but I would like to see if someone can either prove to me the uniqueness of my solution, or prove that more than one such function exists. Here is how I derived my solution:
It is known that: $$ \left(f(x+z)-f(x)\right)^k = k!\sum_{n=k}^\infty B_{n,k}^f(x) \frac{z^n}{n!} $$
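As a quick sanity check (my own, not from the cited paper), this generating identity can be verified symbolically with sympy's partial Bell polynomials. The sketch below uses $f(x)=e^x$ at $x=0$, so every derivative $f^{(m)}(0)=1$ and the left side becomes $(e^z-1)^k$:

```python
# Spot-check (f(x+z)-f(x))^k = k! * sum_{n>=k} B_{n,k}^f(x) z^n / n!
# for f(x) = e^x at x = 0, where f^{(m)}(0) = 1 for all m.
import sympy as sp

z = sp.symbols('z')
N = 10  # series truncation order
xs = sp.symbols(f'x1:{N + 1}')  # placeholders for f'(x), f''(x), ...

for k in range(1, 5):
    lhs = sp.series((sp.exp(z) - 1)**k, z, 0, N).removeO()
    rhs = sum(
        sp.factorial(k)
        * sp.bell(n, k, xs[:n - k + 1]).subs({s: 1 for s in xs})
        * z**n / sp.factorial(n)
        for n in range(k, N)
    )
    assert sp.expand(lhs - rhs) == 0
print("Bell identity verified through order", N - 1)
```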
This is shown in the following paper.
Where $B_{n,k}^f(x)$ is the partial Bell polynomial with respect to the function $f(x)$; this can also be written as: \begin{align*} B_{n,k}^f(x) = \sum_j \frac{n!}{j_1!j_2!\cdots j_{n-k+1}!} \prod_{m=1}^{n-k+1} \left(\frac{f^{(m)}(x)}{m!}\right)^{j_m} \end{align*} Where the sum runs over all partitions $(j_1,\ldots,j_{n-k+1})$ satisfying the following linear system: \begin{align*} n = \sum_{m=1}^{n-k+1} mj_m \\ k = \sum_{m=1}^{n-k+1}j_m \end{align*} With that being said, it can be derived that: \begin{align*} B_{n,k}^{x^{-v}}(x) = \frac{n!}{k!} x^{-vk-n} \sum_{j=0}^k {k \choose j}(-1)^{k-j} {-vj \choose n} \end{align*} By substituting this into our definition (with $z=1$, then shifting the summation index $n \mapsto n+k$) we have: \begin{align*} \frac{\left((x+1)^{-v}-x^{-v}\right)^k}{k!} = \frac{x^{-vk}}{k!}\sum_{n=0}^\infty x^{-n-k} \sum_{j=0}^k {k \choose j}(-1)^{k-j} {-vj \choose n+k} \\ = \frac{x^{-vk}}{k!} \sum_{j=0}^k {k \choose j}(-1)^{k-j} \left(\sum_{n=0}^{\infty} x^{-n-k} {-vj \choose n+k}\right) \end{align*} I am not going to prove convergence by hand; I will let Wolfram help you with that one: \begin{align*} \sum_{n=0}^{\infty} x^{-n-k} {-vj \choose n+k} = \left(\frac{1}{x}\right)^k {-vj \choose k} {}_2F_1\left(1,k+vj;k+1;-\frac{1}{x}\right) \end{align*} Therefore (using ${-vj \choose k} = (-1)^k {k+jv-1 \choose k}$): \begin{align*} \left(\left(1+\frac{1}{x}\right)^{-v}-1\right)^k = x^{-k} \sum_{j=0}^k {k \choose j}(-1)^{j} {k+jv-1 \choose k} {}_2F_1\left(1,k+jv;k+1;\frac{-1}{x}\right) \end{align*} Notice that when $j=0$ the resulting term is zero (since ${k-1 \choose k} = 0$), therefore: \begin{align*} \left(\left(1+\frac{1}{x}\right)^{-v}-1\right)^k = x^{-k} \sum_{j=1}^k {k \choose j}(-1)^{j} {k+jv-1 \choose k} {}_2F_1\left(1,k+jv;k+1;\frac{-1}{x}\right) \end{align*} This is an example of a solution that is non-trivial in the sense that it defies intuition; note that $v > 0$ and $k > 0$, and convergence is absolute when $\lvert \frac{1}{x}\rvert < 1$.
This part is for Morgan Rodgers.
Let's assume that the solution I have come up with is a trivial solution under the criteria I have set; then the following is true: \begin{align*} x^{-k} {-jv \choose k} {}_2F_1\left(1,k+jv;k+1;-\frac{1}{x}\right) = c(j,k) \left(f(x)+m(j,k)\right)^{b(j,k)} \end{align*} Noting that in this case $f(x) = \left(1+\frac{1}{x}\right)^{-v}$, we have $\frac{1}{x} = f(x)^{-\frac{1}{v}}-1$, and therefore: $$ {-jv \choose k}\left(f(x)^{-\frac{1}{v}}-1\right)^k {}_2F_1\left(1,k+jv;k+1;1-f(x)^{-\frac{1}{v}}\right) = c(j,k)\left(f(x)+m(j,k)\right)^{b(j,k)} $$ If we set $k=3$ we find that: $$ {-jv \choose 3}\left(f(x)^{-\frac{1}{v}}-1\right)^3 {}_2F_1\left(1,3+jv;3+1;1-f(x)^{-\frac{1}{v}}\right) = c(j,k)\left(f(x)+m(j,3)\right)^{b(j,3)} = \frac{{-jv \choose 3}}{\left(f(x)^{\frac{-1}{v}}-1\right)^{3}(jv+1)(jv+2)}\left(6f(x)^{\frac{2}{v}} \left(\frac{(jv)^2+2jv}{jv}\right) +3f(x)^{\frac{3}{v}} \left(\frac{2+2f(x)^j-(jv)^2-3jv}{jv}\right) - 3f(x)^{\frac{1}{v}}\left(jv+1\right)\right) $$ When we simplify we have: $$ c(j,k)\left(f(x)+m(j,3)\right)^{b(j,3)} = \frac{-jv}{6\left(f(x)^{\frac{-1}{v}}-1\right)^{3}}\left(6f(x)^{\frac{2}{v}} \left(\frac{(jv)^2+2jv}{jv}\right) +3f(x)^{\frac{3}{v}} \left(\frac{2+2f(x)^j-(jv)^2-3jv}{jv}\right) - 3f(x)^{\frac{1}{v}}\left(jv+1\right)\right) = \frac{jv}{\left(f(x)^{\frac{-1}{v}}-1\right)^{3}}\left(\frac{jv}{2}f(x)^{\frac{1}{v}}[jv+1]-f(x)^{\frac{2}{v}} [jv+2] +\frac{1}{2}f(x)^{\frac{3}{v}} [jv+3-\frac{2(1+f(x)^j)}{jv}]\right) $$ Although I have not explicitly proven that such functions do not exist, the resulting terms do not resemble anything like the binomial theorem, especially considering that there is a lone $f(x)^j$ term in the middle of the convolution.
Although this proof is not complete, it provides strong evidence that this solution is most likely either non-trivial by my definition, or, if such a solution is considered trivial, it will certainly be pathological with respect to typical binomial-theorem solutions.
There is some mathematics going on behind the scenes that lets you find infinitely many other solutions. If you know some linear algebra, the functions $\{1, f(x), \ldots, f(x)^{k}\}$ span a subspace $W_{k}$ of the set of all real-valued functions on $\mathbb{R}$. If we take any collection of functions $\{g_{0}(x), \ldots, g_{k}(x)\}$ whose span contains $W_{k}$, then there exist coefficients $c_{i,j} \in \mathbb{R}$ so that $$f(x)^{i} = \sum_{j} c_{i,j}g_{j}(x).$$ (This is essentially a change-of-basis transformation.) This lets you write $$(f(x)-1)^{k} = \sum_{j=0}^{k} (-1)^{k-j}{k \choose j} f(x)^{j} = \sum_{j=0}^{k} c_{j}g_{j}(x) = \sum_{j=0}^{k}(-1)^{k-j}{k \choose j}c^{\prime}_{j} g_{j}(x).$$ (I'm leaving out exactly how you calculate $c_{j}$ from the $c_{i,j}$, as well as how you get $c^{\prime}_{j}$ from $c_{j}$, but both calculations are relatively straightforward.) If $k$ is a variable instead of being fixed, then $c^{\prime}_{j}$ is no longer a constant but depends on $k$, and we can think of the functions as being $g_{j,k}$, since we may want a different set of functions for each $k$. We then take $g(j,k,x) = c(j,k)g_{j,k}(x)$ (or keep the coefficient separate and write $c(j,k)g(j,k,x)$, where $c(j,k) = c^{\prime}_{j}$ and $g(j,k,x) = g_{j,k}(x)$).
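To make the coefficient computation concrete, here is a minimal sketch (the basis $g_{j} = f^{j} + j f^{j-1}$ is a hypothetical choice of mine, purely to illustrate solving the change-of-basis system), treating $f(x)$ as a formal symbol:

```python
# Express (f-1)^k in a (hypothetical) basis g_j = f^j + j*f^(j-1), j = 0..k,
# by solving the resulting triangular linear system for the coefficients c_j.
import sympy as sp

f = sp.symbols('f')  # stands in for the value f(x)
k = 4
g = [f**j + j * f**(j - 1) for j in range(k + 1)]  # g_0 = 1
c = sp.symbols(f'c0:{k + 1}')

# Match coefficients of each power of f on both sides of (f-1)^k = sum c_j g_j.
eqs = sp.Poly(sum(ci * gi for ci, gi in zip(c, g)) - (f - 1)**k, f).all_coeffs()
sol = sp.solve(eqs, c)

# The recovered coefficients reproduce (f-1)^k exactly.
assert sp.expand(sum(sol[ci] * gi for ci, gi in zip(c, g)) - (f - 1)**k) == 0
```

Because the basis is triangular in the powers of $f$, the system solves by back-substitution; any spanning set works the same way, just with a denser system.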
Some of the most "trivial" examples of this are obtained by simply permuting $\{1, f(x), \ldots, f(x)^{k}\}$ to get a new basis: for each $k \in \mathbb{N}$ let $\phi_{k}$ be some permutation on $\{0,1, \ldots, k\}$. Then you can define $$g(j,k,x) = (-1)^{j-\phi_{k}(j)}\frac{{k \choose \phi_{k}(j)}}{{k \choose j}}f(x)^{\phi_{k}(j)}.$$ Now you have that $$\sum_{j=0}^{k} (-1)^{k-j}{k \choose j} g(j,k,x) = \sum_{j=0}^{k}(-1)^{k-j}{k \choose j} (-1)^{j-\phi_{k}(j)}\frac{{k \choose \phi_{k}(j)}}{{k \choose j}}f(x)^{\phi_{k}(j)} = \sum_{j=0}^{k} (-1)^{k-\phi_{k}(j)} {k \choose \phi_{k}(j)}f(x)^{\phi_{k}(j)} = (f(x)-1)^{k},$$ where the last equality holds because $\phi_{k}$ merely reorders the terms of the binomial sum.
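This permutation construction can be sanity-checked symbolically; a minimal sketch (the random permutation is my own test choice), again treating $f(x)$ as a formal symbol:

```python
# Verify that g(j,k,x) = (-1)^(j - phi(j)) * C(k, phi(j))/C(k, j) * f^phi(j)
# satisfies sum_j (-1)^(k-j) C(k,j) g(j,k,x) = (f - 1)^k for a random permutation.
import random
import sympy as sp

f = sp.symbols('f')  # stands in for the value f(x)
k = 5
phi = list(range(k + 1))
random.shuffle(phi)  # an arbitrary permutation of {0, ..., k}

def g(j):
    return (-1)**(j - phi[j]) * sp.binomial(k, phi[j]) / sp.binomial(k, j) * f**phi[j]

total = sum((-1)**(k - j) * sp.binomial(k, j) * g(j) for j in range(k + 1))
assert sp.expand(total - (f - 1)**k) == 0
```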
But really, the only thing you need is, for every $k \in \mathbb{N}$, a collection of real-valued functions $\{g_{0}, g_{1}, \ldots, g_{k}\}$ on $\mathbb{R}$ such that the span of $\{ g_{0}, \ldots, g_{k}\}$ contains the span of $\{1,f, \ldots, f^{k}\}$. The coefficients may be difficult to find, but they are guaranteed to exist.
You may have to look up some linear algebra details to understand (with your background it should be easy if you look up whatever definitions you haven't seen before), but I will describe it more generally:
Take any real-valued function $f(x)$ on $\mathbb{R}$. For each $k > 0$, let $\{g_{k,0}, g_{k,1},\ldots, g_{k,k}\}$ be a collection of functions such that $(1-f(x))^{k}$ is a linear combination of the $g_{k,j}$. This just means there are some coefficients $c_{k,j}$ such that $$(1-f(x))^{k} = \sum_{j}c_{k,j}g_{k,j}(x).$$ For any fixed $k \geq 2$, there are uncountably many choices for the $g_{k,j}$, even, I think, once you have eliminated all of your simple choices. It is then easy to adjust the $c_{k,j}$ to account for the binomial coefficients and the powers of $-1$ you want to have in front of each term in the sum.
Now, the problem with eliminating so many examples as "trivial" is that the idea of uniqueness becomes meaningless (and perhaps what you are interested in becomes less clear). For example:
Non-trivial example: You won't like this, but choose any function $h(x)$. Define $g(0,k,x) = (-1)^{k}h(x)$, $g(k,k,x) = (f(x)-1)^{k} - h(x)$, and $g(j,k,x) = 0$ otherwise. This goes to my point that you will have to go to great lengths to eliminate all sorts of things ad hoc in order to keep only "interesting" examples.
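For what it's worth, checking that this family satisfies the identity is one line of algebra: the $j=0$ and $j=k$ terms contribute $h$ and $(f-1)^{k}-h$, which recombine to $(f-1)^{k}$. A minimal sympy sketch, with formal symbols standing in for $f(x)$ and $h(x)$:

```python
# Check g(0,k,x) = (-1)^k h, g(k,k,x) = (f-1)^k - h, g(j,k,x) = 0 otherwise:
# the two surviving terms of the sum recombine to (f - 1)^k for any h.
import sympy as sp

f, h = sp.symbols('f h')  # stand in for f(x) and h(x)

for k in range(1, 6):
    def g(j):
        if j == 0:
            return (-1)**k * h
        if j == k:
            return (f - 1)**k - h
        return sp.Integer(0)
    total = sum((-1)**(k - j) * sp.binomial(k, j) * g(j) for j in range(k + 1))
    assert sp.expand(total - (f - 1)**k) == 0
```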
I think there are some nice results on orthogonal polynomials and some of the different techniques for changing bases in this setting (I think the Bell polynomials even come up somewhere in this, though I don't remember perfectly). I remember Aigner's Enumerative Combinatorics has some interesting things along these lines.