Proving a greedy algorithm - find minimum number of terms in expression $n=a_1!+\cdots+a_k!$


Given a positive integer $n$, I should find $a_1,\ldots,a_k$ such that $k$ is minimal and $n=a_1!+\cdots+a_k!$.

I think that the following greedy algorithm should give the solution: find the largest $m$ such that $m!\leq n$; then the answer is $1 + {}$the answer for $n-m!$. (I write $m$ here to avoid clashing with $k$, the number of terms.)
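For concreteness, here is a minimal sketch of that greedy in Python (the function name `min_factorial_terms` is my own; whether the output is actually optimal is exactly what the proof needs to establish):

```python
from math import factorial

def min_factorial_terms(n):
    """Greedily write n as a sum of factorials, largest factorial first.

    Returns the chosen arguments a_1 >= a_2 >= ... >= 1 with
    n == sum of a_i!.  Terminates because 1! = 1 always fits.
    """
    terms = []
    while n > 0:
        a = 1
        # largest a with a! <= n
        while factorial(a + 1) <= n:
            a += 1
        terms.append(a)
        n -= factorial(a)
    return terms

# For example, 10 = 3! + 2! + 2! gives the arguments [3, 2, 2].
print(min_factorial_terms(10))
```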

I'm new to proving such algorithms. I've read that the standard approach (an exchange argument) is to first show that some optimal solution contains the largest factorial $m!$ with $m!\leq n$: take an arbitrary optimal solution, exchange some of its terms for $m!$, and show the result is still a valid solution of the same size. But I don't see how to carry that out here. Any hints?