Dynamic Programming: Making Change


I'm practicing problems on dynamic programming. The problem is as follows:

You are given n types of coin denominations of values v(1) < v(2) < ... < v(n) (all integers). Assume v(1) = 1, so you can always make change for any amount of
money C. Give an algorithm which makes change for an amount of money C with as
few coins as possible.

Here is the solution I have figured out.

The idea is to make change in a way that uses as few coins as possible, whatever the denominations.

So this is all about choices. If I need change for 1 dollar, I have exactly one option: pick v(1) = 1.

If, for example, we are making change for 10 dollars, we can:

     ---> pick ten 'one' dollar notes
     ---> pick two 'five' dollar notes
     ---> pick a single 'ten' dollar note

We need to calculate the minimum over these choices and return it as our solution.
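As a quick sanity check of that enumeration, here is a small Python sketch (the denominations 1, 5, 10 and the helper name are made up for illustration):

```python
# Hypothetical helper: for each denomination that divides the amount
# evenly, count how many notes of only that denomination are needed.
def single_denomination_counts(amount, denominations):
    return {d: amount // d for d in denominations if amount % d == 0}

choices = single_denomination_counts(10, [1, 5, 10])
print(choices)                 # {1: 10, 5: 2, 10: 1}
print(min(choices.values()))   # 1 -> one ten-dollar note
```

Of course this only covers single-denomination choices; the real algorithm must also consider mixed combinations, which is where dynamic programming comes in.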

Is the approach I'm following correct?


There are 2 best solutions below


Trying all possible solutions is an exhaustive search. The following is a dynamic programming algorithm. The minimal solution we search for can be defined recursively:

if m = v(k) for some k, then:
    coins_needed(m) := 1
otherwise:
    coins_needed(m) :=
        min(coins_needed(1) + coins_needed(m-1),
            coins_needed(2) + coins_needed(m-2),
            ...,
            coins_needed(floor(m/2)) + coins_needed(ceiling(m/2)))

One can start with m = 1, then calculate coins_needed for m = 2, and so on. It is important to store all the minimal solutions calculated so far; otherwise the algorithm needs exponential running time.

If one has calculated coins_needed(m-1) in this way, one has already calculated coins_needed(1), ..., coins_needed(m-2) and stored the results. It is then easy to calculate coins_needed(m) according to the above formula.
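This table-filling can be sketched in Python as a bottom-up loop (a sketch; the denominations in the example calls are assumed for illustration):

```python
def coins_needed(C, denominations):
    """Bottom-up DP following the recursion above: table[m] holds the
    minimum number of coins summing to m."""
    INF = float("inf")
    table = [0] + [INF] * C
    coin_set = set(denominations)
    for m in range(1, C + 1):
        if m in coin_set:
            table[m] = 1
            continue
        # Try all splits m = i + (m - i); by symmetry i <= m // 2 suffices.
        table[m] = min(table[i] + table[m - i] for i in range(1, m // 2 + 1))
    return table[C]

print(coins_needed(10, [1, 5, 10]))  # 1
print(coins_needed(14, [1, 5, 10]))  # 5 (10 + 1 + 1 + 1 + 1)
```

Each table[m] is computed once and reused by every later amount, which is exactly the stored-subproblem idea described here.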

This is dynamic programming as defined in Wikipedia. The problem has optimal substructure, because the solution can be constructed from the solutions of smaller subproblems, and each subproblem can be broken into smaller subproblems again. It also has overlapping subproblems: the same subproblems keep reappearing. So if we store the results of the subproblems (memoization), we can reuse them.


To help you get the idea: suppose you want to make $\$20$. Instead of trying all the possible combinations (brute force), suppose that you already know the optimal way of making $\$19$ (let us call $n(19)$ the number of coins needed for that, and $d(19)$ the list of coins used).

Sure, we could add one $\$1$ coin and conclude that $n(20)=n(19)+1$, but this would be wrong: it is not necessarily optimal. Perhaps making $\$15$ and adding a $\$5$ coin does better. (Important: $n(j)$ is not necessarily increasing.)

However, if we not only know $n(19)$ (the optimum for the previous step), but also $n(j)$ for all $1\le j < 20$, then we can compute all the alternatives $n(j)+n(20-j)$ and pick the minimum (unless there is a coin with $v_k=20$). So one algorithm is:

$$ n(j)=\cases{1 & if $v_k=j$ for some $k$\\ \min_{i=1 \cdots j-1}\bigl(n(i)+n(j-i)\bigr) & otherwise} $$

This starts with $n(1)=1$. A similar recursion, with list concatenations instead of sums, allows us to build $d(j)$, the optimal lists of coins.
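Both recursions can be sketched together in Python (a sketch; the function name `change_lists` and the denominations in the example are assumptions for illustration), building $n(j)$ by sums and $d(j)$ by concatenations:

```python
def change_lists(C, denominations):
    """Compute n[j] (min coin count) and d[j] (one optimal coin list)
    for every 1 <= j <= C, following the recursion above."""
    coin_set = set(denominations)
    n = [0] * (C + 1)
    d = [[] for _ in range(C + 1)]
    for j in range(1, C + 1):
        if j in coin_set:
            n[j], d[j] = 1, [j]
            continue
        # Pick the split j = i + (j - i) with the fewest total coins.
        best = min(range(1, j), key=lambda i: n[i] + n[j - i])
        n[j] = n[best] + n[j - best]
        d[j] = d[best] + d[j - best]
    return n, d

n, d = change_lists(20, [1, 5, 10])
print(n[20], d[20])  # 2 [10, 10]
```

The concatenation `d[best] + d[j - best]` mirrors the sum `n[best] + n[j - best]`, so the coin list is recovered for free while filling the count table.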
