I'm reviewing algorithms, and I've come across this problem. At first, it seemed like an interval scheduling problem to me, but now I think it is a dynamic programming problem. I'm not sure how to solve it. I'd appreciate it if someone could show me a solution.
Suppose you are an employee. Your boss has sent you to attend a conference. The conference lasts $n$ minutes. In the course of this conference, there are $m$ talks you can attend. Your boss has ordered that you must attend talks for the entirety of the conference, i.e. all $n$ minutes.
The talks have two quantified properties: a time interval, a tuple (start-minute, end-minute), and a stressfulness, a fixed cost (different for each talk) incurred whenever you attend that talk. You seek to minimize the total stressfulness of the conference. Moreover, you are free to leave or enter a talk whenever you please. Find an efficient algorithm to pick the talks to attend.
The last part is critically important: it means that you can choose overlapping intervals to satisfy the problem.
In case it is not fully clear, here is an example of what the input data might look like. Let $L$ be the conference length, $T$ the time interval of each talk, and $S$ the stressfulness of each talk.
$L = 600$
Talk 1: $T=(0,400), S=15$
Talk 2: $T=(0,50), S=2$
Talk 3: $T=(50,300),S=8$
Talk 4: $T=(250,450),S=10$
Talk 5: $T=(400,500),S=4$
Talk 6: $T=(450,600),S=5$
You can choose Talk 1, Talk 5, and Talk 6, which cover all $600$ minutes at a total cost of $S=15+4+5=24$. The next-best alternative would be Talk 2, Talk 3, Talk 4, and Talk 6 at a total cost of $S=25$. Consequently, you choose to attend Talk 1, Talk 5, and Talk 6.
First, sort the talks by ending time, and denote by $L(x,y)$ the minimum stressfulness that can be achieved considering only the first $x$ talks if the conference lasted $y$ minutes. That is, we pretend that the conference lasts $y$ minutes, that only the first $x$ talks exist, and solve a "smaller" version of the main problem. We set $L(x,y)$ to infinity whenever the first $x$ talks cannot cover all $y$ minutes (in particular, when every one of them ends before minute $y$).
The algorithm roughly goes as follows:
At first, we set all the entries in the matrix $L$ to infinity, except those in the zeroth column, which are zero (the minimum stressfulness is $0$ if the conference lasts for $0$ minutes).
We process the sorted talks one by one, updating the matrix $L$ row by row. When at row/talk $x$, we update the entries $L(x,y)$ for $y$ up to $y=talk(x).end$. Those beyond $y=talk(x).end$ remain set to infinity because, the talks being sorted by ending time, no talk among the first $x$ ones ends after minute $talk(x).end$.
When determining the value of $L(x,y)$, we have two options: either to attend the $x$-th talk or not, whichever is better. If we decide not to attend the $x$-th talk, then $L(x,y)=L(x-1,y)$. Otherwise, we search through all $L(x-1,z)$ with $z$ ranging from $talk(x).start$ to $y$ and find $M$, the minimum value of $L(x-1,z)+talk(x).stress$; in fact, since $L(x-1,z)$ is non-decreasing in $z$ (covering a longer conference cannot be cheaper), this minimum is simply $L(x-1,talk(x).start)+talk(x).stress$. What we are doing is "concatenating" the $x$-th talk to a group of earlier talks that already covers the conference up to at least minute $talk(x).start$. Finally, we set $L(x,y)=\min\{L(x-1,y),M\}$.
At the end, the value we are looking for is $L(m, n)$, where $m$ is the number of talks.
I hope this helps!
Update: I tried to implement the algorithm in C++. You can find it here and test it by compiling it here.