I've been trying to solve this problem:
Audrey takes 4 hours to complete a certain job. Ferris can do the same job in 3 hours. Audrey and Ferris decided to collaborate on the job, working at their respective rates. While Audrey worked continuously, Ferris took 3 breaks of equal length. If the two completed the job together in 2 hours, how many minutes long was each of Ferris' breaks?
My attempt was to let the duration of each of Ferris' breaks be b hours, so his 3 breaks add 3b hours in total. Since this is extra time Ferris takes, I reasoned that his rate drops from 1/3 job per hour to 1/(3 + 3b) jobs per hour.
Hence, their combined rate would be 1/4 + 1/(3 + 3b) jobs per hour. Using this, I arrive at b = 20 minutes, which is wrong. Could anyone please explain why this approach is wrong?
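Spelled out, the arithmetic behind my answer was: require the combined rate times 2 hours to equal 1 job, i.e.

\[
\left(\frac{1}{4} + \frac{1}{3+3b}\right)\cdot 2 = 1
\;\Rightarrow\; \frac{2}{3+3b} = \frac{1}{2}
\;\Rightarrow\; 3+3b = 4
\;\Rightarrow\; b = \tfrac{1}{3}\text{ hour} = 20\text{ minutes}.
\]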
That implication is wrong: taking breaks does not dilute Ferris' rate. Whenever he is actually working, he still produces 1/3 of the job per hour; the breaks only reduce how long he works, not how fast. You can solve this problem simply by comparing the time Ferris actually worked with the 2 hours available:
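For instance, keeping Ferris' rate fixed at 1/3 job per hour and tracking time directly: Audrey's 2 hours of work cover 2 · (1/4) = 1/2 of the job, so Ferris must supply the other 1/2.

\[
\text{Ferris' working time} = \frac{1/2}{1/3} = \frac{3}{2}\text{ hours}
\;\Rightarrow\;
\text{total break time} = 2 - \frac{3}{2} = \frac{1}{2}\text{ hour} = 30\text{ minutes},
\]

so each of the 3 equal breaks lasted 30/3 = 10 minutes.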