I'm having trouble understanding the following question.
A simple model of how a virus spreads looks like this: every person who becomes infected has a 1-week incubation period before the virus breaks out. During that week the infected person meets a number of people, and each meeting carries a 10% chance of passing on the infection. After that week, the person stays home ill and is no longer contagious.
Let $t = 0$ denote the time of the outbreak, i.e. the moment the first person becomes contagious with a new type of flu.
Set up a recursion that describes the expected number of contagious people in week $t$.
Then use the recursion to find how many people are expected to have been infected up to week 10, given that a "normal" person meets 10 people a week.
My questions:
- Does the 10% chance mean the number of infected people increases by 10% each week, i.e. $a_{t} = a_{t-1} + 0.1\,a_{t-1} = 1.1\,a_{t-1}$, with $a_{0} = 1$?
- The first person is no longer contagious after 1 week. How do I account for that in the recursion?
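To make my question concrete, here is a short Python sketch of the recursion I *think* the problem wants. The names `p` and `m` are mine, and the step "each contagious person infects $p \cdot m$ new people on expectation" is my own assumption; that may be exactly where I'm going wrong:

```python
p = 0.10   # chance of infection per meeting (given in the problem)
m = 10     # meetings per week for a "normal" person
a = 1.0    # a_0: one contagious person at t = 0

# Guess at the recursion: only last week's newly infected people are
# contagious this week (the older ones are home ill), and each of them
# infects p*m new people on expectation, so a_t = p * m * a_{t-1}.
total = a
for t in range(1, 11):
    a = p * m * a      # expected newly contagious people in week t
    total += a

print(total)           # expected number infected up to week 10
```

With $p \cdot m = 1$ this makes each week's new infections equal to the previous week's, so the total just grows by 1 per week, which is why I suspect I'm misreading the "10% chance" part.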