Most health applications use exponential decay to model mortality, which assumes the chance of dying is the same at every age (a constant hazard). Is there a simple way to slightly improve on this assumption by centering the death rate on a known half-life (defined as the time by which half of the population has died)?
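For context, the standard constant-hazard facts I'm working from: with hazard $\lambda$, survival is $S(t) = e^{-\lambda t}$, so the half-life is $t_{1/2} = \ln 2 / \lambda$ and the mean lifetime is $1/\lambda$.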
For example, if the expected age at death is known, then an agent younger than that age (age < expected age) should have a lower chance of dying, and an agent older than it (age > expected age) should have a higher chance. However, the new distribution should be constrained so that the expected lifetime does not change.
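To illustrate the kind of thing I mean (a sketch of one possible approach, not a claimed answer): a Weibull lifetime with shape $k > 1$ has a hazard that rises with age, and its scale can be solved for so that the mean lifetime matches the exponential mean $1/\lambda$. The rate and shape values below are placeholders.

```python
import numpy as np
from math import gamma

# Target: same mean lifetime as an exponential with rate lam.
lam = 0.05           # exponential death rate (per year), placeholder value
mean_life = 1 / lam  # expected lifetime to preserve

# Weibull(shape=k, scale=s) has mean s * Gamma(1 + 1/k).
# k > 1 makes the hazard increase with age; k = 1 recovers the
# exponential. Solve for the scale that keeps the mean fixed.
k = 2.0                           # placeholder shape; tune to taste
s = mean_life / gamma(1 + 1 / k)  # mean-preserving scale

def hazard(age, k=k, s=s):
    """Weibull hazard h(t) = (k/s) * (t/s)**(k-1): low for young
    agents, high for old agents, unlike the flat exponential hazard."""
    return (k / s) * (age / s) ** (k - 1)

# Sanity check: the simulated mean lifetime should be close to 1/lam.
rng = np.random.default_rng(0)
samples = s * rng.weibull(k, size=100_000)
print(f"target mean {mean_life:.2f}, simulated mean {samples.mean():.2f}")
print(f"hazard at half the mean age:  {hazard(mean_life / 2):.4f}")
print(f"hazard at twice the mean age: {hazard(2 * mean_life):.4f}")
```

(If matching the half-life rather than the mean were the goal, the Weibull median is $s (\ln 2)^{1/k}$, so the scale would be solved from that instead.)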
I could use an age-structured model or build a linear age-adjusted death distribution, but I'm sure something like this already exists; I just don't know the right terms to search for.