Estimating expected loss given a non-typical probability density


Let X be the time for which a printer functions. The printer costs 200 USD.

The density function of X is

f(x) = kx for 0 < x < 5 (k is some constant) or 0 otherwise.

The manufacturer repays the whole price if the printer stops functioning during the first year and half the price if it breaks during the second. What sum is the manufacturer expected to repay if they have sold 100 printers?

I apologize for posting an exam question, but this was given during an exam and I don't think the lecture material covers it.


If $f(x)$ is the density of the time the printer lasts (with $x$ measured in years), the expected amount the manufacturer repays per printer is $$200 \int_0^1 f(x)\,dx + \frac{200}{2} \int_1^2 f(x)\,dx.$$ The first integral is the probability the printer fails within the first year; the second is the probability it fails between one and two years.

Multiplying this by 100 gives the total expected repayment for 100 printers sold:

$$ 20000 \int_0^1 f(x) dx + 10000 \int_1^2 f(x) dx$$

Note that you can find $k$ from the normalization condition $\int_{-\infty}^\infty f(x)\,dx = 1$, which here reduces to $\int_0^5 kx\,dx = \frac{25k}{2} = 1$, so $k = \frac{2}{25}$. With that, the integrals above can be evaluated to finish the problem.
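As a sanity check, the steps above can be carried out numerically. The sketch below (exact arithmetic via `fractions`, not part of the original answer) normalizes the density, computes the two failure probabilities from the CDF, and evaluates the expected repayment:

```python
from fractions import Fraction

# Density f(x) = k*x on (0, 5); normalization gives
# integral of k*x from 0 to 5 = 25k/2 = 1, hence k = 2/25.
k = Fraction(2, 25)

def cdf(x):
    """P(X <= x) = integral of k*t dt from 0 to x = k*x^2/2, for 0 <= x <= 5."""
    return k * x * x / 2

p_year1 = cdf(1) - cdf(0)   # probability of failure in the first year
p_year2 = cdf(2) - cdf(1)   # probability of failure in the second year

# Full refund (200 USD) in year 1, half refund (100 USD) in year 2.
expected_per_printer = 200 * p_year1 + 100 * p_year2
expected_total = 100 * expected_per_printer  # 100 printers sold

print(expected_per_printer)  # 20
print(expected_total)        # 2000
```

So the expected repayment works out to 20 USD per printer, or 2000 USD across the 100 printers.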