How to perform mathematical operations using mean and standard deviation.


It is given that a particular parameter, the base time $T_b$, follows a lognormal distribution with a mean of $10$ years and a standard deviation of $5$ years. How do I estimate the mean and standard deviation of another parameter, the target parameter $X$, which depends on the base time through the following equation:

$$X = K_1 + K_2(t - T_b)$$

where $X$ is the parameter to be estimated, $K_1$ and $K_2$ are constants, $t$ is the time in years, and $T_b$ is the base time.


Best answer

Since $E[T_b]=10$, linearity of expectation gives

$$E[X]=K_1+K_2(t-10).$$

The variance is $$\sigma_X^2=E[(X-E[X])^2]=E[\{K_1+K_2(t-T_b)-(K_1+K_2(t-10))\}^2]$$$$=K_2^2\,E[(10-T_b)^2]=K_2^2\operatorname{Var}(T_b)=25K_2^2.$$

So the standard deviation is

$$\sigma_X=5|K_2|.$$

Note that the absolute value matters if $K_2$ can be negative; the standard deviation is always nonnegative.
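The result above uses only the mean and variance of $T_b$, not the lognormal shape itself. A quick Monte Carlo sketch can confirm it numerically; the values of $K_1$, $K_2$, and $t$ below are arbitrary choices for illustration, and the log-scale parameters $\mu$ and $\sigma$ are recovered from the given mean $10$ and standard deviation $5$ via the standard lognormal moment formulas:

```python
import numpy as np

# Hypothetical constants, chosen only for illustration
K1, K2, t = 2.0, 0.5, 30.0

# Given: T_b is lognormal with mean 10 and sd 5 (in years).
# Convert to the log-scale parameters of the lognormal:
m, s = 10.0, 5.0
sigma2 = np.log(1 + (s / m) ** 2)   # log-scale variance
mu = np.log(m) - sigma2 / 2         # log-scale mean

rng = np.random.default_rng(0)
Tb = rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=1_000_000)

X = K1 + K2 * (t - Tb)

print(X.mean())  # close to K1 + K2*(t - 10) = 12.0
print(X.std())   # close to 5*|K2| = 2.5
```

The sample mean and standard deviation should match $K_1+K_2(t-10)$ and $5|K_2|$ up to Monte Carlo error, regardless of the distribution chosen for $T_b$, as long as its mean is $10$ and its standard deviation is $5$.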