Probability integral transformation and the distribution of P = P[|T| <= |t|]


The task is to find the distribution of P, where P = P[|T| <= |t|], T is a continuous random variable with PDF f(t), and t denotes an observed value of T (so P is itself a random variable).

Now, I tried to make the following two arguments:

1. P = P[|T| <= |t|] (the range of P is (0,1))

= P[U <= |t|] (where U = |T|, a continuous random variable with range [0, ∞))

= F(|t|) (F(·) being the CDF of U)

So, by the probability integral transformation, F(|t|) follows a Uniform(0,1) distribution. (In the probability integral transformation the distribution of F(X) doesn't depend on the particular value x, so we need not worry about the '|t|' here: t is itself a draw of T, hence |t| is a draw of U, and F(U) ~ Uniform(0,1).)
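As a sanity check on this first argument, the claim P ~ Uniform(0,1) can be simulated. Here I take T to be standard normal (my assumption; f was left general in the question), for which P[|T| <= |t|] = 2Φ(|t|) − 1 = erf(|t|/√2):

```python
import math
import random

random.seed(0)

# Assumption: T ~ N(0, 1), so F_{|T|}(x) = 2*Phi(x) - 1 = erf(x / sqrt(2)).
samples = [math.erf(abs(random.gauss(0, 1)) / math.sqrt(2))
           for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((p - mean) ** 2 for p in samples) / len(samples)

# Theoretical Uniform(0,1) values: mean 1/2, variance 1/12 ≈ 0.0833.
print(mean, var)
```

The simulated mean and variance land close to 1/2 and 1/12, consistent with P being Uniform(0,1).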

2. P = P[|T| <= |t|] (again, the range of P is (0,1))

= P[-|t| <= T <= |t|]

= P[-t <= T <= t]

= G(t) - G(-t) = X - Y (say), where G(·) is the CDF of T

But X and Y (again by the probability integral transformation) each follow a Uniform(0,1) distribution, and then the range of X − Y is (−1, 1), whereas P can't be negative. Moreover, X − Y doesn't follow a uniform distribution (in fact, no distribution supported on (0,1)): for independent X and Y we can calculate that the PDF of V = X − Y is

f(v) = 1 − |v| for −1 < v < 1, and 0 otherwise.
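For independent X, Y ~ Uniform(0,1) this triangular density is easy to verify by simulation; for instance, ∫ from −1/2 to 1/2 of (1 − |v|) dv = 3/4, so P(|V| <= 1/2) should come out near 0.75:

```python
import random

random.seed(0)
n = 200_000

# V = X - Y for INDEPENDENT X, Y ~ Uniform(0,1); triangular density 1 - |v| on (-1, 1).
inside = sum(abs(random.random() - random.random()) <= 0.5 for _ in range(n))

# Theoretical value: P(|V| <= 1/2) = 0.75 under the triangular density.
print(inside / n)
```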

Conclusion: I think that in the second argument what I wrote, "(X − Y) has a range of (−1, 1)", was wrong, because here X and Y are, though identically distributed, NOT INDEPENDENT, and that's the source of the fallacy. Am I correct?
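The suspected dependence can be checked directly. Taking T standard normal (an assumption; any distribution symmetric about 0 behaves the same way), symmetry gives G(−t) = 1 − G(t), so Y = 1 − X exactly, a deterministic function of X:

```python
import math
import random

random.seed(0)

def G(x):
    # Standard normal CDF (assumption: T ~ N(0, 1)).
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

ts = [random.gauss(0, 1) for _ in range(50_000)]
xs = [G(t) for t in ts]          # X = G(t)
ys = [G(-t) for t in ts]         # Y = G(-t)

# Under symmetry, G(-t) = 1 - G(t), so X + Y = 1 identically: perfect dependence.
max_gap = max(abs(x + y - 1) for x, y in zip(xs, ys))
print(max_gap)  # numerically zero
```

So X and Y are perfectly (negatively) dependent here, and the independence-based triangular density for X − Y does not apply.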

Have I made any other logical mistake?