
Why do we often only need the N(0,1) distribution when we construct confidence intervals for the population mean (based on X̄) when the variance is known?

Solution

Because, by the Central Limit Theorem, the distribution of ( X̄ - µ ) / ( σ / √n ) can be approximated by N(0, 1) as n approaches infinity, where X̄ is the sample mean, µ is the population mean, and σ² is the (known) population variance.
For example: computing Pr(X < 40) exactly, where X follows Binomial(200, 0.5), would require working out 40 individual probabilities and adding them all up. Using the Central Limit Theorem (the normal approximation) makes the work much easier, as the sketch below shows.
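A minimal sketch of this comparison in Python (assuming SciPy is available; the parameters n = 200 and p = 0.5 are taken from the example above):

from scipy.stats import binom, norm
import math

n, p = 200, 0.5
mu = n * p                           # binomial mean: 100
sigma = math.sqrt(n * p * (1 - p))   # binomial standard deviation: about 7.07

# Exact: sum the 40 individual probabilities Pr(X = 0), ..., Pr(X = 39).
exact = sum(binom.pmf(k, n, p) for k in range(40))

# CLT approximation: treat X as approximately N(mu, sigma^2),
# with a continuity correction of 0.5.
approx = norm.cdf((39.5 - mu) / sigma)

print(f"exact  = {exact:.3e}")
print(f"approx = {approx:.3e}")

One standard normal CDF evaluation replaces the sum of 40 binomial terms.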

The Central Limit Theorem states that for a sequence of iid (independent and identically distributed) random variables with finite mean µ and finite non-zero variance σ², the distribution of ( X̄ - µ ) / ( σ / √n ) approaches N(0, 1) as n approaches infinity, where X̄ is the sample mean of the first n variables. This is why, when the variance is known, the quantiles of N(0,1) are all we need to construct a confidence interval for µ.
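As an illustration, here is a minimal sketch of a known-variance confidence interval for µ built from the N(0,1) quantile. The data, sigma, and sample size below are illustrative assumptions, not values from the problem:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
sigma = 2.0                              # known population standard deviation (assumed)
sample = rng.normal(loc=5.0, scale=sigma, size=50)

n = sample.size
xbar = sample.mean()
z = norm.ppf(0.975)                      # N(0,1) quantile, about 1.96 for 95% coverage

half_width = z * sigma / np.sqrt(n)
print(f"95% CI for mu: ({xbar - half_width:.3f}, {xbar + half_width:.3f})")

Only the standard normal quantile appears; no other distribution is needed as long as σ is known (or n is large enough for the CLT approximation to hold).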
