A computer generates random numbers on the interval (0, 1) according to a uniform distribution on this interval. If 100 such random numbers are generated, what's the probability that the mean of these 100 numbers is greater than 0.53? Must show work and reasoning; this cannot be done using software. I believe it's an application of the central limit theorem.
Solution
Note that here,

a = lower endpoint of the distribution = 0
b = upper endpoint of the distribution = 1

Thus, the mean, variance, and standard deviation of a single draw are

u = mean = (a + b)/2 = 0.5
s^2 = variance = (b - a)^2 / 12 = 0.083333333
s = standard deviation = sqrt(s^2) = 0.288675135
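The question asks for pencil-and-paper work, but as a side check these moment formulas can be confirmed by simulation (a minimal sketch; the sample size of 200,000 draws is an arbitrary choice, not from the problem):

```python
import random
import statistics

# Cross-check of the Uniform(0, 1) moments against the
# analytic values mu = (a + b)/2 and s^2 = (b - a)^2 / 12.
random.seed(0)
draws = [random.random() for _ in range(200_000)]

mu_hat = statistics.fmean(draws)     # should be close to 0.5
var_hat = statistics.pvariance(draws)  # should be close to 1/12 = 0.0833...
print(mu_hat, var_hat)
```

The simulated mean and variance land near 0.5 and 1/12, matching the formulas above.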
By the central limit theorem, the mean of n = 100 draws is approximately normal with mean u and standard deviation s/sqrt(n). We first get the z score for the critical value, z = (x - u) * sqrt(n) / s, where

x = critical value = 0.53
u = mean = 0.5
n = sample size = 100
s = standard deviation = 0.288675135

Thus,

z = (x - u) * sqrt(n) / s = 1.039230483
           
Thus, using a table or technology, the right-tailed area is

P(Z > 1.039230483) = 0.149348778 [ANSWER]
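The same computation can be reproduced numerically as a sanity check (a sketch using only the standard library; the right tail of the standard normal is obtained via the complementary error function, P(Z > z) = erfc(z / sqrt(2)) / 2):

```python
import math

a, b = 0.0, 1.0          # endpoints of the uniform distribution
n = 100                  # sample size
x = 0.53                 # critical value for the sample mean

mu = (a + b) / 2                      # 0.5
s = math.sqrt((b - a) ** 2 / 12)      # 0.288675...

# z score of the sample mean under the CLT normal approximation
z = (x - mu) * math.sqrt(n) / s       # 1.0392...

# Right-tail standard-normal probability
p = 0.5 * math.erfc(z / math.sqrt(2))
print(z, p)
```

This reproduces z ≈ 1.0392 and P ≈ 0.1493, agreeing with the table lookup above.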
   

