Suppose that the mean GRE score for the USA is 500 and the standard deviation is 75. Use the Empirical Rule (also called the 68-95-99.7 Rule) to answer the following: What percentage of students is likely to get a score between 350 and 650? What percentage of students will get a score above 500? What percentage of students will get a score below 275? Is a score below 275 significantly different from the mean? Why or why not?
Solution
Let X be a random variable denoting the GRE score of a student.
X ~ Normal(500, 75²), i.e. mean μ = 500 and standard deviation σ = 75.
Since 350 = 500 − 2(75) and 650 = 500 + 2(75), these scores lie 2 standard deviations below and above the mean. By the Empirical Rule, about 95% of scores fall within 2 standard deviations of the mean; using exact normal probabilities:
Percentage of students likely to get a score between 350 and 650 = P(350 < X < 650) × 100 = (P(X < 650) − P(X < 350)) × 100
= (0.97725 − 0.02275) × 100 = 0.95450 × 100 = 95.45%
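As a quick numerical check, a minimal Python sketch (assuming SciPy is available; norm.cdf gives the normal cumulative probability):

from scipy.stats import norm

mean, sd = 500, 75
# P(350 < X < 650) = P(X < 650) - P(X < 350)
p_between = norm.cdf(650, loc=mean, scale=sd) - norm.cdf(350, loc=mean, scale=sd)
print(round(p_between * 100, 2))  # prints 95.45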
Percentage of students who will get a score above 500: since 500 is the mean and the normal distribution is symmetric about its mean, P(X > 500) = 0.5, so 0.5 × 100 = 50%.
Percentage of students who will get a score below 275: since 275 = 500 − 3(75) lies 3 standard deviations below the mean, P(X < 275) × 100 = 0.00135 × 100 ≈ 0.13%.
Yes, a score below 275 is significantly different from the mean: 275 lies 3 standard deviations below the mean, and the probability of getting a score that low is only about 0.00135, which is very small.
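The z-score and tail probability can be verified with the same assumed SciPy setup as above:

from scipy.stats import norm

mean, sd = 500, 75
z = (275 - mean) / sd                       # z = -3.0, i.e. 3 standard deviations below the mean
p_below = norm.cdf(275, loc=mean, scale=sd) # left-tail probability
print(z, round(p_below * 100, 2))           # prints -3.0 0.13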
