A team of software engineers is testing the time taken for a particular type of modern computer to execute a complicated algorithm for factoring large numbers. They would like to estimate the mean time taken for a computer to execute the algorithm.
A random sample of 41 times is collected. The mean time in this sample is 898.6 seconds and the sample standard deviation is 83.9 seconds.
Calculate the 95% confidence interval for the mean time taken to execute the algorithm. You may find a Student's t distribution table useful. Give your answers to 2 decimal places.
Solution
The degrees of freedom: df = n - 1 = 41 - 1 = 40.
Given α = 0.05, the critical value is t(0.025, df = 40) = 2.02 (from the Student's t table).
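As a quick check, the critical value can also be computed instead of read from a table. A minimal sketch using SciPy (assuming it is installed); scipy.stats.t.ppf returns the percentile point of the t distribution:

```python
# Look up the t critical value for a 95% two-sided CI with 40 df.
from scipy.stats import t

alpha = 0.05  # 1 - confidence level
df = 40       # degrees of freedom, n - 1

# Two-tailed interval: alpha/2 in each tail, so take the 97.5th percentile.
t_crit = t.ppf(1 - alpha / 2, df)
print(round(t_crit, 3))  # ~2.021; the table rounds this to 2.02
```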
So the lower bound is
xbar - t * s / sqrt(n) = 898.6 - 2.02 * 83.9 / sqrt(41) = 872.13
So the upper bound is
xbar + t * s / sqrt(n) = 898.6 + 2.02 * 83.9 / sqrt(41) = 925.07
So the 95% confidence interval for the mean execution time is (872.13, 925.07) seconds.
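Putting the pieces together, here is a minimal sketch of the whole calculation in Python (again assuming SciPy is available). Using the unrounded critical value 2.021 shifts each bound by about 0.01 relative to the table-based answer above:

```python
import math
from scipy.stats import t

n = 41        # sample size
xbar = 898.6  # sample mean (seconds)
s = 83.9      # sample standard deviation (seconds)

t_crit = t.ppf(0.975, n - 1)        # ~2.021 for df = 40
margin = t_crit * s / math.sqrt(n)  # margin of error

lower, upper = xbar - margin, xbar + margin
# With the exact critical value this prints (872.12, 925.08);
# the table value 2.02 gives (872.13, 925.07) as above.
print(f"95% CI: ({lower:.2f}, {upper:.2f})")
```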
