Let X̄ and S be the sample mean and sample standard deviation, respectively, of a random sample of size n from a normal distribution N(μ, σ²). The random variable

    T = (X̄ − μ) / (S / √n)

has a t-distribution with n − 1 degrees of freedom. Using the t-table, find the constant c so that, for a sample of size n = 25:

With c as defined above, and if x̄ and s are the observed values of X̄ and S respectively, how would you interpret the interval x̄ ± cs?
  
  Solution
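The statement above omits the probability level that defines c. Assuming the common choice P(−c < T < c) = 0.95 (an assumption, not stated in the source), c is the 97.5th percentile of the t-distribution with n − 1 = 24 degrees of freedom. A minimal pure-Python sketch that recovers this t-table value numerically, using Simpson's rule for the central probability and bisection for c:

```python
import math

def t_pdf(t, df):
    """Density of Student's t-distribution with df degrees of freedom."""
    coef = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return coef * (1 + t * t / df) ** (-(df + 1) / 2)

def central_prob(c, df, steps=2000):
    """P(-c < T < c): Simpson's rule on [0, c], doubled by symmetry."""
    h = c / steps
    total = t_pdf(0, df) + t_pdf(c, df)
    for i in range(1, steps):
        total += (4 if i % 2 else 2) * t_pdf(i * h, df)
    return 2 * total * h / 3

def t_critical(level, df):
    """Solve P(-c < T < c) = level for c by bisection."""
    lo, hi = 0.0, 50.0
    for _ in range(50):
        mid = (lo + hi) / 2
        if central_prob(mid, df) < level:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# n = 25 gives df = 24; 0.95 is the assumed (unstated) level.
c = t_critical(0.95, 24)
print(round(c, 3))  # matches the t-table entry t_0.025(24) = 2.064
```

Under that assumed level, the observed interval built from the pivot, x̄ ± c·s/√25, would be interpreted as a 95% confidence interval for μ: before sampling, the random interval X̄ ± cS/√25 covers μ with probability 0.95.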

