According to a research study, American teenagers watched an average of 14.1 hours of television per week last year. A random sample of 11 American teenagers was surveyed, and the mean amount of time each teenager watched television per week was 13.3 hours, with a sample standard deviation of 1.5 hours. (Assume that the times are normally distributed.)
Researchers conduct a one-mean hypothesis test at the 1% significance level to test whether the mean amount of time American teenagers watch television per week is different from last year's mean.
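For reference, since the population standard deviation is unknown and the sample is small, this would be a one-sample t-test with n = 11 (so 10 degrees of freedom). A sketch of the test statistic under the stated values:

\[
t = \frac{\bar{x} - \mu_0}{s/\sqrt{n}} = \frac{13.3 - 14.1}{1.5/\sqrt{11}} \approx -1.77
\]

With 10 degrees of freedom at the 1% significance level (two-tailed), the critical values are approximately ±3.169, so this statistic would not fall in the rejection region.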
(a) What are the correct null and alternative hypotheses for this test?
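Because the researchers are testing whether this year's mean is different from last year's mean of 14.1 hours (in either direction), the hypotheses take the two-tailed form:

\[
H_0: \mu = 14.1 \qquad H_a: \mu \neq 14.1
\]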