A tech company develops a navigation app for smartphones that estimates the typical time it takes to travel from one location to another. The company collects location data from 100 smartphones to determine how long it takes to drive from Cleveland, Ohio, to Detroit, Michigan.
The company finds that the average driving time is 2.78 hours, with a standard deviation of 0.06 hours. The driving times appear to be normally distributed.
The company wants to provide a range of driving times that includes the driving times of 95% of users.
What would this range be?
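Below is a minimal sketch of the calculation, assuming the empirical rule (roughly 95% of a normal distribution lies within two standard deviations of the mean); the variable names and the use of scipy for the exact z = 1.96 interval are illustrative choices, not part of the original problem.

```python
from scipy.stats import norm

mean_hours = 2.78   # sample mean driving time (hours)
sd_hours = 0.06     # sample standard deviation (hours)

# Empirical rule: about 95% of normally distributed values fall
# within 2 standard deviations of the mean.
lower = mean_hours - 2 * sd_hours   # 2.78 - 0.12 = 2.66 hours
upper = mean_hours + 2 * sd_hours   # 2.78 + 0.12 = 2.90 hours
print(f"Empirical-rule range: {lower:.2f} to {upper:.2f} hours")

# The same interval computed with the exact central 95% of a normal
# distribution (z is approximately 1.96 rather than the rounded 2):
lo_exact, hi_exact = norm.interval(0.95, loc=mean_hours, scale=sd_hours)
print(f"Exact 95% range: {lo_exact:.3f} to {hi_exact:.3f} hours")
```

Under the empirical rule the range is about 2.66 to 2.90 hours; using z = 1.96 narrows it slightly to roughly 2.66 to 2.90 hours as well (2.662 to 2.898).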