Assume that all of your job applicants must take a test, and that the scores on this test are normally distributed. The "selection ratio" is the cutoff point you use in your hiring process. For example, a selection ratio of 20% means that you will accept applicants who rank in the top 20% of all applicants. If you choose a selection ratio of 20%, the average test score of those selected will be 1.40 standard deviations above the average of all applicants. Use simulation to verify this fact, proceeding as follows.

a. Show that if you want to accept only the top 20% of all applicants, you should accept applicants whose test scores are at least 0.842 standard deviation above average. (No simulation is required here. Just use the appropriate Excel normal function.)

b. Now generate 1000 test scores from a normal distribution with mean 0 and standard deviation 1. The average test score of those selected is the average of the scores that are at least 0.842. To determine this, use Excel's DAVERAGE function. Put the heading Score in cell A3, generate the 1000 test scores in the range A4:A1003, and name the range A3:A1003 Data. In cells C3 and C4, enter the labels Score and >=0.842. (The range C3:C4 is called the criterion range.) Then calculate the average score of all applicants who will be hired by entering the formula =DAVERAGE(Data, "Score", C3:C4) in any cell. This average should be close to the theoretical average, 1.40. The formula works as follows: Excel finds all observations in the Data range that satisfy the criterion described in the range C3:C4 (Score >= 0.842) and then averages the values in the Score column (the second argument of DAVERAGE) for those observations. Look in online help for more about Excel's database functions.

c. What information would you need to determine an "optimal" selection ratio? How could you determine an optimal selection ratio?
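For readers who want to check the numbers outside Excel, the short Python sketch below (not part of the original exercise; it assumes NumPy and SciPy are available) carries out the same verification: it computes the 80th-percentile cutoff used in part a, simulates 1000 standard-normal scores as in part b, and compares the average of the selected scores with the theoretical value of about 1.40.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Part a: cutoff for the top 20% of a standard normal distribution
cutoff = norm.ppf(0.80)          # about 0.842 (Excel: =NORM.S.INV(0.8))

# Part b: simulate 1000 standard-normal test scores and average the
# scores of applicants who clear the cutoff
scores = rng.standard_normal(1000)
selected = scores[scores >= cutoff]
print(f"cutoff            = {cutoff:.3f}")
print(f"fraction selected = {len(selected) / len(scores):.3f}")  # near 0.20
print(f"mean of selected  = {selected.mean():.3f}")              # near 1.40

# Theoretical mean of a standard normal truncated at the cutoff:
# phi(cutoff) / (1 - Phi(cutoff)) = phi(cutoff) / 0.20
print(f"theoretical mean  = {norm.pdf(cutoff) / 0.20:.3f}")      # about 1.40

The simulated mean of the selected scores will vary a little from run to run, but with 1000 scores it should land close to the 1.40 figure quoted in the problem.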