Consider a situation where there is a cost that is either incurred or not. It is incurred only if the value of some random input is less than a specified cut-off value. Why might a simulation of this situation give a very different average value of the cost incurred than a deterministic model that treats the random input as fixed at its mean? What does this have to do with the “flaw of averages”?
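To see why the two approaches can disagree, here is a minimal simulation sketch. All of the numbers are assumptions chosen for illustration (a normally distributed input with mean 100 and standard deviation 10, a cutoff of 90, and a cost of $10,000); none of them come from the problem statement.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical inputs: a cost is incurred only when the random
# input falls below the cutoff.
COST = 10_000
CUTOFF = 90
MEAN, SD = 100, 10   # assumed distribution of the random input

# Deterministic model: fix the input at its mean. Because the mean
# (100) is not below the cutoff (90), this model predicts zero cost.
deterministic_cost = COST if MEAN < CUTOFF else 0

# Simulation: draw many values of the input and average the cost
# over all draws. The cost is incurred on the fraction of draws
# that fall below the cutoff.
x = rng.normal(MEAN, SD, size=100_000)
simulated_avg_cost = COST * np.mean(x < CUTOFF)

print("Deterministic cost:", deterministic_cost)              # 0
print("Simulated average cost:", round(simulated_avg_cost))   # roughly 1,600
```

Under these assumed numbers the deterministic model reports no cost at all, while the simulation reports an expected cost of about $1,600, because roughly 16% of the draws fall below the cutoff. This gap between "the cost at the average input" and "the average cost over the input's distribution" is exactly what the "flaw of averages" refers to.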