Simulate the readout noise associated with a pixel in an image detector assuming an offset of 100 electrons and a readout noise standard deviation of 6 electrons, by generating a Gaussian random variable with mean 100 and variance 36. How many data samples do you need in your simulation so that the estimate of the mean is within 10%, 5%, and 1% of the actual mean of 100 electrons? Similarly, how many data samples do you need in your simulation so that the estimate of the variance is within 10%, 5%, and 1% of the actual variance of 36 electrons squared?
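One way to approach this is the central limit theorem: the sample mean of n Gaussian draws has standard deviation σ/√n, and the sample variance has standard deviation approximately σ²√(2/(n−1)). The sketch below uses these results to compute the required sample sizes, assuming a 95% confidence criterion (the z = 1.96 confidence level is an assumption, since the problem only asks for "within p%"), and then runs a simulation with NumPy as a check:

```python
import numpy as np

rng = np.random.default_rng(0)   # seeded generator for reproducibility

mu, sigma = 100.0, 6.0           # offset (electrons) and readout noise std

# Required sample sizes under a 95%-confidence CLT criterion.
# NOTE: the 95% level (z = 1.96) is an assumption, not given in the problem.
z = 1.96
for p in (0.10, 0.05, 0.01):
    # Mean: sample mean has std sigma/sqrt(n); require z*sigma/sqrt(n) <= p*mu
    n_mean = int(np.ceil((z * sigma / (p * mu)) ** 2))
    # Variance: sample variance has std ~ sigma^2 * sqrt(2/(n-1));
    # require z*sigma^2*sqrt(2/(n-1)) <= p*sigma^2
    n_var = int(np.ceil(1 + 2 * (z / p) ** 2))
    print(f"tolerance {p:.0%}: n_mean = {n_mean}, n_var = {n_var}")

# Simulate the readout noise itself: Gaussian with mean 100, variance 36
samples = rng.normal(mu, sigma, size=n_var)
print("sample mean     =", samples.mean())
print("sample variance =", samples.var(ddof=1))
```

Note that the variance estimate converges much more slowly than the mean estimate: because Var(s²) = 2σ⁴/(n−1) and the tolerance scales with σ² rather than μ, the required n for the variance is independent of σ and grows like 2(z/p)².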