In a 9-bit, 1-bit-per-stage pipelined A/D converter with a full-scale input range of , what is the maximum offset that can be tolerated in the first-stage comparator while keeping the overall output error below 1/2 LSB? If the converter is changed to a 1.5-bit-per-stage architecture with digital error correction, how much offset can be tolerated in the first-stage comparators? You may assume that the second and all subsequent stages are ideal.
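
A sketch of the standard analysis follows. Because the numeric full-scale value is missing above, the full-scale input range is written here as $V_{FS}$, assumed to span $-V_{FS}/2$ to $+V_{FS}/2$ so that the per-stage reference is $V_{ref} = V_{FS}/2$; the exact numbers depend on how the range and reference are defined in the original problem.

% 1-bit-per-stage: there is no redundancy, so an offset in the
% first-stage comparator translates directly into conversion error and
% must itself stay below 1/2 LSB referred to the input.
\[
V_{\mathrm{LSB}} = \frac{V_{FS}}{2^{9}},
\qquad
\left|V_{os,\,\text{1-bit}}\right| < \tfrac{1}{2} V_{\mathrm{LSB}} = \frac{V_{FS}}{2^{10}} = \frac{V_{FS}}{1024}
\]

% 1.5-bit-per-stage with digital error correction: the redundant
% half-LSB sub-range lets the correction logic absorb comparator
% decision-level errors up to a quarter of the per-stage reference
% (with the 2nd and later stages ideal, as stated).
\[
\left|V_{os,\,\text{1.5-bit}}\right| < \frac{V_{ref}}{4} = \frac{V_{FS}}{8}
\]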