The famous Y2K bug occurred because some old programs used two decimal digits to store years. This became a problem in the year 2000, because such programs had no way to tell whether “00” meant 1900 or 2000. A similar problem will occur for Java programs when the number of milliseconds since the beginning of 1970 exceeds the capacity of a long. In what year will this occur, given that the maximum value of a long is 9,223,372,036,854,775,807? What if getTime() returned an int, whose maximum value is 2,147,483,647? What about those UNIX/C systems that use an int to store the number of seconds since the beginning of 1970?
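As a quick sanity check on the arithmetic: one year is roughly 1000 × 60 × 60 × 24 × 365.25 ≈ 3.16 × 10^10 milliseconds, so a long holds about 9.22 × 10^18 ÷ 3.16 × 10^10 ≈ 2.9 × 10^8 years of milliseconds past 1970, while an int holds only about 24.8 days of milliseconds, or about 68 years of seconds. The following minimal sketch (the class name OverflowDates is invented for illustration) lets java.util.Date do the conversions:

```java
import java.util.Date;

public class OverflowDates {
    public static void main(String[] args) {
        // A long counts milliseconds since January 1, 1970.
        // Passing Long.MAX_VALUE shows the last representable instant.
        System.out.println(new Date(Long.MAX_VALUE));        // a date in the year 292278994

        // If getTime() returned an int, the millisecond count would
        // overflow after only about 24.8 days.
        System.out.println(new Date(Integer.MAX_VALUE));     // late January 1970

        // UNIX/C systems that store *seconds* in a 32-bit int overflow
        // after about 68 years; multiply by 1000L to convert to milliseconds.
        System.out.println(new Date(Integer.MAX_VALUE * 1000L)); // January 19, 2038
    }
}
```

The last case is the well-known “Year 2038” problem for systems with 32-bit UNIX timestamps.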