Some information about computers, numbers, and dates.

Good morning! Couldn't think of a tip for you but still wanted to write something to spread some knowledge. This one is about how computers handle numbers, and specifically how that affects dates. It might be a little waffle-y.

Do you remember the Y2K problem? To understand why this was an issue we need to know a little about computers back when they were in their infancy, the 50s and 60s, when memory was at a premium. In 1965 the PDP-7 computer[1] had a whopping 9KB of memory, which could be expanded to an eye-watering 144KB. As we got into the 70s memory improved, but not by much, which meant programs had to be written as economically as possible when it came to memory. One way of doing that was to store years as two digits, 00-99; after all, this wouldn't be an issue in the future because computers would improve and this code wouldn't still be running then, right? If you know anything about programming you know there is a lot of code running in production that has been there a long time.

The issue here is that once you pass 99, the year 2000 gets represented as 00, but does that mean 2000 or 1900? A human can figure it out from context but computers are quite stupid. People were aware of this issue well ahead of time, by the way: people born in the 1850s were still alive when computers were gaining steam in the 1950s, and pension systems ran into trouble when they dealt with a customer who had apparently only just been born. Computers weren't the only issue here either; people were putting lines in their code like if date > 85 then pay_pension. It just makes you wonder what code you have written that isn't future-proof. The reason Y2K wasn't an issue (or not a big one, I should say, stuff still went wrong) was that people knew well ahead of time it was coming and worked to fix it. Some people think Y2K was some sort of hoax, and those people are idiots.
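To make the two-digit trap concrete, here's a minimal C sketch of the kind of logic described above. The function name, the age-65 rule, and the numbers are all made up for illustration; real pension systems were of course far more involved.

```c
#include <stdio.h>

/* A hypothetical pension check built on two-digit years, the kind of
 * shortcut described above. Once the stored year wraps from 99 to 00,
 * the age calculation goes negative and a pensioner looks newborn. */
int eligible_for_pension(int birth_year, int current_year) {
    int age = current_year - birth_year;   /* both years are 00-99 */
    return age >= 65;
}

int main(void) {
    /* Born 1930, checked in 1999: 99 - 30 = 69, correctly eligible. */
    printf("%d\n", eligible_for_pension(30, 99));
    /* Same person checked in 2000: 00 - 30 = -30, suddenly not eligible. */
    printf("%d\n", eligible_for_pension(30, 0));
    return 0;
}
```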

This brings me onto the next topic I wanted to tell you about: ever heard of the Year 2038 problem? To understand this one you need to know how computers store and represent numbers in binary. Imagine we have a 4-bit computer, capable of handling only 4 bits: 0000. There are a couple of ways you can represent an integer in a computer. One is to have 0000 represent zero and count up from there: 0001 == 1, 0010 == 2, 0011 == 3, and so on, all the way up to 1111, which equals 15. Now, if you want to represent a negative number you can't do it with this technique. You could, however, sacrifice the highest bit (this one here -> 1000) and use it as a sign to tell the computer whether the number is positive or negative. These are called "signed" numbers.
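If it helps to see that counting written out, here's a small C sketch that prints every 4-bit pattern next to its unsigned value. (C has no 4-bit type, so it simply loops over 0 to 15 and prints the low four bits of each value.)

```c
#include <stdio.h>

int main(void) {
    /* Every value a 4-bit unsigned integer can hold: 0000 == 0 up to 1111 == 15. */
    for (int value = 0; value <= 15; value++) {
        for (int bit = 3; bit >= 0; bit--)
            putchar((value >> bit) & 1 ? '1' : '0');
        printf(" == %d\n", value);
    }
    return 0;
}
```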

If the highest bit is a 1 then it's a negative number. One issue with signed numbers is that you lose that bit, so the highest number you can represent is much smaller. If you can represent 0 up to 15 with a 4-bit unsigned integer, you can only represent -8 (1000) to 7 (0111) with a signed one. The negative numbers proceed like this: 1000 == -8, 1001 == -7, 1010 == -6, all the way up to 1111, which equals -1. It's done this way so that 0000 still represents zero, and so you can use two's complement[2] to get the negative of a number, which is a lot easier for a computer than one's complement[3]. If you're familiar with counting in binary, you'll know that when you get all the way up to 0111 the next number is 1000, which means the value jumps from 7 to -8. This is what's known as integer overflow.
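Here's the same table for the signed interpretation, again just a sketch: the helper names are mine, and "subtract 16 when the high bit is set" is simply the standard way of reading a two's complement pattern.

```c
#include <stdio.h>

/* Read a 4-bit pattern as a two's complement value: if the high bit (1000)
 * is set, the pattern means its unsigned value minus 16. */
static int as_signed4(unsigned bits) {
    return (bits & 0x8u) ? (int)bits - 16 : (int)bits;
}

static void print4(unsigned bits) {
    for (int bit = 3; bit >= 0; bit--)
        putchar((bits >> bit) & 1u ? '1' : '0');
}

int main(void) {
    /* 0000..0111 read as 0..7, 1000..1111 read as -8..-1. */
    for (unsigned bits = 0; bits <= 15; bits++) {
        print4(bits);
        printf(" == %d\n", as_signed4(bits));
    }

    /* The overflow: one more than 0111 (7) is 1000, which now reads as -8. */
    unsigned wrapped = (7u + 1u) & 0xFu;
    print4(wrapped);
    printf(" == %d\n", as_signed4(wrapped));
    return 0;
}
```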

The next thing I need to mention is that computer architecture is based around how many bits a word contains. A "word" in this context is a sequence of bits; for example, a word in an 8-bit computer is 8 bits long (01001101), and its registers, CPU, and so on work with words of that length. These days we typically see 64-bit architectures, and there are even 128-bit computers kicking around. But until 64-bit computers became the norm, 32-bit computers were all you could reasonably afford. The highest number you can represent in a signed 32-bit integer is 2,147,483,647, i.e. 2^31 - 1.
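If you want to sanity-check that limit yourself, a couple of lines of C will do it; one of the 32 bits goes on the sign, leaving 31 bits of value.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A signed 32-bit integer keeps one bit for the sign, leaving 31 for
     * the value, so its maximum is 2^31 - 1. */
    printf("%lld\n", (1LL << 31) - 1);        /* 2147483647 */
    printf("%lld\n", (long long)INT32_MAX);   /* the same limit, as defined in <stdint.h> */
    return 0;
}
```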

And to wrap it all up: these days computers use Unix time[4] to figure out the date, which is the number of seconds that have elapsed since the epoch, January 1st 1970. Given that 1,580,201,038 seconds have elapsed since the epoch as I write this, a computer can tell you it's Tuesday the 28th of January 2020. I guess you're smart enough to know where this is going. When we reach 03:14:07 on Tuesday, 19 January 2038, any 32-bit computer holding the seconds since the Unix epoch in a signed integer will overflow, and the computer will think it's December 1901.
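Here's a small sketch of the rollover, assuming a system with a 64-bit time_t so we can name the moments on either side of the boundary (gmtime may refuse negative timestamps on some platforms, hence the check).

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Print a Unix timestamp as a human-readable UTC date. */
static void print_utc(int64_t seconds) {
    time_t t = (time_t)seconds;
    struct tm *utc = gmtime(&t);   /* may be NULL for negative times on some platforms */
    char buf[64];
    if (utc && strftime(buf, sizeof buf, "%a %d %b %Y %H:%M:%S UTC", utc))
        printf("%lld -> %s\n", (long long)seconds, buf);
}

int main(void) {
    print_utc(1580201038LL);  /* the "today" from the text: Tue 28 Jan 2020 */
    print_utc(INT32_MAX);     /* 2147483647: 03:14:07 on Tue 19 Jan 2038, the last second a 32-bit counter can hold */
    print_utc(INT32_MIN);     /* one tick later the counter wraps to this: Fri 13 Dec 1901 */
    return 0;
}
```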

Hope you found that interesting enough :) happy Tuesday!

[1] https://en.wikipedia.org/wiki/PDP-7

[2] https://en.wikipedia.org/wiki/Two's_complement

[3] https://en.wikipedia.org/wiki/Ones%27_complement

[4] https://en.wikipedia.org/wiki/Unix_time
