@thanatos
Last active August 3, 2023 10:24
Falsehoods Programmers Believe About Time

Intro & Ground-rules

See Falsehoods programmers believe about time.

I'm going to answer these with either true or false, plus an explanation. Of course, they're all supposed to be falsehoods; I'll answer a few with true, but for most I'll merely provide the reason why it is a falsehood.

Note that we will allow leap seconds. It'll get noted when this is the case.

Last, all of these are to the best of my knowledge. I'll try to explain my thinking, but if you think I'm wrong, or know something I don't, comment as such. Preferably, back it up with citations!

The Falsehoods

  1. There are always 24 hours in a day.

Maybe, but probably false. It depends on what we call an hour. If we allow some hours to not be exactly 60 * 60 seconds long (such as an "hour" containing a leap second, which we could still consider an hour), then yes: in UTC / the Gregorian calendar, every day has 24 hours.

Local times don't work here, because spring-forward and fall-back times make the days 23 and 25 hours long.

And if you're on another planet, then your definition of the day might be even weirder.

  1. Months have either 30 or 31 days.

False. February.

  1. Years have 365 days.
  2. February is always 28 days long.

False, to both. Leap years have 366 days, and give February a 29th day.
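Python's standard `calendar` module makes both counterexamples easy to see:

```python
import calendar

# February's length depends on the leap-year rules:
print(calendar.monthrange(2015, 2)[1])  # 28
print(calendar.monthrange(2016, 2)[1])  # 29

# And so does the length of the whole year:
print(sum(calendar.monthrange(2016, m)[1] for m in range(1, 13)))  # 366
```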

  1. Any 24-hour period will always begin and end in the same day (or week, or month).

False, but I'm not sure I understand the assumption here. Clearly, a 24-hour period straddling a day boundary cannot start and end in the same day, and equally simple counterexamples break the rest too.

  1. A week always begins and ends in the same month.

False.

  1. A week (or a month) always begins and ends in the same year.

False for the week, but true for the month — though only within our assumption. See the link for where this fails outside the assumption.

  1. The machine that a program runs on will always be in the GMT time zone.

False. My machine isn't.

  1. Ok, that’s not true. But at least the time zone in which a program has to run will never change.

False, for many programs that need to deal with input in a multitude of timezones. Such as Google Calendar.

If we still mean the machine the program is running on, the user can always pop open their preferences and change their timezone. Something is running at that point, even if it is the clock in the corner of your screen.

  1. Well, surely there will never be a change to the time zone in which a program has to run in production.

False.

  1. The system clock will always be set to the correct local time.

False. I run a lot of machines in UTC, that aren't anywhere near Greenwich.

But let's assume you meant that the clock is just set correctly. Clock drift is a thing, and I've seen drifts well over a minute.

  1. The system clock will always be set to a time that is not wildly different from the correct local time.

Define "wildly different".

  1. If the system clock is incorrect, it will at least always be off by a consistent number of seconds.

False. The very nature of clock "drift" is that it is a slow and gradual effect. The longer you let an imprecise clock run without adjusting it, the more it will drift off the actual value.

  1. The server clock and the client clock will always be set to the same time.

False. What is the "same time": to what degree of accuracy?

  1. The server clock and the client clock will always be set to around the same time.

False.

  1. Ok, but the time on the server clock and time on the client clock would never be different by a matter of decades.

False. Inside most desktop computers, there is a small watch battery. Among other things, it keeps the clock alive, and ticking. These batteries do die. When they do, the clock can stop ticking when the machine is off. It might also just reset to something like 1970.

  1. If the server clock and the client clock are not in synch, they will at least always be out of synch by a consistent number of seconds.

False, mostly a combination of #13 and #11.

  1. The server clock and the client clock will use the same time zone.

False. There are people all over the country, let alone the world, on the Internet.

  1. The system clock will never be set to a time that is in the distant past or the far future.

False, see #16 for "distant past", and see a local troll for the distant future.

  1. Time has no beginning and no end.

False. The big bang, and follow the link for the end.

  1. One minute on the system clock has exactly the same duration as one minute on any other clock

False. Not all clock drifts are the same. Atomic clocks are remarkably accurate, which is why we build them. My Macbook, without the crutch of an NTP server, is not so accurate.

  1. Ok, but the duration of one minute on the system clock will be pretty close to the duration of one minute on most other clocks.

False. Again with the "pretty close". An NTP clock adjustment to that imprecise Macbook will prevent this from being true.

  1. Fine, but the duration of one minute on the system clock would never be more than an hour.

I honestly can't see when this would be false. Maybe if we take that machine with the depleted CMOS battery and keep rebooting it so that it never leaves 1970.

But by now, I hope you see that there's probably a better solution to your problem.

Or my VCR, which is still flashing 12:00.

  1. You can’t be serious.

Kids these days. Don't know what a VCR is. Or how to program one.

  1. The smallest unit of time is one second.

False. Milliseconds, for a start.

  1. Ok, one millisecond.

False. Nanoseconds. (And don't stop there.)

  1. It will never be necessary to set the system time to any value other than the correct local time.

False. A lot of servers run UTC, because it is more useful than some very arbitrary local time.

  1. Ok, testing might require setting the system time to a value other than the correct local time but it will never be necessary to do so in production.

False. That last example was production…

29. Time stamps will always be specified in a commonly-understood format like 1339972628 or 133997262837.

False. I personally dislike these. Do you know what times those represent? No, you don't. An ISO-8601 YYYY-MM-DDTHH:MM:SS is so much more readable.
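A quick Python sketch of the point — the same instant, as an opaque epoch count and as ISO-8601:

```python
from datetime import datetime, timezone

ts = 1339972628  # the Unix timestamp quoted in the falsehood above
print(datetime.fromtimestamp(ts, tz=timezone.utc).isoformat())
# 2012-06-17T22:37:08+00:00 — rather more readable
```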

  1. Time stamps will always be specified in the same format.

False. We just mentioned two above.

  1. Time stamps will always have the same level of precision.

False. Some things store it in seconds. Some in nanoseconds. Some in microseconds, some in milliseconds…

  1. A time stamp of sufficient precision can safely be considered unique.

False. Two machines can get really lucky and produce identical stamps.

  1. A timestamp represents the time that an event actually occurred.

False. Generating a timestamp takes time. Your clock can also be wrong.

  1. Human-readable dates can be specified in universally understood formats such as 05/07/11.

False, while I figure out if that's the fifth of July or the seventh of May. Or the seventh of November. Or the 11th of July.

Intro & Ground-rules

See More falsehoods programmers believe about time; “wisdom of the crowd” edition.

I'm going to answer these with either true or false, plus an explanation. Of course, they're all supposed to be falsehoods; I'll answer a few with true, but for most I'll merely provide the reason why it is a falsehood.

I'm going to assume the programmer is allowed to use a Proleptic Gregorian calendar. Things get insane if you can't: different areas adopted the Gregorian calendar at different times, so determining the local date depends on where you were (of course, it does with timezones too, but this just adds more fuel to the fire). Also, very, very early in time (around the BC to AD transition), historians aren't really sure when the leap years were, I think I read that this is mostly due to them being inserted or not inserted for political reasons. (You're shocked, I know.)

Note however, you need to validate that this assumption holds in your programming environment. As apaprocki on HN points out, it doesn't in Java:

[…] but keep in mind non-Proleptic can still be found. Java splits on 15 Oct 1582 by default:

https://docs.oracle.com/javase/7/docs/api/java/util/Gregoria...

If you follow that, you'll get:

public void setGregorianChange(Date date)

Sets the GregorianCalendar change date. This is the point when the switch from Julian dates to Gregorian dates occurred. Default is October 15, 1582 (Gregorian). Previous to this, dates will be in the Julian calendar.

To obtain a pure Julian calendar, set the change date to Date(Long.MAX_VALUE). To obtain a pure Gregorian calendar, set the change date to Date(Long.MIN_VALUE).

Parameters:

date - the given Gregorian cutover date.

Note, however, that we will allow leap seconds. It'll get noted when this is the case.

I'll try to note where I depend on this assumption. All of these are to the best of my knowledge. I'll try to explain my thinking, but if you think I'm wrong, or know something I don't, comment as such. Preferably, back it up with citations!

The Falsehoods

  1. The offsets between two time zones will remain constant.

False: Timezone rules (and thus offsets) change all the time. IANA maintains the timezone database, and as of this writing in April of 2015, the current version is 2015b, meaning there have been two changes this year already. And lest you think this is only small countries changing, the United States revised their DST dates in 2005.

  1. OK, historical oddities aside, the offsets between two time zones won’t change in the future.

False: We proved this in the first one's argument.

  1. Changes in the offsets between time zones will occur with plenty of advance notice.

Vague, but let's say False. First, what's advance notice? We can quibble over the definition. Russia made a fairly quick change recently, I think: the law passed in August, the change was made in October, and Microsoft had a patch in September, so judge for yourself. Where governments are involved, I'm going with "plenty of advance notice" is probably not a guarantee.

  1. Daylight saving time happens at the same time every year.

False. We've established that the rules change in #1.

  1. Daylight saving time happens at the same time in every time zone.

False. "same time" is debatable here: The entire US (okay, okay, the bits that observe DST! And you know what bits I mean!) starts DST at "2 am". But that's 2am local, and thus, across the four major continental timezones, that's four different 2 AMs. Of course, other countries don't follow suit; one of the reasons we have the database mentioned in #1 is that nobody can agree on the rules.

  1. Daylight saving time always adjusts by an hour.

False, it turns out. Being on the list, I doubted my instinct of "true" — and indeed the author knew something I didn't: Lord Howe Island, Australia, adjusts by only 30 minutes for its DST.

  1. Months have either 28, 29, 30, or 31 days.

True, but only within our assumption of a Proleptic Gregorian calendar. When the Gregorian calendar was adopted in Britain,

it was necessary to correct by 11 days. Wednesday, 2 September 1752, was followed by Thursday, 14 September 1752.

So, that particular month was obviously well short of the normal length. Of course, that's just Britain. As mentioned, others adopted at different times.

  1. The day of the month always advances contiguously from N to either N+1 or 1, with no discontinuities.

True, but only within our assumption. The discussion of #7 includes an example of a discontinuity.

  1. There is only one calendar system in use at one time.

Vague. True, under our assumption, but false if we humor the author. Our assumption defines the calendar system in use — we're clearly cheating our way through this question. As mentioned in #7, the Gregorian calendar system was adopted at different times in different places (and some places had riots around the adoption). Religions can also involve other calendar systems.

  1. There is a leap year every year divisible by 4.

False. The link is in the original text. See the Gregorian calendar here:

The Gregorian reform modified the Julian calendar's scheme of leap years as follows:

Every year that is exactly divisible by four is a leap year, except for years that are exactly divisible by 100, but these centurial years are leap years if they are exactly divisible by 400. For example, the years 1700, 1800, and 1900 are not leap years, but the year 2000 is.

Our computer revolution falls at a bad (or good?) time here: the "divisible by 4" rule "works" from 1901 through 2099. The year 2000 would have been an exception, but the exception-to-the-exception saved us! 1900, however, was not a leap year, and I'd bet all sorts of software will break in 2100.
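Python's `calendar.isleap` implements the full Gregorian rule, so it gets the centurial years right:

```python
import calendar

print(calendar.isleap(1900))  # False: divisible by 100 but not by 400
print(calendar.isleap(2000))  # True: the exception-to-the-exception
print(calendar.isleap(2100))  # False: the next time naive code breaks
```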

  1. Non leap years will never contain a leap day.

True, but only within our assumption. For a counterexample in very early years, see the link (from the intro) about when the leap years actually were — though I don't feel this counts for much.

  1. It will be easy to calculate the duration of x number of hours and minutes from a particular point in time.

False. Within our assumption, leap seconds ruin this. Leap seconds are derived from measurements of the earth's movement, and this movement is not predictable. They're announced beforehand, but not by a whole lot. Thus, offsets into the future can change when leap seconds are allowed.

If this question means something like "what is the datetime x + the offset y?", even just the rules of the Gregorian calendar don't make this easy. Leap years, months with different lengths, yikes! Oftentimes, the offset itself is problematic. What is "today + 1 month"? Is a month a constant number of days? Do you just increment the number of the month, modulo 12? (but then, what is Jan 31 + 1 month? Feb 31?)
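A small Python illustration of why "today + 1 month" is ill-defined:

```python
from datetime import date, timedelta

d = date(2015, 1, 31)

# Naively bumping the month number would produce the invalid date Feb 31:
try:
    d.replace(month=2)
except ValueError as e:
    print(e)  # day is out of range for month

# And treating "a month" as a fixed number of days drifts through the calendar:
print(d + timedelta(days=30))  # 2015-03-02
```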

  1. The same month has the same number of days in it everywhere!

True, but only within the assumption. Again, the Gregorian reform.

  1. Unix time is completely ignorant about anything except seconds.

Unix time,

Unix time (also known as POSIX time or Epoch time) is a system for describing instants in time, defined as the number of seconds that have elapsed since 00:00:00 Coordinated Universal Time (UTC), Thursday, 1 January 1970, not counting leap seconds.

I'm going mostly with true. Note the next few lines,

it is neither a linear representation of time nor a true representation of UTC

I usually think of Unix time as a duration: it's "the number of seconds" (a duration) since a fixed point — which thus gives you a time, with the huge caveats in the note above.

  1. Unix time is the number of seconds since Jan 1st 1970.

True. See #14. I will note: since midnight Jan 1st 1970 UTC.

  1. The day before Saturday is always Friday.

True. Definitely within the assumption, I think, but even outside, I'm not sure when this wouldn't be, honestly. Fill me in?

  1. Contiguous timezones are no more than an hour apart. (aka we don’t need to test what happens to the avionics when you fly over the International Date Line)

False. Well, the IDL represents one example there. There's also a 3-hour jump at the border between China and Pakistan, mostly due to China using a single timezone.

  1. Two timezones that differ will differ by an integer number of half hours.

False. Some timezones are offset from UTC by 15 minute intervals. See, for example, [the westernmost third of Australia](http://en.wikipedia.org/wiki/Time_zone#/media/File:World_Time_Zones_Map.png).

  1. Okay, quarter hours.

False, at least historically. See for example UTC−00:25:21. I think at the time of this writing, quarter hours holds for present-day timezones. So if you have no historical data… and the laws never change…
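With Python 3.9+'s `zoneinfo` you can inspect some present-day non-whole-hour offsets directly (the zone names below are real IANA identifiers):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

t = datetime(2015, 4, 1, 12, 0, tzinfo=ZoneInfo("UTC"))
for name in ("Asia/Kathmandu", "Australia/Eucla", "Asia/Kolkata"):
    print(name, t.astimezone(ZoneInfo(name)).utcoffset())
# Asia/Kathmandu  5:45:00
# Australia/Eucla 8:45:00
# Asia/Kolkata    5:30:00
```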

  1. Okay, seconds, but it will be a consistent difference if we ignore DST.

Vague. I'm not sure what "consistent difference" means here.

  1. If you create two date objects right beside each other, they’ll represent the same time. (a fantastic Heisenbug generator)

False, of course, and for the reasons stated. Assume a clock with second granularity (ha!): creating the object must consume some CPU, so if you create two in a row, at some point you're going to get lucky, and the second one will land on (at least) the next second. Most clocks have much better resolution, of course, and they're probably only going to get better.

  1. You can wait for the clock to reach exactly HH:MM:SS by sampling once a second.

False — and what's "exactly" here? A second is a whole second long. (Well, usually.) System clocks can very easily skip right over your desired time: clock adjustments from NTP servers; the machine was asleep, hibernated, or off; your process just didn't get CPU time; even leap seconds can cause you to miss your desired timestamp. Just try for the best, and work with what you get.
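A minimal Python sketch of the safer approach, assuming a hypothetical helper named `wait_until` — compare with `>=`, never `==`:

```python
import time

def wait_until(target_epoch, poll=0.25):
    """Wait until time.time() reaches (or passes) target_epoch.

    Compare with >=, never ==: the clock can jump right over the target
    (NTP steps, sleep/hibernate, scheduling delays, leap seconds).
    """
    while True:
        now = time.time()
        if now >= target_epoch:
            return now
        # Never sleep past the target, and re-check at least every `poll`
        # seconds in case the clock was adjusted underneath us.
        time.sleep(min(poll, target_epoch - now))

# Usage: block until (roughly) one second from now.
wait_until(time.time() + 1.0)
```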

  1. If a process runs for n seconds and then terminates, approximately n seconds will have elapsed on the system clock at the time of termination.

False, see #22, as most of the reasons apply here. Did I mention you can get no CPU time?

  1. Weeks start on Monday.

False. According to ISO-8601, the week starts on Monday. Of course, in my country, the week starts on Sunday. The world seems to agree that it's either Saturday, Sunday or Monday. See Week.

  1. Days begin in the morning.

False. If you're on a pole, the sun might never set or rise in certain parts of the year, so what is "morning"?

  1. Holidays span an integer number of whole days.

I don't actually know this one, but I'm sure someone has an example, and if not, I'm sure someone will invent one.

  1. The weekend consists of Saturday and Sunday.

I don't actually know this one either. In my culture this is true, but it wouldn't shock me to learn it isn't everywhere.

  1. It’s possible to establish a total ordering on timestamps that is useful outside your system.

True, but only within our assumption — and even then, it is tenuous. And really, I think it depends on your timestamps and what you mean by total ordering. UTC timestamps should be orderable, but they're only as accurate as the machine that stamped the time, and I have seen clock drifts of well over a minute. That might matter to you. If you have Unix timestamps, this isn't true. Remember that weird note?

it is neither a linear representation of time

Unix timestamps will repeat during a leap second, so two timestamps within those leap seconds can't be reliably ordered, because we don't know which of two actual instants in time the timestamp refers to. As another example, consider having a local time of 2:15am and 2:20am during a DST fallback from 3am to 2am. If I don't tell you which or both of 2:15 and 2:20am were before or after the fallback, you cannot order them. You can defeat the DST fallback problem by recording which side of the fallback you're on (and some libraries do this).
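Python's `datetime` grew exactly this "which side of the fallback" marker: the `fold` attribute (PEP 495). With `zoneinfo` (Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
# 1:30 AM local on 2015-11-01 happened twice; `fold` records which
# repetition you mean, disambiguating the two real instants.
first = datetime(2015, 11, 1, 1, 30, tzinfo=tz)           # fold=0: still EDT
second = datetime(2015, 11, 1, 1, 30, fold=1, tzinfo=tz)  # fold=1: now EST
print(first.tzname(), second.tzname())  # EDT EST
```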

  1. The local time offset (from UTC) will not change during office hours.

Unanswerable. I'm not even going to guess what your office hours are. (Which probably means it is a falsehood, if you believe it…)

  1. Thread.sleep(1000) sleeps for 1000 milliseconds.

False: this runs into the same issues as #22. Also, how precise is your clock? Is it perfectly precise? Because if not, we can quibble about what 1000 milliseconds is.

  1. Thread.sleep(1000) sleeps for >= 1000 milliseconds.

False. Clocks are imprecise, and if your clock is fast, then this won't hold. (It occurs I could perhaps throw relativistic effects into the discussion here, but let's not.)

  1. There are 60 seconds in every minute.

False. A leap second can make a minute 61 seconds long (or, in theory, 59). Clock adjustments can mess with you further.

  1. Timestamps always advance monotonically.

False: have I mentioned clock adjustments?

  1. GMT and UTC are the same timezone.

True, according to the IANA timezone database! There are some historical differences, of course, and the names differ.

  1. Britain uses GMT.

False. Britain observes British Summer Time, its name for DST, during which it is offset from GMT.

  1. Time always goes forwards.

False. Well, true in the real world, so far as we know. Clock adjustments, once again, can mess with you. POSIX timestamps during a leap second insertion will go backwards.

  1. The difference between the current time and one week from the current time is always 7 * 86400 seconds.

False. In UTC, leap seconds, again. In a local time, DST spring-forward and fall-back will ruin this as well. Rule adjustments in a timezone will also mess around with this.

  1. The difference between two timestamps is an accurate measure of the time that elapsed between them.

False. How accurate are your clocks? Are you using Unix timestamps, which as we discussed above, do funny stuff during leap seconds?

  1. 24:12:34 is a invalid time

Eh. Some people seem to think this is valid. This gets used to attach a time to a particular day that isn't really part of a particular day. ISO-8601 allows 24:00, but only that, I believe — i.e., the example in the article isn't valid within ISO-8601.

  1. Every integer is a theoretical possible year

False. The universe began at some point, so there exist negative integers for which this isn't true (or at least, we have no idea whether it is true). The universe might end someday. Also, some calendars have no year 0 at all.

  1. If you display a datetime, the displayed time has the same second part as the stored time

I don't know this one. This depends on so many things, such as how you stored the time, and how you're displaying the time.

  1. Or the same year

False: Look to ISO week numbers for an example. For example, 2009-W01-1 corresponds to 2008-12-29. Those are both ISO-8601 dates, and they represent the same thing, but have different year components.

Also, if your internal stored format is UTC (which is common), and you display that timestamp as a local time, timezone adjustments can of course throw it into the next or previous year.
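Python shows the ISO-week-date mismatch directly (`%G` and `%V` are the platform `strftime`'s ISO year and week, so support can vary):

```python
from datetime import date

d = date(2008, 12, 29)
print(tuple(d.isocalendar()))   # (2009, 1, 1): ISO year 2009, week 1, Monday
print(d.strftime("%G-W%V-%u"))  # 2009-W01-1
print(d.strftime("%Y-%m-%d"))   # 2008-12-29 — a different year component
```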

  1. But at least the numerical difference between the displayed and stored year will be less than 2

Well, the difference between 1 BC and 1 AD (adjacent years) is exactly two in integral form, and the question stipulates "less than 2". I don't really know here.

  1. If you have a date in a correct YYYY-MM-DD format, the year consists of four characters

False, once you hit the year 10000 AD.

  1. If you merge two dates, by taking the month from the first and the day/year from the second, you get a valid date

False. Take the month from 1 Feb and the day/year from 31 Jan: you get 31 Feb.

  1. But it will work, if both years are leap years

False, same example. (Just append a leap year to both dates: you still end up with 31 Feb.)

  1. If you take a w3c published algorithm for adding durations to dates, it will work in all cases.

I don't know here. I don't know what the “w3c published algorithm” is, and I'm going to leave it as an exercise to you, reader, to look it up. It's late here.

  1. The standard library supports negative years and years above 10000.

False. I usually work in Python, and according to the standard library's documentation:

datetime.MINYEAR

The smallest year number allowed in a date or datetime object. MINYEAR is 1.

datetime.MAXYEAR

The largest year number allowed in a date or datetime object. MAXYEAR is 9999.
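And trying to step outside that range fails loudly:

```python
import datetime

print(datetime.MINYEAR, datetime.MAXYEAR)  # 1 9999
try:
    datetime.date(10000, 1, 1)
except ValueError:
    print("year 10000 rejected")
```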

  1. Time zones always differ by a whole hour

False. We showed in #18 that quarter-hour offsets from UTC exist in timezones, so it should be obvious that if hour-aligned and non-hour-aligned timezones exist, that this cannot be true.

  1. If you convert a timestamp with millisecond precision to a date time with second precision, you can safely ignore the millisecond fractions

I'm presuming the assumption here is that they'll be 0? I'm not entirely sure what's going on. Within our assumption, I think this should hold true.

  1. But you can ignore the millisecond fraction, if it is less than 0.5

I'm now lost as to the exact operation we're performing on this poor timestamp.

  1. Two-digit years should be somewhere in the range 1900-2099

False. And Y2.1k was thus born…
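Python's `time.strptime` follows the POSIX convention, which pivots two-digit years at 69 — a concrete (and different!) choice of window from the one in the falsehood:

```python
import time

# POSIX maps two-digit years 69-99 to 1969-1999, and 00-68 to 2000-2068.
print(time.strptime("69", "%y").tm_year)  # 1969
print(time.strptime("68", "%y").tm_year)  # 2068
```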

  1. If you parse a date time, you can read the numbers character for character, without needing to backtrack

This really needs to be put within the context of what language we're attempting to parse, otherwise, it is unanswerable.

Even in ISO-8601, stumbling upon a W might be exciting, in that you're now parsing a week-date, and the year you just parsed might not actually be the year. Or if you hit a third digit in what you thought might be a month component, and thus you now know you're parsing an ordinal date.

54. But if you print a date time, you can write the numbers character for character, without needing to backtrack

Maybe (so I guess false). If you have a date stored as a UTC ISO timestamp in YYYY-MM-DD and you want it displayed the same… then yes, just output it!

  1. You will never have to parse a format like ---12Z or P12Y34M56DT78H90M12.345S

I really hope not.

  1. There are only 24 time zones

False. See the IANA timezone database. There's a ton. Way more than 24.

  1. Time zones are always whole hours away from UTC

False, see #18.

  1. Daylight Saving Time (DST) starts/ends on the same date everywhere

False. The DST start/end date is set by governments, and they'll never agree on such a thing. Even without that, you can reason through this: DST starts in the spring, with the "spring forward" event. Spring is in the first half of the year in the northern hemisphere, but the latter half for the southern hemisphere.

  1. DST is always an advancement by 1 hour

False, it turns out: Lord Howe Island, Australia, advances by only 30 minutes for its DST. (I was hoping this one was true.)

  1. Reading the client’s clock and comparing to UTC is a good way to determine their timezone

False. You can try this, but an offset at a particular time can map to multiple timezones. In fact, the mapping between offsets and what timezones are in that offset changes during the year as people go in and out of DST and laws change.
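A Python sketch of why a single observed offset can't identify the zone — London matches UTC in January but not in July:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

london = ZoneInfo("Europe/London")
for month in (1, 7):
    t = datetime(2015, month, 1, 12, tzinfo=ZoneInfo("UTC"))
    print(month, t.astimezone(london).utcoffset())
# January: 0:00:00 (same as UTC); July: 1:00:00 (BST) —
# so an observed offset of zero doesn't mean the client is "in UTC".
```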

  1. The software stack will/won’t try to automatically adjust for timezone/DST

False. Windows will adjust automatically in most cases. Linux, running with a TZ of UTC won't.

  1. My software is only used internally/locally, so I don’t have to worry about timezones

False. It's only a matter of time.

  1. My software stack will handle it without me needing to do anything special

False. The C standard library, for example, is pretty oriented towards some concept of the local time of the system, which isn't useful if you're writing a web server handling requests from users in a multitude of timezones.

  1. I can easily maintain a timezone list myself

False. See question #1: there were already two updates this year alone.

  1. All measurements of time on a given clock will occur within the same frame of reference.

False? Are we bringing relativity into this? Dammit Jim, I'm a software engineer, not a physicist.

  1. The fact that a date-based function works now means it will work on any date.

False, but mostly on the basis that we really need to qualify the "date-based function" under consideration here to understand the implications of saying anything about it.

  1. Years have 365 or 366 days.

True, but only under our assumption. The lost days during the adoption of the Gregorian calendar apply yet again. Also, if you define day as 60 * 60 * 24 seconds, then leap seconds ruin this as well.

  1. Each calendar date is followed by the next in sequence, without skipping.

True, but only under our assumption; again, the adoption of the Gregorian calendar.

  1. A given date and/or time unambiguously identifies a unique moment.

True-ish, but only under our assumption. Again, we get back to how Unix timestamps get all weird during leap seconds. Also again, if your definition of "date and/or time" refers to a local time in a timezone with DST and you don't specify whether you're in DST or not, the fallback will cause issues.

Outside our assumption, the adoption of the Gregorian calendar (are you tired of hearing that yet?) was not uniform (different places did so at different times), and thus, you need to know what calendaring system was effectively in use.

  1. Leap years occur every 4 years.

False, on so many levels. See the discussion about the rules for leap years in #10. 1896 was a leap year. 1900 wasn't. 1904 was, thus putting 8 years between leap years.

Further, we also have the problem of really early leap years, as discussed in #11 and the intro.

But we also have Sweden. Yes, Sweden. During the adoption of the… oh, you know the drill. Anyways, Sweden is very special. I'm just going to quote the whole thing, because it is so special.

Sweden's relationship with the Gregorian calendar was a difficult one. Sweden started to make the change from the Julian calendar and towards the Gregorian calendar in 1700, but it was decided to make the (then 11-day) adjustment gradually by excluding the leap days (29 February) from each of 11 successive leap years, 1700 to 1740. Meanwhile, the Swedish calendar would be out of step with both the Julian calendar and the Gregorian calendar for 40 years; also, the difference would not be constant but would change every four years. This system had potential for confusion when working out the dates of Swedish events in this 40-year period. To add to the confusion, the system was poorly administered, and the leap days that should have been excluded from 1704 to 1708 were not excluded. The Swedish calendar (according to the transition plan) should have been 8 days behind the Gregorian but was 10 days behind. King Charles XII recognised that the gradual change to the new system was not working, and he abandoned it. Rather than proceeding directly to the Gregorian calendar, it was decided to revert to the Julian calendar. This was achieved by introducing the unique date 30 February in 1712, adjusting the discrepancy in the calendars from 10 back to 11 days. Sweden finally adopted the Gregorian calendar in 1753, when Wednesday, 17 February, was followed by Thursday, 1 March. Since Finland was under Swedish rule at that time, it did the same.[7] Finland's annexation to the Russian Empire did not revert this, since autonomy was granted, but government documents in Finland were dated in both the Julian and Gregorian styles. This practice ended when independence was gained in 1917.

Mind you, counting inclusively from 1700 through 1740 I get 11 Julian leap years (1700, 1704, …, 1740), so the arithmetic at least adds up — the administration, evidently, did not.

  1. You can determine the time zone from the state/province.

False. Texas, for example, straddles the Central and Mountain timezones in America.

  1. You can determine the time zone from the city/town.

False. Let's go find some town on the border.

  1. Time passes at the same speed on top of a mountain and at the bottom of a valley.

Now I know you're trying to bring relativity into this.

  1. One hour is as long as the next in all time systems.

False, leap seconds.

  1. You can calculate when leap seconds will be added.

False, unfortunately they're measured. Or, if you can, you should share this exciting discovery.

  1. The precision of the data type returned by a getCurrentTime() function is the same as the precision of that function.

False, and it's basically as it says on the tin. The data structure will have its own precision: maybe it can store the time out to the nanosecond. The function will access some clock, and that clock's accuracy probably doesn't match the data structure's. Different computers might include clocks from different manufacturers, which have different precisions. I'd bet even hardware all made by the same manufacturer can have different precision.

And if your function returns the time out to the nanosecond (not uncommon), it probably took more than a nanosecond to set up the function call in whatever calling convention you have, make the actual call, make a system call, read the hardware, encode the result…
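A quick way to see the gap, at least in Python (a sketch using the stdlib `time` module): `time.time_ns()` returns an integer of nanoseconds, but `time.get_clock_info()` reports the underlying clock's actual resolution, which is usually far coarser.

```python
import time

# The return type can represent nanoseconds...
now_ns = time.time_ns()
print(now_ns)

# ...but the underlying clock's resolution is whatever the OS provides,
# which is often far coarser than a nanosecond (e.g. ~16 ms on some
# Windows configurations).
info = time.get_clock_info("time")
print(info.resolution)  # in seconds
```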

  1. Two subsequent calls to a getCurrentTime() function will return distinct results.

False. Maybe your clock is only precise to the millisecond, but the CPU is capable of performing this function 20 times per millisecond.

Also, clock adjustments.

  1. The second of two subsequent calls to a getCurrentTime() function will return a larger result.

If 77 is false (and it is), then this must by definition be false as well. You might try to correct it to "equal or larger", but clock adjustments still render that false.
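This is why most platforms also expose a monotonic clock, which is guaranteed never to go backwards (though it says nothing about wall-clock time, and two reads may be equal). A small Python sketch:

```python
import time

# The wall clock can be adjusted backwards (NTP corrections, manual changes),
# so two successive reads of time.time() are NOT guaranteed to be ordered.
w1 = time.time()
w2 = time.time()

# A monotonic clock never decreases, though it may stand still between reads.
m1 = time.monotonic()
m2 = time.monotonic()
assert m2 >= m1  # guaranteed by the platform; m2 > m1 is not
```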

  1. The software will never run on a space ship that is orbiting a black hole.

I really hope not.

@koalaman

Holidays span an integer number of whole days

In Norway, Christmas Eve after 3pm counts as a holiday, and employers with normal business hours will often treat it as a half day. It's entirely normal for the holidays to be 2.5 days long.

@laurent-leconte

  1. The weekend consists of Saturday and Sunday.

Depends on the country (specifically, Muslim countries and Israel). From Wikipedia:

In some Christian traditions, Sunday is the "day of rest and worship". Jewish Shabbat or Biblical Sabbath lasts from sunset on Friday to the fall of full darkness on Saturday; as a result, the weekend in Israel is observed on Friday–Saturday. Some Muslim-majority countries historically had a Thursday–Friday or Friday–Saturday weekend; however, recently many such countries have shifted from Thursday–Friday to Friday–Saturday, or to Saturday–Sunday. The French Revolutionary Calendar had ten-day weeks (called décades) and allowed décadi, one out of the ten days, as a leisure day.

@jeffreykemp

It’s possible to establish a total ordering on timestamps that is useful outside your system.

True, but only within our assumptions, and even then it is tenuous. And really, I think it depends on your timestamps and what you mean by total ordering.

I think this is more about the programmer assuming, when reading (say) records from a database, that they can reliably compare and sort the records based on the recorded timestamps. If the data was sourced from different servers, it's very likely that there will be some variability in the timestamps recorded, and that therefore the chances of a misordering occurring from time to time (no pun intended) are quite high.

@AllWorlds

Ok, but the time on the server clock and time on the client clock would never be different by a matter of decades.

SSL assumes this; certificate validity checks are usually how I find out that a computer has the date grossly wrong: someone complains they can't access a website, and that turns out to be why.

A time stamp of sufficient precision can safely be considered unique.

Theoretically true, but the needed precision is unobtainable. Consider timestamps generated at random moments within the same second: even at 0.1 nanosecond resolution, which is 1 clock cycle at 10 GHz, there are only 10^10 distinct values, and by the birthday paradox a collision becomes more likely than not after only about 120,000 timestamps. We have a few billion computers in the world, and it's hard to see how a timestamp could meaningfully be more precise than a CPU clock cycle.
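For what it's worth, the birthday paradox makes collisions likely far sooner than intuition suggests. A back-of-the-envelope sketch, using the 0.1 ns / 10 GHz figures from above:

```python
import math

resolution = 0.1e-9            # 0.1 ns: one clock cycle at 10 GHz
slots = round(1 / resolution)  # distinct timestamps within one second: 10^10

# Birthday bound: ~sqrt(2 * ln 2 * N) random samples give a >50% chance
# that at least two of them collide.
samples_for_likely_collision = math.sqrt(2 * math.log(2) * slots)
print(round(samples_for_likely_collision))  # ~118000 timestamps
```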

Days begin in the morning.

Astronomers take a convention of starting the day at noon, so that the observations through a single night are on the same 'day'.

Holidays span an integer number of whole days.

Besides the public holiday examples mentioned, think of schools and workplaces recording breaks/leaves to half-day accuracy.

Some new ones

DST involves one forward move and one backwards move per year? False, as DaveCTurner mentioned, Morocco does two.

Time zones range from UTC-12 to UTC+12? False, some places use offsets outside that range in order to match their day with neighbouring countries (e.g. Samoa).

If it's the same local time in two places, it's the same day in those two places? False because of the above. Monday 1pm in American Samoa is Tuesday 1pm in Tonga.
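This is easy to check with Python's `zoneinfo` (a sketch; it assumes a tzdata database is available): American Samoa sits at UTC-11 and Tonga at UTC+13, a full 24 hours apart, so the same instant has the same local time on different days.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # needs system tzdata (or the tzdata package)

instant = datetime(2023, 1, 2, 0, 0, tzinfo=timezone.utc)
pago = instant.astimezone(ZoneInfo("Pacific/Pago_Pago"))   # UTC-11
tonga = instant.astimezone(ZoneInfo("Pacific/Tongatapu"))  # UTC+13

print(pago)   # 2023-01-01 13:00:00-11:00 — Sunday
print(tonga)  # 2023-01-02 13:00:00+13:00 — Monday: same local time, next day
```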

An event will take place at the same UTC time in all locations? False for solar eclipses, where the path of totality moves across the Earth.

The Gregorian date of a public holiday is known in advance? False for Islamic holidays, since the Islamic calendar relies on visual sightings of the new Moon.

Finally, you want a real Break Everything scenario? Samoa's switch from UTC-11 to UTC+13 was bad enough, but suppose a country makes the opposite switch. That will mean that a full 24-hour period duplicates itself in the country's local time. An 8-day week, possibly in a 32-day month in a 367-day year.

And finally finally,

I'm going to assume the programmer is allowed to use a Proleptic Gregorian calendar.

If you're working on a videogame with a space setting, you might be dealing with something totally different.

@LorenPechtel

Another example of holidays that don't respect midnight: The Jewish day starts at sunset, so Jewish holidays are sunset to sunset.

http://www.jewfaq.org/holiday0.htm#Begin

@anarchivist

Daylight saving time always adjusts by an hour.

True? Being on the list, I doubt my answer here, but I don't know of a case. Since I'm sure the author isn't bluffing, I'm going with he knows something I don't.

Historically, DST in Singapore also adjusted by as little as 20 minutes. See Why is Singapore in the "Wrong" Time Zone?

@hakanai

hakanai commented May 1, 2018

Fine, but the duration of one minute on the system clock would never be more than an hour.

I honestly can't see when this would be false.

If local time was meant, then the answer is obviously daylight saving time. You start half a minute before the clock moves forward an hour, and one minute of system time later, you're 61 minutes past where you were, in local time.
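Concretely, with Python's `zoneinfo` (a sketch assuming tzdata is available; the 2023 US spring-forward is just a convenient example):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # needs system tzdata (or the tzdata package)

eastern = ZoneInfo("America/New_York")
# One minute of real (UTC) time straddling the 2023-03-12 spring-forward:
t0 = datetime(2023, 3, 12, 6, 59, 30, tzinfo=timezone.utc).astimezone(eastern)
t1 = datetime(2023, 3, 12, 7, 0, 30, tzinfo=timezone.utc).astimezone(eastern)

print(t0.time(), "->", t1.time())  # 01:59:30 -> 03:00:30
wall = t1.replace(tzinfo=None) - t0.replace(tzinfo=None)
print(wall)  # 1:01:00 — 61 minutes of wall-clock time in 1 minute of real time
```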

Daylight saving time always adjusts by an hour.

True? Being on the list, I doubt my answer here, but I don't know of a case. Since I'm sure the author isn't bluffing, I'm going with he knows something I don't.

Lord Howe Standard Time = UTC+10:30
Lord Howe Daylight Time = UTC+11:00

https://www.timeanddate.com/time/zone/australia/lord-howe-island

Unix time is the number of seconds since Jan 1st 1970.

True. See #14. I will note: since midnight Jan 1st 1970 UTC.

Since you know about leap seconds, it's mysterious that you seem to forget them for this one: UNIX time is the number of seconds since midnight Jan 1st 1970 UTC minus the number of leap seconds that have occurred since then (a count that could, in principle, go negative if negative leap seconds are ever used).
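Because leap seconds are excluded, Unix time is pure calendar arithmetic: every day is exactly 86400 seconds. A quick Python check:

```python
from datetime import datetime, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
t = datetime(2000, 1, 1, tzinfo=timezone.utc)

days = (t - epoch).days    # 10957 calendar days
print(days * 86400)        # 946684800
print(int(t.timestamp()))  # 946684800 — the leap seconds in between never appear
```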

@MikeRosoft

MikeRosoft commented Apr 18, 2019

The software will never run on a space ship that is orbiting a black hole.

Okay, that's a troll. But there's a well-known example where relativistic effects need to be accounted for (neglecting it would introduce a significant and cumulative error): GPS.

@autinerd

autinerd commented Jul 19, 2019

24:12:34 is an invalid time

Eh. Some people seem to think this is valid. This gets used to attach a time to a particular day that isn't really part of a particular day. ISO-8601 allows 24:00, but only that, I believe — i.e., the example in the article isn't valid within ISO-8601.

Maybe in the standard it's invalid. But internal train timetables often run to 3:00 or 4:00 the next day, and a train starting at 22:00 on day 1 and ending at 3:00 on day 2 may have this implemented as 27:00 on day 1.
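Normalizing such timetable times back to calendar days is simple arithmetic. A sketch, where the helper's name and return shape are my own invention:

```python
def normalize_timetable_time(hhmm: str) -> tuple[int, str]:
    """Turn a timetable time like '27:00' into (days_to_add, 'HH:MM')."""
    hours, minutes = (int(part) for part in hhmm.split(":"))
    return hours // 24, f"{hours % 24:02d}:{minutes:02d}"

print(normalize_timetable_time("27:00"))  # (1, '03:00') — 3:00 the next day
print(normalize_timetable_time("22:15"))  # (0, '22:15')
```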

@jcaesar

jcaesar commented Nov 3, 2019

Any 24-hour period will always begin and end in the same day (or week, or month).

False, but I'm not sure I understand the assumption here. Clearly, a 24 hour period straddling a day cannot start and end in the same day, and such simple counter examples break the rest too.

While the answer is only wrong if you leave out the "straddling a day" part… daylight saving time says hi: on a fall-back day, which is 25 hours long, a 24-hour period starting at midnight begins and ends on the same local day. (And I just happened to find this list today…)

@jlturriff

jlturriff commented Feb 3, 2023

Re #76, various models of IBM mainframe systems from the original S/360 to today's z/Series architecture (and perhaps other IBM architectures) have clock counters that run at various speeds, and the OS samples them at different bit lengths to obtain the requested value.
From the Enterprise Series Architecture/390 Principles of Operation (SA22-7201-08, DZ9AR008) manual:

Time-of-Day Clock

The time-of-day (TOD) clock provides a high-resolution measure of real time suitable for the indication of date and time of day. The cycle of the clock is approximately 143 years. In a configuration with more than one CPU, each CPU may have a separate TOD clock, or more than one CPU may share a clock, depending on the model. In all cases, each CPU has access to a single clock.

Format

The basic TOD clock is a 64-bit register. It is extended with an additional 40 rightmost bits if the extended-TOD-clock facility is installed. For ease of description, the TOD clock is treated as a 104-bit register of which the rightmost 40 bits are visible only if the extended-TOD-clock facility is installed. The TOD clock is a binary counter with the format shown in the following illustration.

       1 microsecond──┐
                      v
┌────────────────────┬─┬────┬─────────────────┐
│                    │ │    │                 │
│                    │ │    │                 │
└────────────────────┴─┴────┴─────────────────┘
                     51     64              103
                            └─Visible if the──┘
                             Extended-TOD-Clock
                           Facility Is Installed

The TOD clock nominally is incremented by adding a one in bit position 51 every microsecond. In models having a higher or lower resolution, a different bit position is incremented at such a frequency that the rate of advancing the clock is the same as if a one were added in bit position 51 every microsecond. The resolution of the TOD clock is such that the incrementing rate is comparable to the instruction-execution rate of the model.
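In other words, regardless of the model's actual resolution, bit 51 is always the microsecond column, so software can convert a basic 64-bit TOD value uniformly. A sketch (bit 0 being the leftmost bit, per the manual's numbering):

```python
# Bit 51 of the 64-bit register (bit 0 = leftmost) has weight 2**(63-51) = 4096,
# so one microsecond adds 4096 to the value; the low 12 bits are sub-microsecond.
TOD_UNITS_PER_MICROSECOND = 1 << 12

def tod_to_microseconds(tod: int) -> int:
    """Convert a basic 64-bit TOD clock value to whole microseconds."""
    return tod >> 12

one_second_as_tod = 1_000_000 * TOD_UNITS_PER_MICROSECOND
print(tod_to_microseconds(one_second_as_tod))  # 1000000

# The full 64-bit register therefore wraps after 2**52 microseconds:
years = (1 << 52) / 1_000_000 / 86_400 / 365.25
print(round(years, 1))  # ~142.7 — the manual's "approximately 143 years" cycle
```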

@n0099

n0099 commented Jun 15, 2023

https://gist.github.com/thanatos/eee17100476a336a711e?permalink_comment_id=2164874#gistcomment-2164874

Days begin in the morning.

Astronomers take a convention of starting the day at noon, so that the observations through a single night are on the same 'day'.

https://gist.github.com/thanatos/eee17100476a336a711e?permalink_comment_id=2974813#gistcomment-2974813

24:12:34 is an invalid time

Eh. Some people seem to think this is valid. This gets used to attach a time to a particular day that isn't really part of a particular day. ISO-8601 allows 24:00, but only that, I believe — i.e., the example in the article isn't valid within ISO-8601.

Maybe in the standard it's invalid. But internal train timetables often run to 3:00 or 4:00 the next day, and a train starting at 22:00 on day 1 and ending at 3:00 on day 2 may have this implemented as 27:00 on day 1.

https://stackoverflow.com/questions/5208607/parsing-times-above-24-hours-in-c-sharp
https://en.wikipedia.org/wiki/Date_and_time_notation_in_Japan#Time
https://ja.wikipedia.org/wiki/30%E6%99%82%E9%96%93%E5%88%B6

@rojalator

1599 in Scotland was only 9 months long (25th March to 31st December) and New Year was changed to 1st January. England remained on the old system meaning 1st January 1600 in Scotland was 1st January 1599 in England. What larks!
