@fj
Created December 15, 2012 20:46

Retrieval of an arbitrary item from a linked list is O(n). Arbitrary means any of the elements are eligible for selection, without loss of generality.

However, retrieval of a specific item from a linked list has nothing to do with how many items are in the list. It is therefore O(1). Retrieving the 50th item from a linked list takes the same amount of time, no matter whether the list has 50 elements, 50M elements, or 50^50! elements.

@fj
Author

fj commented Dec 16, 2012

Things that are O(1) take the same amount of time regardless of the input. Critically, you must allow the input to vary to make an O(1) claim.

I do allow the input to vary. Give me any linked list you like, and I can write you an O(1)-time method that always takes the same amount of time to retrieve the 50th item -- a specific index.
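Something like this sketch (the Node class and build helper are just illustrative scaffolding, not anything from the gist) -- the method hard-codes index 50, so the number of pointer hops never depends on how long the list is:

```python
class Node:
    # A plain singly linked node -- hypothetical, just to make the example concrete.
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def build(n):
    # Build a list holding 0, 1, ..., n-1 (also hypothetical scaffolding).
    head = None
    for v in reversed(range(n)):
        head = Node(v, head)
    return head

def get_50th(head):
    # Hop exactly 49 links from the head. The work is fixed at 49 hops,
    # no matter how many more nodes come after the 50th.
    node = head
    for _ in range(49):
        node = node.next
    return node.value

print(get_50th(build(50)))       # 49
print(get_50th(build(500_000)))  # 49, after the same 49 hops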

If you say "well, you have to let me pick any item I want, not just the thing at index 50", then you are picking an arbitrary item, not a specific item. I do agree that retrieving an arbitrary item from a linked list is O(n) in time with respect to the number of items in the list.

I think you might be conflating the definitions of "arbitrary" and "specific", and that is why you don't agree with me:

arbitrary (adjective):

  • of unspecified value

specific (adjective):

  • clearly defined or identified

But those are very different things. That's why the O(1) result is a little surprising, and it's why I think I'm correct in stating that, for a linked list, retrieving arbitrary items is O(n) and retrieving the item at a specific index is O(1).

Before you say "well that's trivial and degenerate", not every data structure has this "O(1) for a specific index" property. Extracting the third rope from a knot takes more time when the knot has ten ropes than when it has three.

@btaitelb

We can prove this claim by proving two lemmas:

  1. finding the kth element in a linked list is O(k) for any constant k
  2. O(k) = O(1) for any constant k

The first claim is straightforward, and we can use our intuitive understanding of O-notation. It takes k steps to jump through k elements in a linked list, so absent any more information (like how the linked list is stored or whether elements are cached), we can conclude that the worst-case runtime is O(k). You can also verify this with the formal definition (below), but that's left as an exercise for the reader ;)
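A quick sketch to make the hop-counting concrete (the Node class and build helper are just illustrative scaffolding):

```python
class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

def build(n):
    # A list holding 0, 1, ..., n-1.
    head = None
    for v in reversed(range(n)):
        head = Node(v, head)
    return head

def kth_with_hop_count(head, k):
    # Walk to the kth node (1-based) and count the next-pointer dereferences.
    node, hops = head, 0
    for _ in range(k - 1):
        node = node.next
        hops += 1
    return node.value, hops

print(kth_with_hop_count(build(10), 5))       # (4, 4)
print(kth_with_hop_count(build(100_000), 5))  # (4, 4) -- same k, same number of hops
```

For any fixed k the hop count is k - 1 regardless of the list's length, which is exactly the behaviour lemma 2 then folds into O(1).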

Now, for the second claim, let's go to the definition. The wikipedia page ( http://en.wikipedia.org/wiki/Big_O_notation ) is fairly dense, so I much prefer the definition in "Introduction to Algorithms" by Cormen, Leiserson, and Rivest (my version is from 1998, so no Stein):

For a given function g(n), we denote by O(g(n)) the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }

In our problem, g(n) = 1 and f(n) = k

So to show that f(n) ∈ O(g(n)) -- that is, that k ∈ O(1) -- we need only find positive constants c and n₀ that satisfy this formula. Well, we can use whatever we want for n₀, since n doesn't appear in either of our functions. Let's use 1 to simplify things. Now for choosing c, we can use any number greater than or equal to k, so how about k? Now we have:

∀n ≥ 1, 0 ≤ k ≤ k · 1
f(n) ∈ O(g(n))
O(k) = O(1)
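To be thorough: the inequality above shows k ∈ O(1), which gives O(k) ⊆ O(1). The reverse inclusion also holds for a fixed positive constant k; spelling out both directions:

```latex
\begin{align*}
k \in O(1) &: \text{take } c = k,\ n_0 = 1,\ \text{so } 0 \le k \le k \cdot 1 \text{ for all } n \ge n_0,\\
1 \in O(k) &: \text{take } c = \tfrac{1}{k},\ n_0 = 1,\ \text{so } 0 \le 1 \le \tfrac{1}{k} \cdot k \text{ for all } n \ge n_0.
\end{align*}
```

The first line gives O(k) ⊆ O(1) and the second gives O(1) ⊆ O(k), so the two classes really are equal.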

@deathbob

Stunned you won't give up on this.

You guys are so wrong I'm seriously shaking my head over here. This is the last I'm going to say on the subject, and if you have anything more to say other than "I owe you a beer" then it had better be "I'm very sorry, you were right."

finding the kth element in a linked list is O(k) for any constant k

No. Just no. Finding the kth element in a linked list is O(n). Always. Specific or arbitrary. It so happens that the number of operations needed to reach a particular index is approximately equal to that index, but that doesn't mean that it runs in constant time. In fact that's exactly why it runs in O(n), or linear time. The runtime scales linearly with the index selected.

The fact that the runtime is identical on lists of different lengths is irrelevant and a red herring. What is relevant is that to reach the nth item you have to traverse the n - 1 elements before it.

It also happens that the number of operations to find the nth element is not affected by the length of the list, but again that doesn't mean it runs in constant time. The critical issue is that the runtime is affected by the selection of n.
Even if you don't vary n, the runtime is still affected by, and linearly proportional to, n. That's what makes it an O(n) algo.

Put another way, just because the runtime is the same on different linked lists doesn't mean that the algo runs in constant time. It's still running in linear time; you're just keeping the index you're looking up constant. A subtle, but important, distinction.

If finding the kth element always took m operations, regardless of the value of k, then finding the kth element in a linked list would be O(m) which would then reduce to O(1). Unfortunately, because the index and cost of retrieval increase together in lock-step, no such claim can be made.


Big O is for talking about comparative or relative complexity; it is not a method for obtaining a specific count of steps. When you bring Big O into it you are making a generalization, and the only generalization that fits is that the runtime is linearly proportional to the index in question.

Specifically, if you're making a claim about the Big O complexity of finding the nth item in a list, even if you don't intend to vary n (like by always choosing 50), any claim you make has to generalize to other values of n. Otherwise you're not doing Big O. I don't know what you are doing, but it isn't Big O. And you sure as hell can't start saying O(this) or O(that).

BTW, saying "finding the kth element takes k operations" is exactly the same as saying "it runs in linear time", which is exactly the definition of an O(n) algo.


Give me any linked list you like, and I can write you an O(1) time method that always takes the same asymptotic time to retrieve the 50th item -- a specific index.

No, barring some weird definition of "linked list", the best you can do is write me an O(n) algo that always returns in the same amount of time because you're not varying n. If you did vary n, the runtime would change. In a true O(1) algo, such as indexing into an array, the runtime is the same no matter what index is selected. In your examples the runtime is dependent on the index selected. That's a huge red flag that what we're dealing with is not a constant-time algo.
Refusing to vary n is not sufficient to turn an O(n) algo into an O(1) algo.

If the index did vary you'd quickly appreciate the difference between a linear and a constant time algo.
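A rough sketch of what varying the index looks like in practice (nothing here is verbatim from anyone's code, just an illustration):

```python
import time

class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

def nth_from_list(head, n):
    # Reaching index n in a linked list means following n links: the work grows with n.
    node = head
    for _ in range(n):
        node = node.next
    return node.value

# Build a linked list and an array-backed list with the same contents.
N = 300_000
head = None
for v in reversed(range(N)):
    head = Node(v, head)
arr = list(range(N))

for n in (10, 10_000, 250_000):
    t0 = time.perf_counter(); nth_from_list(head, n); list_time = time.perf_counter() - t0
    t0 = time.perf_counter(); _ = arr[n];             array_time = time.perf_counter() - t0
    # The linked-list time climbs with n; the array lookup stays roughly flat.
    print(f"n={n}: linked list {list_time:.2e}s, array {array_time:.2e}s")
```

The linked-list times climb with n while the array lookups don't budge, which is exactly the difference between O(n) and O(1) access.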


Again, the original claim was

Retrieving the 10^(10!)th item in a linked list is "O(1)"

In a final desperate plea, I'm going to appeal to authority and link to the wikipedia page on this, which for the record I've only consulted for the first time just now. Note the use of the word "specific", @fj, you smart-ass.
http://en.wikipedia.org/wiki/Linked_list#Speeding_up_search

I expect beers from both of you next we meet. :D

@fj
Author

fj commented Dec 17, 2012

@deathbob
