@anthonymonori
Created October 4, 2015 10:34
Big O notation and everything you need to know about it - perfect for tech interview preparation!
Common Running Times
These are the running times most commonly encountered when analyzing an algorithm:
O(1) – Constant Time: the running time is constant; it does not depend on the input size.
O(n) – Linear Time: for an input of size n, the algorithm performs on the order of n operations.
O(log n) – Logarithmic Time: grows far more slowly than O(n). Typically the algorithm repeatedly divides the problem into subproblems of half the size. Examples: binary search, binary conversion.
O(n log n) – Linearithmic Time: often found in divide-and-conquer algorithms, which split the problem into subproblems recursively and then merge the results in linear time. Example: Merge Sort.
O(n^2) – Quadratic Time: Example: Bubble Sort.
O(n^3) – Cubic Time: same principle as O(n^2), e.g. three nested loops instead of two.
O(2^n) – Exponential Time: becomes infeasible very quickly as the input grows; for n = 1,000,000, T(n) would be on the order of 2^1,000,000. Naive brute-force algorithms often have this running time.
O(n!) – Factorial Time: the slowest of these. Example: brute-force solutions to the Traveling Salesman Problem (TSP).
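As a concrete illustration of logarithmic time mentioned above, here is a minimal binary search sketch in Python (the function name and test data are my own, not from the original):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent.

    Each iteration halves the search range, so the loop runs
    at most O(log n) times.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Doubling the input size adds only one extra iteration, which is exactly why O(log n) scales so well.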
___________________________________________
1 - Basic operations (arithmetic, comparisons, accessing array elements, assignment): the running time is always constant, O(1).
Example:
read(x) // O(1)
a = 10; // O(1)
a = 1,000,000,000,000,000,000 // O(1)
2 - If-then-else statement: take only the maximum running time among the two or more possible branches.
Example:
age = read(x) // (1+1) = 2
if age < 17 then begin // 1
status = "Not allowed!"; // 1
end else begin
status = "Welcome! Please come in"; // 1
visitors = visitors + 1; // 1+1 = 2
end;
So, the complexity of the above pseudocode is T(n) = 2 + 1 + max(1, 1 + 2) = 6. Since this is a constant, T(n) = O(1).
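The pseudocode above can be sketched in Python (a hedged translation; the function name and return shape are my own). Whichever branch executes, the cost is bounded by the more expensive branch, so the whole statement is O(1):

```python
def admit(age, visitors):
    """Mirror the if/else pseudocode: the comparison costs 1 op,
    then at most the costlier branch runs (1 + 2 = 3 ops)."""
    if age < 17:
        status = "Not allowed!"               # 1 op
    else:
        status = "Welcome! Please come in"    # 1 op
        visitors = visitors + 1               # 1 + 1 = 2 ops
    return status, visitors
```

Note the analysis always charges the worst-case branch, even when the cheaper one runs.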
3 - Looping (for, while, repeat): the running time is the number of iterations multiplied by the number of operations inside the loop.
Example:
total = 0; // 1
for i = 1 to n do begin // (1+1)*n = 2n
total = total + i; // (1+1)*n = 2n
end;
writeln(total); // 1
So, its complexity is T(n) = 1 + 2n + 2n + 1 = 4n + 2. Thus, T(n) = O(n).
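The operation count above can be checked empirically with a Python sketch (the function name and per-line charges are my own, following the cost comments in the pseudocode):

```python
def count_linear_ops(n):
    """Count basic operations in the loop above.

    The loop control and the loop body each contribute a constant
    number of operations per iteration, so the total is linear in n.
    """
    ops = 1            # total = 0
    total = 0
    for i in range(1, n + 1):
        ops += 2       # loop test + increment of i
        total = total + i
        ops += 2       # addition + assignment
    ops += 1           # writeln(total)
    return ops         # = 4n + 2
```

For n = 10 this returns 42, matching T(n) = 4n + 2.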
4 - Nested loops (a loop inside a loop): since at least one loop runs inside the main loop, the running time is typically O(n^2) or O(n^3).
Example:
for i = 1 to n do begin // (1+1)*n = 2n
for j = 1 to n do begin // (1+1)n*n = 2n^2
x = x + 1; // (1+1)n*n = 2n^2
print(x); // (n*n) = n^2
end;
end;
So, its complexity is T(n) = 2n + 2n^2 + 2n^2 + n^2 = 5n^2 + 2n. Thus, T(n) = O(n^2).
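The quadratic growth is easy to confirm by counting how many times the inner body runs; a minimal Python sketch (function name is mine):

```python
def count_nested_ops(n):
    """Count executions of the innermost body of two nested loops
    over a range of size n. The inner body runs n * n times,
    which is why nested loops give O(n^2)."""
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1  # one execution of the inner body
    return count        # = n^2
```

Doubling n quadruples the count, the signature behavior of quadratic time.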
References:
http://stackoverflow.com/a/23168379/2759296
https://en.wikipedia.org/wiki/Time_complexity#Table_of_common_time_complexities
http://bigocheatsheet.com