

pkulev/o-notation.md

Last active Apr 2, 2018
Efficiency and Orders of Growth


Measuring complexity:

• time
• memory

Measuring basic steps

• use a random access machine as model of computation
  • steps are executed sequentially
  • a step is an operation that takes constant time:
    • assignment
    • comparison
    • arithmetic operation
    • accessing an object in memory
• for a given value of the input, measure time in terms of the size of the input

linear search:

• best case: constant (element found at the first position)
• worst case: linear, depends on the size of the list
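A minimal sketch of linear search over a Python list (the helper name `linear_search` is my own):

```python
def linear_search(lst, target):
    """Return True if target is in lst, scanning left to right."""
    for item in lst:        # worst case: n iterations
        if item == target:
            return True     # best case: found at the first position
    return False
```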
```python
def fact(n):          # number of steps:
    answer = 1        # 1 (assignment)
    while n > 1:      # 1 (comparison) <---------------+
        answer *= n   # 2 (assignment, * operation)    |  5 * n
        n -= 1        # 2 (assignment, - operation) <--+
    return answer     # 1 (return)
```

Total: `2 + 5 * n` steps, i.e. linear in n.

• As the input grows, the number of iterations through the loop dominates, so the multiplicative factor (the number of steps within the loop) becomes essentially irrelevant.
• Thus, we will measure complexity as a function of input size:
  • focus on the largest term in the expression
  • be mostly concerned with the worst-case scenario

Asymptotic notation

• Need a formal way to talk about relationship between running time and size of inputs.
• Mostly interested in what happens as size of inputs gets very large, i.e. approaches infinity.
```python
def f(x):
    for i in range(1000):
        ans = i

    for i in range(x):
        ans += 1

    for i in range(x):
        for j in range(x):
            ans += 1
```

Overall step count is `1000 + 2x + 2x^2`: the quadratic term dominates for large x.

Rules of thumb for complexity

• Asymptotic complexity:

• Describe running time in terms of the number of basic steps.
• If the running time is a sum of multiple terms, keep the one with the largest growth rate.
• If the remaining term is a product, drop any multiplicative constants.
• Use "Big O" notation (aka Omicron):

• Gives an upper bound on asymptotic growth of a function.

Complexity classes

• O(1) denotes constant running time
• O(log n) denotes logarithmic running time
• O(n) denotes linear running time
• O(n log n) denotes log-linear running time
• O(n^c) denotes polynomial running time (c is a constant)
• O(c^n) denotes exponential running time

(listed in order of increasing growth)

Constant complexity

• independent of inputs
• very few interesting algorithms in this class, but can often have pieces that fit this class
• can have loops or recursive calls, but number of iterations independent of size of input
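An illustrative sketch (helper names are my own) of pieces that fall in this class:

```python
def first_element(lst):
    # O(1): indexing a Python list takes constant time,
    # regardless of len(lst)
    return lst[0]

def loop_constant():
    # a loop can still be O(1) if the iteration count
    # does not depend on the input size
    total = 0
    for i in range(100):  # always exactly 100 iterations
        total += i
    return total
```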

Logarithmic complexity

• complexity grows as the log of the size of one of its inputs
• example:
• bisection search
• binary search of a list
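A sketch of binary search over a sorted list (the helper name `binary_search` is my own): the search interval is halved on every iteration, giving O(log n) steps.

```python
def binary_search(lst, target):
    """Search a sorted list; halves the interval each step -> O(log n)."""
    low, high = 0, len(lst) - 1
    while low <= high:
        mid = (low + high) // 2
        if lst[mid] == target:
            return True
        elif lst[mid] < target:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return False
```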

Linear complexity

• searching a list in order to see if an element is present
• example:
• add characters of a string assumed to be composed of decimal digits
```python
def add_digits(s):  # complexity = O(len(s))
    val = 0
    for c in s:
        val += int(c)
    return val
```
• complexity can depend on number of recursive calls:
```python
def fact(n):  # complexity = O(n)
    if n == 1:
        return 1
    else:
        return n * fact(n - 1)
```

Log-linear complexity

• many practical algorithms are log-linear
• very commonly used log-linear algorithm is merge sort
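An illustrative merge sort sketch (not from the original notes): there are O(log n) levels of splitting, and merging does O(n) work per level, hence O(n log n) overall.

```python
def merge_sort(lst):
    """O(n log n): log n levels of splitting, O(n) merge work per level."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # merge the two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```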

Polynomial complexity

• most common polynomial algorithms are quadratic, i.e. complexity grows with the square of the size of the input
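For instance, a quadratic duplicate check (an illustrative sketch, the name `has_duplicates` is my own): it compares every pair of elements, so the step count grows with n^2.

```python
def has_duplicates(lst):
    """O(n^2): compares every pair of elements in the list."""
    for i in range(len(lst)):
        for j in range(i + 1, len(lst)):  # each pair checked once
            if lst[i] == lst[j]:
                return True
    return False
```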