@dovakeen118
Created February 28, 2023 14:35

Big O Notation

A Discussion of Algorithms


Our Journey So Far

  • We've been learning a lot of practical / procedural knowledge
  • We've learned there is more than one way to solve a software problem
  • We've seen a sampling of technical interview questions

A Little Theory Goes a Long Way

  • Employers still want a firm basis of theoretical understanding
  • One area of focus: algorithmic complexity

Wait, What's an Algorithm?

  • A fancy word for a step-by-step way to solve a problem

E.g., write a method that determines whether an item appears in an array:

const exists = (array, target) => {
  for (const element of array) {
    if (element === target) {
      return true
    }
  }
  return false
}

We could also use .find()
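A minimal sketch of that alternative (the helper names here are just for illustration), using the built-in .find() and .includes() array methods:

// .find() returns the matching element, or undefined if nothing matches
const existsWithFind = (array, target) =>
  array.find((element) => element === target) !== undefined

// .includes() answers the yes/no question directly
const existsWithIncludes = (array, target) => array.includes(target)

existsWithFind([1, 4, 6, 7], 6)     // true
existsWithIncludes([1, 4, 6, 7], 2) // false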


Algorithmic Complexity

  • We've coached you that sloppy code is better than no code
  • As our apps and problems become more complex with larger volumes of data, how will our apps perform?
  • How does the exists method perform for large data sets?
const exists = (array, target) => {
  for (const element of array) {
    if (element === target) {
      return true
    }
  }
  return false
}

Other Questions

Imagine if the array is over 10,000 elements long...

  • How many operations/loops if the item we're looking for is the first item in the list?
  • How many operations/loops if the item we're looking for is at the end of the list?
const exists = (array, target) => {
  for (const element of array) {
    if (element === target) {
      return true
    }
  }
  return false
}

Big O Explained

Big O is here to help us decipher how complex or expensive an algorithm is with massive data sets

  • Imagine a really large number of items in the array (approaching infinity); we call that number n
  • Every loop "around the race track" has a computational cost, and Big O describes how that cost grows with n
    • Introduced by Paul Bachmann, O stands for "Ordnung", or the order of approximation

The worst case performance of this algorithm is O(n)


Big O - worst case

The worst case performance of this algorithm is O(n)

Reasoning

Worst case scenario: if the item we're seeking is either not in the list or at the very end of the list, we have to loop through ALL n items in the array. The cost of that bad scenario grows in direct proportion to n.
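To make that concrete, here is a small sketch (not from the original slides; existsWithCount is just an illustrative name) that instruments the loop to count iterations:

const existsWithCount = (array, target) => {
  let operations = 0
  for (const element of array) {
    operations++
    if (element === target) {
      return { found: true, operations }
    }
  }
  return { found: false, operations }
}

const big = Array.from({ length: 10000 }, (_, i) => i)
existsWithCount(big, 0)     // best case:  { found: true,  operations: 1 }
existsWithCount(big, 9999)  // worst case: { found: true,  operations: 10000 }
existsWithCount(big, -1)    // not found:  { found: false, operations: 10000 }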


Why Should You Care?

  • Computer scientists (that's you) will interact with large volumes of data, so algorithmic performance and code at scale matter
  • Generally, you need to be able to gauge the complexity of the code you are working with and speak to how efficient or inefficient it is

Performance Profiles

(Image: Big O performance chart)


Constant Complexity O(1)

  • Accessing an item in an array
  • No matter how large the data set is, the time to access a single item is constant.
const list = [1, 4, 6, 7]
list[1] // constant-time access, no matter how long the list is

1 item takes 1 second, 10 items take 1 second, 100 items take 1 second.
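Another common O(1) pattern is a keyed lookup; a quick sketch, assuming the data is already stored in an object or a Map:

const prices = { apple: 1.25, banana: 0.5 }
prices["banana"] // 0.5 (same cost whether the object holds 2 keys or 2 million)

const priceMap = new Map([["apple", 1.25], ["banana", 0.5]])
priceMap.get("apple") // 1.25 (constant time on average)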


Logarithmic Complexity - O(log(n))

  • Good performance at scale
  • Usually requires a pre-sorted list
  • As data volume grows, efficiencies are gained

(Image: logarithmic complexity)

1 item takes 1 second, 10 items take 2 seconds, 100 items take 3 seconds.


The Classic Logarithmic Example - Binary Search

const binarySearch = (n, array) => {
  // array must already be sorted
  let i = 0
  let j = array.length - 1

  while (i <= j) {
    const middle = Math.floor((i + j) / 2)

    if (array[middle] === n) {
      return true
    } else if (array[middle] < n) {
      i = middle + 1
    } else {
      j = middle - 1
    }
  }

  return false
}
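A quick usage check, assuming the input array is already sorted (binary search only works on sorted data):

const sorted = [1, 3, 5, 8, 13, 21, 34]
binarySearch(13, sorted) // true
binarySearch(4, sorted)  // false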

Binary Search - Animated

(Image: binary search animation)


Linear Complexity - O(n)

(Image: linear complexity)

Simple iteration of array with n elements

1 item takes 1 second, 10 items take 10 seconds, 100 items take 100 seconds.


Linear Complexity example

What happens if for some reason (who knows why) we have to iterate over the array twice?

const exists = (array, target) => {
  let check1 = false
  let check2 = false

  for (const element of array) {
    if (element === target) {
      check1 = true
    }
  }

  for (const element of array) {
    if (element === target) {
      check2 = true
    }
  }

  return check1 && check2
}

Linear Complexity

Complexity: O(2n), which simplifies to O(n) because Big O drops constant factors

But this isn't a meaningful enough difference. We might want to update our code to speed it up somehow, but the leap isn't big enough to matter much.

However, if the thing we are measuring is page re-renders, then this actually might matter a lot

Point: Big O is more concerned with dramatic changes in complexity. However, if each process is already costly, we'll have lower tolerance for more complexity.
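A small arithmetic sketch of why the constant factor gets dropped: doubling the input doubles the work whether we loop once or twice, and that growth rate is what Big O measures.

// Work done by a one-pass loop vs. a two-pass loop, as a function of n
const singlePass = (n) => n
const doublePass = (n) => 2 * n

doublePass(10000) / doublePass(5000) // 2: doubling n doubles the work, same as singlePass
// Contrast with quadratic growth, where doubling n quadruples the work:
const quadratic = (n) => n * n
quadratic(10000) / quadratic(5000)   // 4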


Quadratic Complexity - O(n^2)

  • Generally encountered when dealing with nested loops, or lookups while looping.

(Image: quadratic complexity)

1 item takes 1 second, 10 items take 100 seconds, 100 items take 10,000 seconds.


Quadratic Complexity Example

const allCombinations = (theList) => {
  let results = []
  theList.forEach((outerItem) => {
    theList.forEach((innerItem) => {
      results.push([outerItem, innerItem])
    })
  })

  return results
}
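A quick check of the quadratic growth (3 items produce 9 pairs, 10 items produce 100):

allCombinations(["a", "b", "c"]).length                 // 9
allCombinations([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]).length // 100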

Poor Performers - O(2^n), O(n!)

  • Each new addition of data dramatically impacts performance
  • These algorithms do not scale!
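A rough sketch (not from the original slides; the operations helper is purely illustrative) of how quickly those operation counts blow up:

const factorial = (n) => (n <= 1 ? 1 : n * factorial(n - 1))

const operations = (n) => ({
  exponential: 2 ** n,    // O(2^n) growth
  factorial: factorial(n) // O(n!) growth
})

operations(10) // { exponential: 1024, factorial: 3628800 }
operations(13) // { exponential: 8192, factorial: 6227020800 }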

The Classic O(n!) Example - The Traveling Salesman

(Image: traveling salesman)


Example

Performance benchmarking in ms, n = # of houses

bestTravelingSalesmanRoute(1) // 0.02 ms
bestTravelingSalesmanRoute(2) // 0.04 ms
bestTravelingSalesmanRoute(10) // 42.08 ms
bestTravelingSalesmanRoute(12) // 5231.54 ms => 5 seconds
bestTravelingSalesmanRoute(13) // 69565.01 ms => 70 SECONDS!
bestTravelingSalesmanRoute(14) // SMOKE & FLAMES!!

Big Takeaways

  • Big O represents worst-case performance of an algorithm.
  • Linear and logarithmic complexities are usually in acceptable bounds.
  • Algorithms with O(2^n) or O(n!) complexity do not scale.

Ways to remember different complexities

  • Direct access? O(1)
  • Iterating over a list once? O(n)
  • Does it repeatedly bisect the data, discarding half of it each step? O(log n)
  • Nested iteration? O(n^2)
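Each of those patterns in miniature (a sketch; the values are just for illustration):

const array = [2, 4, 8, 16, 32]

// Direct access? O(1)
const first = array[0]

// Iterating over a list once? O(n)
let sum = 0
for (const element of array) sum += element

// Bisecting a sorted list? O(log n)
let low = 0
let high = array.length - 1
while (low <= high) {
  const middle = Math.floor((low + high) / 2)
  if (array[middle] === 8) break
  else if (array[middle] < 8) low = middle + 1
  else high = middle - 1
}

// Nested iteration? O(n^2)
const pairs = []
for (const a of array) {
  for (const b of array) pairs.push([a, b])
}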

For More Detail
