Big O notation describes how a given block of code scales as it is given more data to process. In more concrete terms, it describes how many times a piece of code might loop, expressed in terms of n, the number of items the code has to process. Big O notation matters to everyone who programs a computer in any capacity: it informs the feasibility and speed of various operations and helps programmers decide whether a given strategy is likely to be effective for their data set. Big O notation transcends languages, environments, and toolsets; it is a product of the definition of classical computing itself.
Code that does not loop is said, in Big O notation, to be O(1): it has a constant execution time that does not increase as the dataset gets larger. A simple loop is O(n): as n increases, execution time grows in a linear and predictable manner, so 5 items take 5 times as long to process as 1 item. If an algorithm required comparing every item to every other item, as a loop nested inside another loop does, it would be O(n²).
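The three growth rates above can be sketched with small, self-contained functions. This is an illustrative sketch, not taken from the text; the function names are hypothetical, and the comments note how the number of loop iterations relates to n.

```python
def first_item(items):
    """O(1): a single operation, regardless of how long the list is."""
    return items[0]

def total(items):
    """O(n): one pass over the data, so work grows linearly with n."""
    s = 0
    for x in items:  # the loop body runs len(items) times
        s += x
    return s

def has_duplicate(items):
    """O(n^2): compares every item to every other item via nested loops."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):  # roughly n*n/2 comparisons
            if items[i] == items[j]:
                return True
    return False
```

Doubling the input roughly doubles the work in `total`, but roughly quadruples the number of comparisons in `has_duplicate`, which is why the distinction matters as data sets grow.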