Big O notation is a way to describe how the time (or space, which we will cover later) a program needs grows as the amount of input data increases. Rather than focusing on exact timing (like "this algorithm takes 0.002 seconds"), Big O notation helps us talk about how quickly that time grows when we feed the algorithm larger and larger inputs.
If your program handles small tasks, you might not notice how "fast" or "slow" it runs. But when the input becomes very large, like thousands or millions of records, inefficient algorithms can become painfully slow or consume a lot of memory. Big O is a tool that helps developers predict which algorithms will handle large inputs most efficiently, as the sketch below illustrates.
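As a rough illustration (the function names below are just examples for this sketch, not part of any library), here are two ways to check a list for duplicates in Python: one whose work grows quadratically with the input size and one whose work grows linearly.

```python
def contains_duplicate_quadratic(items):
    """Compares every pair of items: the work grows quadratically with len(items)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def contains_duplicate_linear(items):
    """Single pass using a set of items seen so far: the work grows linearly with len(items)."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False


# On a list of 10 items, both versions feel instant.
# On a list of 1,000,000 items, the quadratic version performs on the order of
# 10^12 comparisons, while the linear version performs about 10^6 steps.
print(contains_duplicate_quadratic([1, 2, 3, 2]))  # True
print(contains_duplicate_linear([1, 2, 3, 4]))     # False
```

The point is not the exact running time of either function, but how quickly the amount of work grows as the list gets longer; that growth rate is what Big O notation captures.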