Typically, as English-speaking humans in the modern era, we use a number system called "decimal". In decimal, we have ten different digits: 1, 2, 3, 4, 5, 6, 7, 8, 9, and 0. When we want to express a large number, we write a sequence of these digits, where each position represents how many ones, tens, hundreds, thousands, and so on the number contains. For example, if we break down 25, it's 2 tens plus 5 ones; 125 is 1 hundred, 2 tens, and 5 ones. We can keep adding digits to the sequence, and while it becomes harder to name the position each digit occupies, we can still understand the number the sequence of digits represents.
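To make the positional breakdown concrete, here's a minimal Python sketch (the function name `decompose` is just an illustrative choice, not anything standard) that splits a number into its digits and the place value each digit occupies:

```python
def decompose(n):
    """Break a non-negative integer into (digit, place value) pairs,
    most significant digit first."""
    digits = [int(d) for d in str(n)]
    return [(d, 10 ** (len(digits) - 1 - i)) for i, d in enumerate(digits)]

print(decompose(125))
# [(1, 100), (2, 10), (5, 1)] -- 1 hundred, 2 tens, 5 ones

# Multiplying each digit by its place value and summing
# recovers the original number:
print(sum(d * place for d, place in decompose(125)))  # 125
```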
Modern computers use a different system, binary, which has benefits that reduce complexity and improve efficiency [🐇].
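As a quick preview, Python can show the binary form of a familiar decimal number; the same positional idea applies, just with powers of two instead of powers of ten:

```python
# 125 in binary: each 1 marks a power of two that is present.
print(bin(125))  # 0b1111101, i.e. 64 + 32 + 16 + 8 + 4 + 1 = 125
```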