Most programming languages have basic types such as booleans, bytes, shorts, integers, floating point numbers, strings, lists, etc. If you squint your eyes a bit, you'll notice all of these are just a bunch of bytes with varying sizes and shapes.
Strings (arrays) are (possibly unbounded) sequences of bytes. Integer and floating point types are usually represented as a fixed number of bytes or bits. A boolean semantically means "one bit", but is often implemented as a single byte (for performance reasons).
Now, think about how many different values or states each of these can represent. Well, obviously the answer is 2**(number of bits in the type), duh!
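To make that concrete, here's a quick Python sketch. The bit widths below are common conventions, not guarantees made by any particular language:

```python
# States representable by a type = 2 ** (number of bits in the type).
# Assumed sizes are typical, not universal.
SIZES_IN_BITS = {
    "bool (as stored)": 8,   # semantically 1 bit, usually stored as a byte
    "byte": 8,
    "short": 16,
    "int32": 32,
    "int64": 64,
}

for name, bits in SIZES_IN_BITS.items():
    print(f"{name:>16}: 2**{bits} = {2**bits:,} states")
```

Running this prints, among others, `2**32 = 4,294,967,296` for a 32-bit integer, which is where the "about 4 billion" figure below comes from.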
- Strings (arrays) can encode billions, trillions, gazillions ... of different states.
- Integers can represent quite a lot, but considerably fewer (~4.3 billion for 32 bits)