In programming we constantly apply different layers of abstraction. At the lowest level, data is just 1s and 0s moving between memory cells and registers. Most people never think about that when they program: they use an object-oriented or strongly typed language that adds a layer of abstraction over the machine code, instructions, and bytes underneath.
Worrying about the endianness of integers is, in almost all cases, a matter of looking at the problem from the wrong level of abstraction. It is very rare that a programmer ever has to care about how the hardware stores a number in memory -- unless you are writing in machine language or doing hardware-level data transfer, it won't matter.
The compiler (yes, even for low-level languages like C) takes care of it for you. As long as you work through the compiler's abstract types and operators, it just won't matter.
Let's say you need to implement a transport protocol -- here it may or may not matter, depending on the protocol. If the protocol is a text format such as JSON, endianness is irrelevant, because numbers are serialized as characters rather than raw bytes. If the protocol defines a binary format, it will specify a byte order (commonly network byte order, i.e. big-endian), and you convert at the boundary rather than caring what your host does internally.