But let's not forget that we're talking about *computers* here, not human beings. Calling a typical embedded CPU an "advanced thinker" would be quite a stretch of the imagination, wouldn't you agree?
One thing that speaks against big-endian in written numbers, even for human beings, is that you can't really read them out in the order they're written anyway. You first have to count the digits before you can start, which means you have to scan back and forth at least once. If numbers were written little-endian, you could start reading as the digits come by, and just keep using larger multipliers until you reach the end of the string of digits.
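To illustrate the point, here's a quick sketch in C (the function name is made up, and it assumes decimal ASCII digits arriving least-significant-first): a single forward pass suffices, no digit counting needed.

    #include <stdio.h>

    /* Parse a decimal number written little-endian (least significant
     * digit first), accumulating in one forward pass -- no need to
     * count the digits before starting. Hypothetical sketch, not a
     * standard function. */
    unsigned long parse_le_decimal(const char *s)
    {
        unsigned long value = 0;
        unsigned long multiplier = 1;

        for (; *s >= '0' && *s <= '9'; s++) {
            value += (unsigned long)(*s - '0') * multiplier;
            multiplier *= 10;   /* next digit weighs ten times more */
        }
        return value;
    }

    int main(void)
    {
        /* "521" read little-endian is 125 */
        printf("%lu\n", parse_le_decimal("521"));
        return 0;
    }

A big-endian parser, by contrast, either needs the digit count up front or has to multiply its running total by ten at every step and hope it doesn't overflow before the end.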
Actually, when you really *process* the digits of a multi-digit number, e.g. when you add or multiply long numbers on paper, you'll find yourself working from the LSD to the MSD, not the other way round, because, just like for the early CPUs, it's easier to work in the same direction the carry is handed over.
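Here's what that looks like as code, a sketch of schoolbook addition with the digits stored little-endian (digit 0 is the LSD): the carry travels in the same direction as the loop, so one forward pass does the whole job.

    #include <stdio.h>

    /* Add two n-digit decimal numbers stored little-endian
     * (a[0] and b[0] are the least significant digits).
     * sum must have room for n+1 digits. */
    void add_le(const unsigned char *a, const unsigned char *b,
                unsigned char *sum, int n)
    {
        unsigned carry = 0;
        for (int i = 0; i < n; i++) {
            unsigned d = a[i] + b[i] + carry;
            sum[i] = d % 10;     /* digit stays in place */
            carry  = d / 10;     /* carry moves toward the MSD */
        }
        sum[n] = (unsigned char)carry;  /* possible extra top digit */
    }

    int main(void)
    {
        /* 478 + 964 = 1442; digits stored LSD first */
        unsigned char a[3] = { 8, 7, 4 };
        unsigned char b[3] = { 4, 6, 9 };
        unsigned char s[4];

        add_le(a, b, s, 3);
        for (int i = 3; i >= 0; i--)    /* print MSD first for humans */
            printf("%u", s[i]);
        printf("\n");
        return 0;
    }

Try writing the same loop MSD-first and you'll have to go back and patch up earlier digits whenever a carry ripples upward, which is exactly why the paper-and-pencil method, and the early little-endian CPUs, both work the other way.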