which I understand fine. However, for something like 20 years in my world, "octal" has meant an 8-bit value represented in a base-8 numbering system, while "hex" is short for "hexadecimal", a 16-bit value represented in a base-16 numbering system. I'm simply saying that, in my (software) environment, I'm used to "octal" being smaller than "hex". Thinking of it otherwise does not come naturally.