Invisible schrieb:
> Think about it. If you store the textural representation of a number in
> ASCII, the most significant digit comes first. But if Intel has their
> way, the most significant byte comes... last? And if you have to split a
> large number into several machine words, unless you number those words
> backwards, you get monstrosities such as
>
> 04 03 02 01 08 07 06 05 0C 0B 0A 09
That only happens if, for some obscure reason, you try to interpret a
/character sequence/ as an integer value. Which it isn't.
Directly interpreting ASCII representations of numbers is problematic
in several ways anyway:
- The ASCII representation has no fixed width; the sequence "42" may
represent the value 4*10+2, but it may just as well be part of a much
bigger value, such as 4*1e55 + 2*1e54 + something.
- You may encounter digit grouping characters such as "," (or, depending
on language, even "." or what-have-you).
- The numerical base (ten) does not match anything remotely suitable
for binary digital representation.
- Half of each byte (actually even a few fractions of a bit more: a
decimal digit carries only about 3.32 bits of information in 8 bits of
storage) is just redundant overhead.
> instead of
>
> 01 02 03 04 05 06 07 08 09 0A 0B 0C
>
> as it should be.
>
> There's no two ways about it. Little endian = backwards.
No - you've got the whole thing wrong. Technically, little- and
big-endian are "up-" and "downwards": the physical memory address
lines and data lines are orthogonal to one another, so memory is
really a column of bytes:
01
02
03
04
05
06
07
08
09
...
If you need to group these, you'll get:
01020304
05060708
090A0B0C
...
or
04030201
08070605
0C0B0A09
...
neither of which has any inherent inconsistency.