>> It just seems way more natural if you are writing any sort of processing
>> with bits/bytes etc that array index zero should be the LSB.
>
> ...and this *is* a good argument??
Yes, because people write algorithms that deal with it far more often than they
write out byte streams by hand.
> Think about it. If you store the textural representation of a number in
> ASCII,
Which nobody does, and as pointed out, when people do write out numbers they
are *right-justified* exactly so that all the LSBs line up and come *first* when
you run some algorithm on them (e.g. adding).
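For instance (a rough sketch, the AddLittleEndian name and the made-up helper are
mine, nothing standard), adding two numbers stored LSB-first works just like adding
right-justified digits on paper: start at index 0 and carry upward.
byte[] AddLittleEndian(byte[] a, byte[] b)
{
    int len = (a.Length > b.Length) ? a.Length : b.Length;
    byte[] sum = new byte[len + 1];          // one extra byte for a final carry
    int carry = 0;
    for (int i = 0; i < len; i++)
    {
        int s = carry
              + (i < a.Length ? a[i] : 0)
              + (i < b.Length ? b[i] : 0);
        sum[i] = (byte)(s & 0xFF);           // low 8 bits stay at this position
        carry = s >> 8;                      // the rest moves up, like a carried digit
    }
    sum[len] = (byte)carry;
    return sum;
}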
> the most significant digit comes first. But if Intel has their way, the
> most significant byte comes... last? And if you have to split a large
> number into several machine words,
I don't know about you, but if I want to assemble a huge (>8-bit) number from the
bytes, I would do this:
HugeInt myHugeInteger = 0;
for (int b = 0; b < ArrayOfBytes.Length; b++)
    myHugeInteger += (HugeInt)ArrayOfBytes[b] << (8 * b);   // shift by 8*b bits, i.e. multiply by 256^b
How is that "backwards"?
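For what it's worth, feed that loop the (made-up) bytes below and the value comes
straight back out, with no index juggling at all:
byte[] ArrayOfBytes = { 0x78, 0x56, 0x34, 0x12 };   // LSB first
// The loop above produces 0x12345678: byte 0 is the least significant,
// so the same code works no matter how many bytes the number has.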