Re: Programming languages
From: clipka
Date: 26 Oct 2009 11:59:14
Message: <4ae5c752$1@news.povray.org>
Invisible wrote:
>>> It still makes me sad that Intel chose to store bytes in the wrong 
>>> order all those years ago...
>>
>> Define "wrong" in this context...
> 
> When you read data from a file, you read the first byte first, and the 
> last byte last. Therefore, the first byte should be the MSB.

Seriously: Why? Just because we Westerners write numbers this way round?

Note that this digit ordering has its drawbacks. As a practical example, 
have a look at any data table: while the text columns are usually 
left-adjusted, the numbers are always right-adjusted, because otherwise 
they wouldn't line up.
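
To make that concrete, a minimal C sketch that dumps the in-memory byte 
order of a 32-bit value (the commented output assumes an x86-style 
little-endian host):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint32_t value = 0x0A0B0C0D;
    const unsigned char *bytes = (const unsigned char *)&value;

    /* On a little-endian CPU this prints "0d 0c 0b 0a" - the least
       significant byte sits at the lowest address. On a big-endian
       CPU it prints "0a 0b 0c 0d". */
    for (size_t i = 0; i < sizeof value; i++)
        printf("%02x ", bytes[i]);
    printf("\n");
    return 0;
}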


> But no, Intel decided that this would be too easy.

It wasn't Intel who made that decision. As mentioned before, they were 
not the only ones to follow this path.


> And now we have the spectacle
> of cryptosystems and so forth designed with each data block being split 
> into octets and reversed before you can process it...

That is not because little-endian is wrong, but because the 
cryptosystems usually happen to be specified with big-endian input and 
output (owing to the fact that big-endian has become the quasi-standard 
in Internet protocols). They would work just as well with little-endian 
values - but then the big-endians would have to do the byte reversal, or 
the algorithms would simply be incompatible.

So you can just as well blame this problem on the people who kept 
insisting on big-endian byte ordering.
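
For illustration, a minimal C sketch of the portable big-endian load 
such a spec implies (the name load_be32 is just illustrative, not from 
any particular API):

#include <stdint.h>

/* Assemble a 32-bit word from four big-endian (network-order) bytes.
   On a little-endian host this is exactly the byte reversal described
   above; written this way it is portable to either byte order. */
static uint32_t load_be32(const unsigned char b[4])
{
    return ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16)
         | ((uint32_t)b[2] <<  8) |  (uint32_t)b[3];
}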


>> It so happens that Intel format is actually doing it "right" in this 
>> respect: AFAIK two of the most important serial interfaces - RS232 and 
>> Ethernet - both transmit each byte starting with the least significant 
>> bit first.
> 
> Well then that would be pretty dubious as well. (AFAIK, MIDI does it the 
> correct way around.)

Again: Why would that be dubious? The only reason I can think of is 
the convention of how we write numbers - but that's just a convention, 
and, as shown above, not even an ideal one.
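
A minimal C sketch of that LSB-first bit order, with printf standing in 
for the actual line driver:

#include <stdio.h>
#include <stdint.h>

/* Emit the bits of one byte least-significant-bit first, the order in
   which RS-232 puts them on the wire. */
static void send_lsb_first(uint8_t byte)
{
    for (int i = 0; i < 8; i++)
        printf("%d", (byte >> i) & 1);  /* bit i leaves at step i */
    printf("\n");
}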

Wouldn't it be more convenient to start with the least significant 
digit, and stop the transmission once you have transmitted the most 
significant nonzero digit? If you did it the other way round, you'd 
have no way of knowing in advance how long the number will be, and you 
couldn't determine the actual value of any individual digit until after 
the transmission.
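
That scheme is essentially what little-endian variable-length integer 
encodings such as LEB128 do; a minimal C sketch, assuming a 
caller-provided output buffer:

#include <stdint.h>
#include <stddef.h>

/* Encode an unsigned integer seven bits at a time, least significant
   group first; the top bit of each output byte means "more follows".
   The stream ends right after the most significant nonzero group, so
   no length needs to be known in advance. Returns the byte count
   (at most 10 for a 64-bit value). */
static size_t encode_uleb128(uint64_t value, unsigned char *out)
{
    size_t n = 0;
    do {
        unsigned char byte = value & 0x7F;
        value >>= 7;
        if (value != 0)
            byte |= 0x80;       /* continuation bit */
        out[n++] = byte;
    } while (value != 0);
    return n;
}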


>> BTW, Intel is not the only company that preferred little-endian 
>> convention.
> 
> Indeed, the 6502 did it decades ago. Still doesn't make it right.

And your arguments so far don't make it wrong. Unconventional maybe (in 
the most literal sense) - but not wrong.

I think the terms "little-endians" and "big-endians" for adherents of 
each philosophy are pretty suitably chosen (see "Gulliver's Travels" by 
Jonathan Swift).

