Programming langauges (Message 61 to 70 of 114)
From: clipka
Subject: Re: Programming langauges
Date: 26 Oct 2009 10:59:01
Message: <4ae5b935$1@news.povray.org>
Invisible wrote:
> Captain Jack wrote:
> 
>> I remember having written a game on a portable Unix machine (with a 
>> Motorola 68010 processor) and being amazed at what happened with my 
>> save files when I moved to my first DOS machine (with an 80386 
>> processor); I hadn't ever had to deal with byte order before that. In 
>> that case, the save data was relatively small, so I re-wrote it to 
>> save in ASCII printable characters, which solved that problem. :) 
> 
> It still makes me sad that Intel chose to store bytes in the wrong order 
> all those years ago...

Define "wrong" in this context...

As a matter of fact, the only situation where byte ordering can be 
defined as "right" or "wrong" with irrefutable arguments is serial 
transmission, depending on the native bit ordering of the physical 
layer: if the physical layer sends each byte starting with the least 
significant bit, then consistency demands sending multi-byte values 
starting with the least significant byte, so that all in all the least 
significant bit of the multi-byte value is sent first; conversely, if 
the physical layer transmits the most significant bit of each byte 
first, the same reasoning mandates sending the most significant byte 
first. There are other arguments for and against both little and big 
endian, but none as compelling as serial transmission.

It so happens that Intel format is actually doing it "right" in this 
respect: AFAIK two of the most important serial interfaces - RS232 and 
Ethernet - both transmit each byte starting with the least significant 
bit first.

So in this sense the "network byte ordering" used for multi-octet data 
in most Internet standards is actually a crappy convention, as the bits 
of multi-byte data will be transmitted in an inconsistent order.
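
To make the consistency argument concrete, here is a minimal C sketch 
(just an illustration, not from any standard) that prints which bit of 
a 16-bit value goes onto an LSB-first wire at each step, for both byte 
orders:

   #include <stdio.h>

   /* Print the significance (bit number) of each bit of a 16-bit value
      in wire order, assuming the physical layer sends each byte least
      significant bit first (as RS232 and Ethernet do). */
   static void wire_order(const char *name, const int bytes[2])
   {
       printf("%-13s:", name);
       for (int i = 0; i < 2; i++)        /* bytes in transmission order */
           for (int b = 0; b < 8; b++)    /* LSB of each byte goes first */
               printf(" %2d", bytes[i] * 8 + b);
       printf("\n");
   }

   int main(void)
   {
       const int little[2] = { 0, 1 };    /* low-order byte sent first  */
       const int big[2]    = { 1, 0 };    /* high-order byte sent first */
       wire_order("little-endian", little);
       wire_order("big-endian", big);
       return 0;
   }

Little-endian prints the bit significances 0 through 15 in ascending 
order; big-endian prints 8..15 followed by 0..7 - exactly the 
inconsistency described above.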


BTW, Intel is not the only company that preferred the little-endian convention.



From: Invisible
Subject: Re: Programming langauges
Date: 26 Oct 2009 11:16:05
Message: <4ae5bd35$1@news.povray.org>
>> It still makes me sad that Intel chose to store bytes in the wrong 
>> order all those years ago...
> 
> Define "wrong" in this context...

When you read data from a file, you read the first byte first, and the 
last byte last. Therefore, the first byte should be the MSB. But no, 
Intel decided that this would be too easy. And now we have the spectacle 
of cryptosystems and so forth designed with each data block being split 
into octets and reversed before you can process it...

> It so happens that Intel format is actually doing it "right" in this 
> respect: AFAIK two of the most important serial interfaces - RS232 and 
> Ethernet - both transmit each byte starting with the least significant 
> bit first.

Well then that would be pretty dubious as well. (AFAIK, MIDI does it the 
correct way around.)

> BTW, Intel is not the only company that preferred the little-endian convention.

Indeed, the 6502 did it decades ago. Still doesn't make it right.



From: Captain Jack
Subject: Re: Programming langauges
Date: 26 Oct 2009 11:33:36
Message: <4ae5c150$1@news.povray.org>
"Invisible" <voi### [at] devnull> wrote in message 
news:4ae5bd35$1@news.povray.org...
> When you read data from a file, you read the first byte first, and the 
> last byte last. Therefore, the first byte should be the MSB. But no, Intel 
> decided that this would be too easy. And now we have the spectacle of 
> cryptosystems and so forth designed with each data block being split into 
> octets and reversed before you can process it...

That kinda sounds like saying that the only correct way to read a human 
language is left to right.

.ti ot desu t'nera yeht fi neve ,enif tsuj tfel ot thgir daer nac elpoeP

:D



From: Invisible
Subject: Re: Programming langauges
Date: 26 Oct 2009 11:43:47
Message: <4ae5c3b3$1@news.povray.org>
>> When you read data from a file, you read the first byte first, and the 
>> last byte last. Therefore, the first byte should be the MSB.
> 
> That kinda sounds like saying that the only correct way to read a human 
> language is left to right.

As far as I know, even in languages that are usually written from right 
to left, the most significant digit is still written "first" and the 
least written "last".

But hey, why worry? I'm communicating with you right now using a code 
that still has codepoints reserved for obsolete control signals like 
BEL, RS, SYN and EOT...

(Don't get me started on the whole EOL marker thing!)



From: scott
Subject: Re: Programming langauges
Date: 26 Oct 2009 11:55:08
Message: <4ae5c65c$1@news.povray.org>
> When you read data from a file, you read the first byte first, and the 
> last byte last.

OK...

> Therefore, the first byte should be the MSB.

That's not a very good argument.

Little-endian rulez!

It just seems way more natural, if you are writing any sort of 
processing with bits/bytes etc., that array index zero should be the LSB.
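
For instance, a minimal C sketch (an illustration only) of rebuilding a 
value from its bytes - with little-endian storage, the shift amount is 
simply 8 times the array index:

   #include <stdint.h>
   #include <stdio.h>

   int main(void)
   {
       /* 0x01020304 stored little-endian: index i has weight 256^i. */
       uint8_t bytes[4] = { 0x04, 0x03, 0x02, 0x01 };

       uint32_t value = 0;
       for (int i = 0; i < 4; i++)
           value |= (uint32_t)bytes[i] << (8 * i);  /* shift = 8 * index */

       printf("0x%08X\n", value);  /* prints 0x01020304 */
       return 0;
   }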



From: clipka
Subject: Re: Programming langauges
Date: 26 Oct 2009 11:59:14
Message: <4ae5c752$1@news.povray.org>
Invisible wrote:
>>> It still makes me sad that Intel chose to store bytes in the wrong 
>>> order all those years ago...
>>
>> Define "wrong" in this context...
> 
> When you read data from a file, you read the first byte first, and the 
> last byte last. Therefore, the first byte should be the MSB.

Seriously: why? Just because we Westerners write numbers this way round?

Note that this digit ordering has its drawbacks; as a practical example, 
have a look at any data table: while the text columns are usually 
left-aligned, numbers are always right-aligned, because otherwise their 
digits wouldn't line up.


> But no, Intel decided that this would be too easy.

It wasn't Intel who made that decision. As mentioned before, they were 
not the only ones to follow this path.


> And now we have the spectacle
> of cryptosystems and so forth designed with each data block being split 
> into octets and reversed before you can process it...

That is not because little-endian would be wrong, but because the 
cryptosystems usually happen to be specified to use big-endian input and 
output (owing to the fact that big-endian has become the quasi-standard 
in Internet protocols). They would work just as well with little-endian 
values - but then the big-endians would have to do the byte reversal, or 
the algorithms would simply be incompatible.

So you can just as well blame this problem on the people who kept 
insisting on big-endian byte ordering.


>> It so happens that Intel format is actually doing it "right" in this 
>> respect: AFAIK two of the most important serial interfaces - RS232 and 
>> Ethernet - both transmit each byte starting with the least significant 
>> bit first.
> 
> Well then that would be pretty dubious as well. (AFAIK, MIDI does it the 
> correct way around.)

Again: why would that be dubious? The only reason I can think of is the 
convention of how we write numbers, but that's just a convention, and - 
as shown above - not even an ideal one.

Wouldn't it be more convenient to start with the least significant 
digit, and stop the transmission once you have transmitted the most 
significant nonzero digit? If you did that the other way round, you'd 
have no way of knowing in advance how long the number will be, and 
wouldn't be able to determine the actual value of each individual digit 
until after the transmission.
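
This is exactly the idea behind varint/LEB128 encodings (used, for 
example, in DWARF and Protocol Buffers); a minimal C sketch of the 
unsigned encoder:

   #include <stdint.h>
   #include <stddef.h>

   /* Emit the least significant 7 bits first; the top bit of each byte
      says "more to follow", so transmission stops right after the most
      significant nonzero group. Returns the number of bytes written.
      E.g. uleb128_encode(624485, buf) yields E5 8E 26 (3 bytes). */
   size_t uleb128_encode(uint32_t value, uint8_t *out)
   {
       size_t n = 0;
       do {
           uint8_t byte = value & 0x7F;   /* low 7 bits */
           value >>= 7;
           if (value != 0)
               byte |= 0x80;              /* continuation flag */
           out[n++] = byte;
       } while (value != 0);
       return n;
   }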


>> BTW, Intel is not the only company that preferred the little-endian 
>> convention.
> 
> Indeed, the 6502 did it decades ago. Still doesn't make it right.

And your arguments so far don't make it wrong. Unconventional maybe (in 
the most literal sense) - but not wrong.

I think the terms "little-endians" and "big-endians" for adherents of 
each philosophy are pretty suitably chosen (see "Gulliver's Travels" by 
Jonathan Swift).



From: Invisible
Subject: Re: Programming langauges
Date: 26 Oct 2009 12:03:16
Message: <4ae5c844$1@news.povray.org>
scott wrote:

> That's not a very good argument.
> 
> It just seems way more natural if you are writing any sort of processing 
> with bits/bytes etc that array index zero should be the LSB.

...and this *is* a good argument??



Think about it. If you store the textual representation of a number in 
ASCII, the most significant digit comes first. But if Intel has their 
way, the most significant byte comes... last? And if you have to split a 
large number into several machine words, unless you number those words 
backwards, you get monstrosities such as

   04 03 02 01 08 07 06 05 0C 0B 0A 09

instead of

   01 02 03 04 05 06 07 08 09 0A 0B 0C

as it should be.

There's no two ways about it. Little endian = backwards.
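
A minimal C sketch that makes this concrete (the commented output is 
what a little-endian machine prints):

   #include <stdint.h>
   #include <stdio.h>
   #include <string.h>

   int main(void)
   {
       /* A 96-bit value split into three 32-bit words, most
          significant word first. */
       uint32_t words[3] = { 0x01020304, 0x05060708, 0x090A0B0C };
       unsigned char bytes[sizeof words];
       memcpy(bytes, words, sizeof words);

       /* On a little-endian machine this prints
          04 03 02 01 08 07 06 05 0C 0B 0A 09 */
       for (size_t i = 0; i < sizeof bytes; i++)
           printf("%02X ", bytes[i]);
       printf("\n");
       return 0;
   }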



From: Invisible
Subject: Re: Programming langauges
Date: 26 Oct 2009 12:10:47
Message: <4ae5ca07$1@news.povray.org>
>> When you read data from a file, you read the first byte first, and 
>> the last byte last. Therefore, the first byte should be the MSB.
> 
> Seriously: why? Just because we Westerners write numbers this way round?

Because the most significant digit is the most important piece of 
information. It's "most significant".

>> And now we have the spectacle
>> of cryptosystems and so forth designed with each data block being 
>> split into octets and reversed before you can process it...
> 
> That is not because little-endian would be wrong, but because the 
> cryptosystems usually happen to be specified to use big-endian input 
> and output.

Erm, NO.

This happens because most cryptosystems are (IMHO incorrectly) specified 
to *not* use big-endian encoding.

This means that if I want to implement such a system, I have to waste 
time and effort turning all the numbers backwards before I can process 
them, and turning them back the right way around again afterwards. It 
also means that when a paper says 0x3847275F, I can't tell whether they 
actually mean 3847275F hex, or whether they really mean 5F274738 hex, 
which is a completely different number.
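
The busywork in question is just this sort of thing - a small C 
byte-reversal helper (an illustration; 0x3847275F in, 0x5F274738 out):

   #include <stdint.h>

   /* Reverse the four bytes of a 32-bit word: 0x3847275F becomes
      0x5F274738, and vice versa. */
   static uint32_t bswap32(uint32_t x)
   {
       return  (x >> 24)
            | ((x >>  8) & 0x0000FF00u)
            | ((x <<  8) & 0x00FF0000u)
            |  (x << 24);
   }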

> Wouldn't it be more convenient to start with the least significant 
> digit, and stop the transmission once you have transmitted the most 
> significant nonzero digit? If you did that the other way round, you'd 
> have no way to know how long the number will be, and won't be able to 
> determine the actual value of each individual digit until after the 
> transmission.

If you start from the least digit, you *still* can't determine the final 
size of the number until all digits are received.



From: clipka
Subject: Re: Programming langauges
Date: 26 Oct 2009 12:14:26
Message: <4ae5cae2$1@news.povray.org>
scott wrote:

> Little-endian rulez!
> 
> It just seems way more natural if you are writing any sort of processing 
> with bits/bytes etc that array index zero should be the LSB.

I must say that I see benefits in both conventions.

For instance, the arbitration scheme used on a CAN bus mandates that 
bits are sent highest bit first, so that the numerical value of the data 
identifier directly governs the message priority (lower data ID = higher 
message priority).
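
A toy model of that arbitration in C (my sketch, assuming standard 
11-bit identifiers; a 0 bit is dominant and overrides a 1 on the bus):

   #include <stdint.h>
   #include <stdio.h>

   /* IDs are sent MSB first; a node that transmits a recessive 1 but
      reads back a dominant 0 drops out, so the lower ID wins the bus. */
   static uint16_t arbitrate(uint16_t id_a, uint16_t id_b)
   {
       for (int bit = 10; bit >= 0; bit--) {
           int a = (id_a >> bit) & 1;
           int b = (id_b >> bit) & 1;
           if (a != b)
               return a ? id_b : id_a;   /* the dominant 0 wins */
       }
       return id_a;                      /* identical IDs */
   }

   int main(void)
   {
       printf("winner: 0x%03X\n", arbitrate(0x123, 0x124)); /* 0x123 */
       return 0;
   }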



From: Darren New
Subject: Re: Programming langauges
Date: 26 Oct 2009 12:17:56
Message: <4ae5cbb4$1@news.povray.org>
Invisible wrote:
> As far as I know, even in languages that are usually written from right 
> to left, the most significant digit is still written "first" and the 
> least written "last".

I'm pretty sure that in Arabic, where our numerals come from, the LSB is 
written first (i.e., Arabic, being right-to-left, writes digits in the same 
order as we do).

The benefit of MSB first is that it's in "readable" order. The benefit of 
LSB first is that you can easily find the LSB in case you want to move the 
bytes from a long to a short, for example.
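
For example, a minimal C sketch of that truncation (the commented output 
is what a little-endian machine prints):

   #include <stdint.h>
   #include <stdio.h>
   #include <string.h>

   int main(void)
   {
       /* On a little-endian machine, the low-order bytes of a wider
          integer sit at its starting address, so the narrower view
          needs no offset arithmetic at all. */
       uint32_t wide = 0x01020304;
       uint16_t narrow;
       memcpy(&narrow, &wide, sizeof narrow);
       printf("0x%04X\n", (unsigned)narrow);  /* prints 0x0304 */
       return 0;
   }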

I saw one discussion that printed hex dumps with the addresses incrementing 
for the ASCII part and decrementing for the numeric part, with the address 
in a column down the middle, which made reading little endian numbers trivial.

And since it's all imaginary anyway - there is *NO ORDER* to bytes in a 
computer, just mathematical relationships between powers of a polynomial 
and addresses - any argument you make against LSB vs MSB you can also 
make against writing polynomials one way or the other.

-- 
   Darren New, San Diego CA, USA (PST)
   I ordered stamps from Zazzle that read "Place Stamp Here".



