  Re: For Warp  
From: clipka
Date: 26 Jun 2009 13:35:02
Message: <web.4a45066530d22038a745f7570@news.povray.org>
Darren New <dne### [at] sanrrcom> wrote:
> Mueen Nawaz wrote:
> >     Think modems (the phone ones). In the old days, it was all baud. 1
> > baud is 1bit/s. It probably has stuck since.
>
> Technically, one baud is one symbol per second. A 9600bps modem is a 2400
> baud modem with 2 symbols per baud.

What? That's nonsense; the units don't match here. You probably mean bits per
symbol (and 9600 bps at 2400 baud would actually be 4 bits per symbol, not 2).

But even after those *prehistoric* times, when transmission was almost always
binary (i.e. 1 bit per symbol), the misconception that "baud = bps" stuck
around for quite a while.
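
To put numbers to it, here's a minimal C sketch (the function and variable
names are mine, purely for illustration) of the relationship: bit rate =
symbol rate * bits per symbol.

#include <stdio.h>

/* bit rate [bit/s] = symbol rate [baud] * bits per symbol */
static long bit_rate(long symbols_per_second, int bits_per_symbol)
{
    return symbols_per_second * bits_per_symbol;
}

int main(void)
{
    /* e.g. a 2400 baud modem carrying 4 bits per symbol -> 9600 bit/s */
    printf("%ld bit/s\n", bit_rate(2400, 4));
    return 0;
}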


> >     Incidentally, I think with old modems, sending 1 byte didn't entail
> > sending 8 bits. I think they had two error correcting bits, making
> > sending 1 byte the same as 10 bits. If there's truth to that, I can see
> > why it'd make sense to specify bits.
>
> One or two start bits to synchronize, one stop bit (or sometimes 1.5 stop
> bits) to process, and perhaps a parity bit. No error correcting bits, but
> parity might give you error detection if you're lucky.

The parity would typically be used with only 7 data bits.
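
To see what that framing does to the effective throughput, here's a rough C
sketch (my own names, just illustrative), assuming a configurable number of
start, data, parity and stop bits per character:

#include <stdio.h>

/* characters transferred per second on an asynchronous serial line;
 * each character is framed as start bits + data bits [+ parity] + stop bits */
static double chars_per_second(double bit_rate, int start_bits, int data_bits,
                               int parity_bits, double stop_bits)
{
    double bits_per_frame = start_bits + data_bits + parity_bits + stop_bits;
    return bit_rate / bits_per_frame;
}

int main(void)
{
    /* 9600 bit/s, 8N1: 1+8+0+1 = 10 bits on the wire per 8-bit character */
    printf("8N1: %.0f chars/s\n", chars_per_second(9600.0, 1, 8, 0, 1.0));
    /* 9600 bit/s, 7E1: also 10 bits per frame, but only 7 data bits each */
    printf("7E1: %.0f chars/s\n", chars_per_second(9600.0, 1, 7, 1, 1.0));
    return 0;
}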


But as a matter of fact, kilo-, mega- and gigaBITS are actually much more
natural units for measuring information storage or transmission capacity.

Bits are not only commonly used with transmission media, but also in
microelectronics, where (except in the "computer proper" business) you'll
virtually never find storage capacity specified in multiples of bytes *except*
when the storage device happens to be designed to access 8 bits in parallel. In
all other cases, chips are usually specified either in multiples of bits, or in
multiples of a data word (the number of bits accessed in parallel), which would
typically be 1, 4, 8 (sometimes 9) or 16 bits.

A 2G DDR2 memory module should actually be designated as a 32Mx64 module. That
would be hard to explain to the customers, who are almost invariably locked
into the concept of a byte as the smallest unit of memory.
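
To illustrate the depth-x-width designation, a small C sketch (interpreting
the "2G" of the example above as 2 Gbit) that converts such an organization
back into a capacity:

#include <stdio.h>

int main(void)
{
    /* organization: depth (words) x width (bits per word) */
    unsigned long long depth = 32ULL * 1024 * 1024;  /* 32M words    */
    unsigned int       width = 64;                   /* 64 bits wide */

    unsigned long long bits  = depth * width;        /* 2 Gbit  */
    unsigned long long bytes = bits / 8;             /* 256 MB  */

    printf("32Mx64 = %llu bits = %llu bytes\n", bits, bytes);
    return 0;
}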

It happens to hardcore software developers, too. I once had some trouble
figuring out why an intended-to-be-portable C software library I had developed
didn't run properly on a particular microcontroller - until I found out that
the "char" data type (the smallest data type in C, typically equalling a byte)
was actually 16 bits wide on that rascal. "A byte? What's that, Sir?" That
doesn't make life any easier when you're developing a portable library for a
communications interface heavily based on the concept of the 8-bit thing.
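
For what it's worth, C does expose the actual width via CHAR_BIT in
<limits.h>; here's a minimal sketch of the kind of guard such a library could
use (the error message and the octet assumption are mine, for illustration):

#include <limits.h>
#include <stdio.h>

/* Refuse to compile on platforms where char is not an 8-bit byte,
 * since the wire protocol here is defined in terms of octets. */
#if CHAR_BIT != 8
#error "This library assumes an 8-bit char (octet-based protocol)."
#endif

int main(void)
{
    printf("char is %d bits wide here\n", CHAR_BIT);
    return 0;
}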

