POV-Ray : Newsgroups : povray.off-topic : C/C++ Data Type Ambiguity Backwards : Re: C/C++ Data Type Ambiguity Backwards
  Re: C/C++ Data Type Ambiguity Backwards  
From: clipka
Date: 23 Aug 2015 08:52:59
Message: <55d9c22b@news.povray.org>
Am 23.08.2015 um 10:54 schrieb Orchid Win7 v1:
> On 21/08/2015 01:12 PM, clipka wrote:
>> Okay, I guess everyone who has ever touched C or C++ has at least heard
>> rumors of this: The standard data types, such as "int", "short int" or
>> "long int", are anything but.
> 
> If I'm remembering my history right, C was basically invented to write
> Unix in. From the very beginning, it was a programming language
> *specifically designed* for system programming.
> 
> You know, the kind of programming where knowing exactly how many bits
> you're dealing with is 100% critical.

That's only true /some/ of the time; and for those cases, C has bit
fields, which provide far more fine-grained control than any
guaranteed-exact-size type system could ever give you.

More to the point, system programming is the kind of programming where
performance is 100% critical /all/ of the time - and where you therefore
want to use the machine's native data types almost everywhere, rather
than some guaranteed-exact-size type system that might impose an
unnecessary overhead on your particular machine.

Also, it was designed back in the times when "portability" wasn't equal
to "interchangeability"; who cared whether your system used the same
inode size as anyone else - you wouldn't physically mount its hard drive
into another machine anyway. You wouldn't even physically mount your
removable storage media on any other machine. You only /had/ that one
machine.

Networking - yeah, that might have been a bit tedious; but back then it
was only an ever so tiny portion of the work (and as mentioned before,
bit fields would be your friend there; ever tried to assemble a raw IP
frame in Java?). Most data transfer to the outside world would have been
to and from terminals, over links using character-based data transfer,
with hardware that would automatically trim your smallest native data
type ("char") to however many bits per character the serial link was
configured for - which more often than not would have been 7 rather
than 8.


> And yet, this is one of the few programming languages on Earth which
> doesn't guarantee how many bits are in a particular data type, and
> provides no way to specify what you actually want.
> 
> Does that seem weird to anybody else??

No, not really, for the above reasons.

