Orchid Win7 v1 <voi### [at] dev null> wrote:
> And yet, this is one of the few programming languages on Earth which
> doesn't guarantee how many bits are in a particular data type, and
> provides no way to specify what you actually want.
C doesn't want to force the compiler to generate inefficient code
behind the scenes just to support some particular type width. For
example, if the target hardware has no native 32-bit integers, C
doesn't require the compiler to emulate 32-bit arithmetic for its
basic types; the implementation is free to map the types onto whatever
widths the hardware handles efficiently.
In principle C tries to be as portable as possible in the sense that it
makes very few assumptions about the target hardware. It doesn't assume
that a byte is exactly 8 bits (only that it's at least 8), and it makes
no assumptions about the bitness of the target platform. (It doesn't
even assume that integers use 2's complement representation: before C23
the standard also permitted sign-magnitude and ones'-complement.)
What it guarantees are minimum ranges rather than exact sizes: char is
at least 8 bits, short and int at least 16, long at least 32, and each
type's range includes that of the previous one, so in practice
sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long).
--
- Warp