Invisible schrieb:
> Indeed. Considering that C is supposedly *the* language for writing
> low-level code, I find it quite surprising that it provides absolutely
> no way to select a binary number of a specific size.
That's actually its strength... until it comes to interfacing with the
outside world, which is where things tend to get ugly.
Not commonly known: C99 also specifies that <stdint.h> shall contain
typedefs for various integer types to be used when a particular size is
desired (a short example follows the list):
- intN_t / uintN_t (e.g. int8_t) for an exact size:
These are optional, unfortunately, but that makes sense considering that
some architectures may have a smallest addressable word larger than 8 bits.
- int_leastN_t / uint_leastN_t for a certain minimum size:
These are mandatory at least for 8, 16, 32 and 64 bit.
- int_fastN_t / uint_fastN_t for the fastest type of a minimum size:
These are mandatory at least for 8, 16, 32 and 64 bit.
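A minimal sketch of how these might be used (the exact-width typedef is
the only one that can be missing on exotic hardware; the least/fast ones
are guaranteed):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint8_t       pixel   = 0xFF;     /* exactly 8 bits - optional typedef   */
    int_least16_t coord   = -12345;   /* at least 16 bits - always available */
    uint_fast32_t counter = 0;        /* fastest type with >= 32 bits        */

    for (counter = 0; counter < 10; ++counter)
        ;

    printf("pixel=%" PRIu8 " coord=%" PRIdLEAST16 " counter=%" PRIuFAST32 "\n",
           pixel, coord, counter);
    return 0;
}

The matching printf macros (PRIu8 etc.) live in <inttypes.h>, also new
in C99.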
> It doesn't even
> seem to provide a vague suggestion of what size you're going to get;
> it's just random pot luck with each compiler you try...
Oh yes, it does: <limits.h> gives you all you need to know about the
ranges of your int, short, long etc.
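For instance (assuming a hosted C99 environment), you can inspect the
limits of the plain types like this:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("bits per char: %d\n", CHAR_BIT);
    printf("int:  %d .. %d\n",   INT_MIN,  INT_MAX);
    printf("long: %ld .. %ld\n", LONG_MIN, LONG_MAX);
    printf("sizeof(short) = %zu bytes\n", sizeof(short));
    return 0;
}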
Well, /almost/ all: for some weird reason nobody seems to have bothered
mandating a standard #define to figure out whether you're on a big-endian
or little-endian machine.
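The usual workaround is a runtime check rather than a compile-time
constant; a minimal sketch, assuming nothing beyond C99:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint16_t probe = 0x0102;
    unsigned char first = *(unsigned char *)&probe;  /* lowest-addressed byte */

    if (first == 0x01)
        printf("big-endian\n");
    else if (first == 0x02)
        printf("little-endian\n");
    else
        printf("something more exotic\n");
    return 0;
}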