Darren New wrote:
> I also believe (but I am too lazy to look it up right now) that it's
> impossible to portably know whether
>
> struct PackBits alpha = {0, 1, 0, 0};
> struct PackBits beta = {0, 0, 0, 1};
>
> alpha or beta will yield a larger number when cast to an int. I.e., I
> don't think the standard even says whether the fields are allocated MSB
> or LSB first.
That's absolutely right: While standard C by now /does/ let you detect
the range of values a given integer type can hold (<limits.h>), how many
bits a "char" has (CHAR_BIT), and even the base, mantissa size and
exponent range of the floating-point formats (<float.h>), nothing has
ever been added that would make it possible to detect endianness at
compile time.
Multi-character constants may be a way to detect this with some
reliability, by testing e.g.
#if '\x12\x34\x56\x78' == 0x12345678
#define BIG_ENDIAN
#elif '\x12\x34\x56\x78' == 0x78563412
#define LITTLE_ENDIAN
#else
#define UNKNOWN_ENDIAN
#endif
but the C99 standard leaves the value of multi-character constants
implementation-defined (6.4.4.4), so it does not guarantee any
particular byte ordering there either.
The best you can do is include a runtime self-test routine in the code
to actively check whether compile-time endianness assumptions were right.