Chambers wrote:
> Only if you assume 'char' to be a text character, rather than an 8 bit
> byte (which is what it really is, just as 'int' isn't really a true
> integer, but rather a register-sized word on whatever machine you're
> compiling for).
I really must object here for nitpicking's sake:
char is /not/ an 8-bit byte.
char is instead a data object /at least/ 8 bits in size, occupying a
whole number of the architecture's basic storage units. Its exact width
in bits is defined as CHAR_BIT in <limits.h>.
On some exotic architecture with a 7-bit storage unit, for instance,
char would have to be at least 14 bits in size: a single unit is too
small to hold the required 8 bits, so a char must span two of them.
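You can check the actual values on your own platform; here is a minimal
sketch (assuming a hosted C99 implementation, since it uses %zu):

#include <limits.h>  /* CHAR_BIT is defined here */
#include <stdio.h>

int main(void)
{
    /* sizeof(char) is 1 by definition; CHAR_BIT tells how many
       bits that one char actually holds on this platform. */
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    printf("sizeof(char) = %zu\n", sizeof(char));
    return 0;
}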
There /are/ real-world architectures, some DSPs for example, that have
16-bit char types because they cannot individually address anything
smaller.
On such architectures it is vital to note that sizeof always yields a
data object's size in units of char, so sizeof(char) is 1 even when a
char occupies 16 bits.
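To illustrate (again just a sketch; every value except sizeof(char) is
implementation-defined):

#include <stdio.h>

int main(void)
{
    /* sizeof counts in units of char, so sizeof(char) is always 1,
       even on a machine where one char occupies 16 bits. */
    printf("sizeof(char)   = %zu\n", sizeof(char));
    printf("sizeof(int)    = %zu chars\n", sizeof(int));
    printf("sizeof(double) = %zu chars\n", sizeof(double));
    return 0;
}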
Furthermore, it should be noted that signed char and unsigned char
/are/ integer types, and so are short, int, long and long long (in both
their signed and unsigned flavors; the plain forms default to signed
even without the explicit qualifier).
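A small sketch of what that means in practice (the values are
arbitrary):

#include <stdio.h>

int main(void)
{
    /* signed char and unsigned char are genuine integer types and
       take part in the usual arithmetic conversions like any int. */
    unsigned char a = 200;
    signed char   b = -5;
    int sum = a + b;     /* both operands are promoted to int first */
    printf("%d\n", sum); /* prints 195 */
    return 0;
}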
And no, "int" is /not/ a register-sized word, but rather an integer
type that has (though does not necessarily fully utilize) the "natural
size suggested by the architecture of the execution environment", while
at the same time being guaranteed to hold any integer in the range from
-32767 (sic!) to 32767, with an implementation's exact limits defined
as INT_MIN and INT_MAX in <limits.h>.
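The corresponding sketch for int (anything beyond the guaranteed
minimum range is implementation-defined):

#include <limits.h>  /* INT_MIN and INT_MAX are defined here */
#include <stdio.h>

int main(void)
{
    /* The standard guarantees INT_MAX >= 32767 and INT_MIN <= -32767;
       the asymmetric -32767 leaves room for ones' complement and
       sign-magnitude machines. */
    printf("INT_MIN = %d\n", INT_MIN);
    printf("INT_MAX = %d\n", INT_MAX);
    return 0;
}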
C data types can be real fun once you leave the safety of mainstream
PC architecture :-P