Nicolas Alvarez <nic### [at] gmailcom> wrote:
> clipka wrote:
> > Happens to hardcore software developers, too. I once had some problems
> > figuring out why some intended-to-be-portable C software library I had
> > developed didn't run properly on a particular microcontroller - until
> > finding out that a "char" data type (the smallest data type in C,
> > typically equalling a byte) was actually 16 bits wide on that rascal. "A
> > byte? What's that, Sir?" Doesn't make life easier when you're developing a
> > portable library for a communications interface heavily based on the
> > concept of the 8-bit thing.
> A C 'char' type MUST be exactly 1 byte long (in particular, sizeof(char)
> MUST be 1). However, a C implementation may define "byte" with more than 8
> bits.
Does the C standard use the word "byte", or does it simply say that
sizeof(char) must always be 1 (without specifying a name for the unit)?
Of course, if you need to take into account the number of bits in your
integral types (including char) and you want your C program to be fully
portable even to exotic embedded systems, you have to use the CHAR_BIT
constant defined in limits.h rather than assume it's always 8.
--
- Warp