#defined bitflags and enums - peaceful coexistence in “C”


I have just discovered the joy of bitflags. I have several questions related to "best-practices" regarding the use of bitflags in C. I learned everything from various examples I found on the web but still have questions.

In order to save space, I am using a single 32-bit integer field in a struct (A->flag) to represent several different sets of boolean properties. In all, 20 different bits are #defined. Some of these are truly presence/absence flags (STORAGE-INTERNAL vs. STORAGE-EXTERNAL). Others have more than two values (e.g. a mutually exclusive set of formats: FORMAT-A, FORMAT-B, FORMAT-C). I have defined macros for setting specific bits (and simultaneously turning off mutually exclusive bits). I have also defined macros for testing whether specific combinations of bits are set in the flag.

However, what is lost in the above approach is the specific grouping of flags that is best captured by enums. For writing functions, I would like to use enums (e.g., STORAGE-TYPE and FORMAT-TYPE), so that function definitions look nice. I expect to use enums only for passing parameters and #defined macros for setting and testing flags.

  (a) How do I define flag (A->flag) as a 32-bit integer in a portable fashion (across 32-bit / 64-bit platforms)?

  (b) Should I worry about potential size differences in how A->flag vs. #defined constants vs. enums are stored?

  (c) Am I making things unnecessarily complicated, meaning should I just stick to using #defined constants for passing parameters as ordinary ints? What else should I worry about in all this?

I apologize for the poorly articulated question. It reflects my ignorance about potential issues.


As others have said, your problem (a) is resolvable by using <stdint.h> and either uint32_t or uint_least32_t (if you want to worry about Burroughs mainframes which have 36-bit words). Note that MSVC does not support C99, but @DigitalRoss shows where you can obtain a suitable header to use with MSVC.
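In concrete terms, the declaration might look like this (the struct layout is only a sketch of what I assume yours resembles):

#include <stdint.h>

struct A
{
    uint32_t flag;      /* 20 bit flags packed into one 32-bit field */
    /* ...other members... */
};

If you need to cater for machines without an exact 32-bit type, substitute uint_least32_t for uint32_t.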

Your problem (b) is not an issue; C will convert the types safely for you where necessary, and it probably won't even be needed.

The area of most concern is (c), and in particular the format sub-field, where only 3 values are valid. You can handle this by allocating 3 bits and requiring that the 3-bit field holds one of the values 1, 2, or 4 (any other value is invalid because too many or too few bits are set). Or you could allocate a 2-bit number, and specify that either 0 or 3 (or, if you really want to, one of 1 or 2) is invalid. The first approach uses one more bit (not currently a problem since you're only using 20 of 32 bits) but is a pure bitflag approach.
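For illustration, the two encodings might be written like this (the names and bit positions are assumptions, not anything from your code):

/* Option 1: one bit per format; exactly one of the three must be set */
#define FORMAT_A        0x0010
#define FORMAT_B        0x0020
#define FORMAT_C        0x0040

/* Option 2: a 2-bit number stored in bits 4-5; 1, 2, 3 valid, 0 invalid */
#define FORMAT_SHIFT    4
#define FORMAT_BITS     (0x3 << FORMAT_SHIFT)
#define FORMAT_A_VAL    (1 << FORMAT_SHIFT)
#define FORMAT_B_VAL    (2 << FORMAT_SHIFT)
#define FORMAT_C_VAL    (3 << FORMAT_SHIFT)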

When writing function calls, there is no particular problem writing:

some_function(FORMAT_A | STORAGE_INTERNAL, ...);

This will work whether FORMAT_A is a #define or an enum (as long as you specify the enum value correctly). The called code should check whether the caller had a lapse in concentration and wrote:

some_function(FORMAT_A | FORMAT_B, ...);

But that is an internal check for the module to worry about, not a check for users of the module to worry about.
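That internal check can be as simple as verifying that exactly one format bit is set - a sketch, using the FORMAT_MASK defined further down:

#include <stdbool.h>
#include <stdint.h>

static bool format_is_valid(uint32_t flags)
{
    uint32_t fmt = flags & FORMAT_MASK;
    return fmt != 0 && (fmt & (fmt - 1)) == 0;  /* exactly one format bit set */
}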

If people are going to be switching bits in the flags member around a lot, the macros for setting and unsetting the format field might be beneficial. Some might argue that the pure boolean fields barely need them, though (and I'd sympathize). It might be best to treat the flags member as opaque and provide 'functions' (or macros) to get or set all the fields. The less people can get wrong, the less will go wrong.
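If you do provide accessors for the pure boolean flags, the classic set/clear/test macros are all that's needed (the STORAGE_INTERNAL value here is just an assumed example):

#define STORAGE_INTERNAL        0x0001

#define SET_FLAG(flag, bit)     ((flag) |= (bit))
#define CLEAR_FLAG(flag, bit)   ((flag) &= ~(bit))
#define TEST_FLAG(flag, bit)    (((flag) & (bit)) != 0)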

Consider whether using bit-fields works for you. My experience is that they lead to big code and not necessarily very efficient code; YMMV.
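For comparison, the bit-field version of the same idea would be along these lines (a sketch; the field layout is implementation-defined, which is one reason to be wary of it):

struct A_bits
{
    unsigned int storage_internal : 1;   /* boolean flags take one bit each */
    unsigned int format           : 2;   /* small enumerated field */
    /* ...remaining fields, up to your 20 bits... */
};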

Hmmm...nothing very definitive here, so far.

  • I would use enums for everything because those are guaranteed to be visible in a debugger where #define values are not.
  • I would probably not provide macros to get or set bits, but I'm a cruel person at times.
  • I would provide guidance on how to set the format part of the flags field, and might provide a macro to do that.

Like this, perhaps:

enum { ..., FORMAT_A = 0x0010, FORMAT_B = 0x0020, FORMAT_C = 0x0040, ... };
enum { FORMAT_MASK = FORMAT_A | FORMAT_B | FORMAT_C };

#define SET_FORMAT(flag, newval)    (((flag) & ~FORMAT_MASK) | (newval))
#define GET_FORMAT(flag)            ((flag) & FORMAT_MASK)

SET_FORMAT is safe if used accurately but horrid if abused. One advantage of the macros is that you could replace them with a function that validated things thoroughly if necessary; this works well if people use the macros consistently.
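Accurate use means remembering that SET_FORMAT yields a value rather than modifying its argument, so the result has to be assigned back (the handler function below is made up):

A->flag = SET_FORMAT(A->flag, FORMAT_B);

if (GET_FORMAT(A->flag) == FORMAT_B)
    handle_format_b(A);     /* hypothetical handler for FORMAT_B data */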