The BI_BITFIELDS bitmap type
Everyone knows how to deal with bitmaps in Windows, right? The ones that are described by BITMAPINFOHEADER?
Enter a seldom-used bitmap mode called BI_BITFIELDS.
Most bitmaps in Windows have their biCompression field set to BI_RGB, which just indicates a straight indexed or direct-mapped RGB encoding. This encompasses indexed color formats of 2, 16, or 256 colors, as well as 16-bit, 24-bit, and 32-bit direct color encodings. That handles virtually everything that you'll ever encounter, except for one gotcha. People who are new to BITMAPINFOHEADER often screw up in handling the 16-bit format, because they don't realize that it actually holds only 15 significant bits -- five bits each of red, green, and blue, with the high bit unused. This is known as a 555 or 1555 encoding. Despite having only 15 significant bits, the format still uses a bit depth value of 16, which is what confuses newbies.
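To make the layout concrete, here's a minimal sketch (my own illustration, not VirtualDub's code) of unpacking a 16-bit BI_RGB pixel; note that bit 15 is simply ignored:

```cpp
// Unpack a 16-bit BI_RGB pixel, which is X1R5G5B5 (the high bit is unused).
#include <cstdint>

struct RGB24 { uint8_t r, g, b; };

RGB24 Unpack555(uint16_t px) {
    uint8_t r5 = (px >> 10) & 0x1F;   // bits 14..10
    uint8_t g5 = (px >>  5) & 0x1F;   // bits  9..5
    uint8_t b5 =  px        & 0x1F;   // bits  4..0
    // Expand each 5-bit channel to 8 bits by replicating the top bits.
    return { uint8_t((r5 << 3) | (r5 >> 2)),
             uint8_t((g5 << 3) | (g5 >> 2)),
             uint8_t((b5 << 3) | (b5 >> 2)) };
}
```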
Windows does actually support a true 16-bit encoding, which consists of five bits each of red and blue and six bits of green, or a 565 encoding. You can't get to it via BI_RGB, however; you have to use the special BI_BITFIELDS compression value instead. This adds three bit masks after the biClrImportant field that specify the exact bit locations of the red, green, and blue fields. When these bit masks are 0000F800, 000007E0, and 0000001F, respectively, the bitmap uses a 565 16-bit encoding. Windows GDI supports this format because a lot of popular graphics hardware used to support only a single 16-bit frame buffer format, and 565 was one of the common ones.
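Here's a minimal sketch of what that looks like in practice, assuming a Win32 build; the three DWORD masks sit where the color table would normally go, immediately after the BITMAPINFOHEADER:

```cpp
// Describe a 565 bitmap using BI_BITFIELDS (illustration only).
#include <windows.h>

struct BitfieldsInfo {
    BITMAPINFOHEADER hdr;
    DWORD            masks[3];   // red, green, blue masks
};

BitfieldsInfo Make565Header(LONG width, LONG height) {
    BitfieldsInfo bi = {};
    bi.hdr.biSize        = sizeof(BITMAPINFOHEADER);
    bi.hdr.biWidth       = width;
    bi.hdr.biHeight      = height;                    // positive = bottom-up
    bi.hdr.biPlanes      = 1;
    bi.hdr.biBitCount    = 16;
    bi.hdr.biCompression = BI_BITFIELDS;
    bi.hdr.biSizeImage   = ((width * 2 + 3) & ~3) * height;  // rows padded to 4 bytes
    bi.masks[0] = 0x0000F800;   // red
    bi.masks[1] = 0x000007E0;   // green
    bi.masks[2] = 0x0000001F;   // blue
    return bi;
}
```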
The reason I bring this up is that I recently started rewriting the part of VirtualDub that handles BI_BITFIELDS images as part of some other work. You might be wondering if anyone uses BI_BITFIELDS encoding, especially in AVI files, and the answer is that it's extremely rare. However, one thing I've learned from working on this program is that almost anything that is possible will eventually show up in the wild; for instance, I once had to fix a crash caused by trying to import images produced by a popular SNES emulator that used BITMAPCOREHEADER instead of BITMAPINFOHEADER in the format block. Anyway, a particularly interesting way you can get BI_BITFIELDS in a video stream is to use GraphEdit to write out a decompressed video from a decompression filter that supports 565 for display performance reasons and happens to have it listed first in the format list of its output pin. Therefore, it's one of the things I've been regression testing.
...Only to discover that most other programs don't handle BI_BITFIELDS properly to begin with.
It turns out that there are only four direct color formats that Windows GDI is guaranteed to handle: 16-bit (555), 16-bit (565), 24-bit (888), and 32-bit (888). Win9x OSes will only handle BI_BITFIELDS images with these four encodings. Of these, three are accessible via BI_RGB, with 565 being the only one that absolutely requires the bitfield encoding. Presumably for this reason, it seems that just about every video technology I have installed was written with the assumption that BI_BITFIELDS always means 565, including core components of Microsoft DirectShow like the Color Space Converter. If you use BI_BITFIELDS with the 555 masks (00007C00 / 000003E0 / 0000001F), the video player still tries to decode it as 565 and displays garbage. Hey guys, know those other fields in the bitmap structure? You're supposed to check them!
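The check those players are skipping is cheap. A minimal sketch (my own hypothetical helper, not anything from DirectShow) that classifies the masks instead of assuming 565:

```cpp
// Look at the actual BI_BITFIELDS masks rather than guessing 565.
#include <windows.h>

enum class BitfieldFormat { Unknown, RGB555, RGB565, RGB888_32 };

BitfieldFormat ClassifyBitfields(const BITMAPINFOHEADER& hdr, const DWORD masks[3]) {
    if (hdr.biCompression != BI_BITFIELDS)
        return BitfieldFormat::Unknown;

    if (hdr.biBitCount == 16) {
        if (masks[0] == 0x7C00 && masks[1] == 0x03E0 && masks[2] == 0x001F)
            return BitfieldFormat::RGB555;
        if (masks[0] == 0xF800 && masks[1] == 0x07E0 && masks[2] == 0x001F)
            return BitfieldFormat::RGB565;
    } else if (hdr.biBitCount == 32) {
        if (masks[0] == 0x00FF0000 && masks[1] == 0x0000FF00 && masks[2] == 0x000000FF)
            return BitfieldFormat::RGB888_32;
    }
    return BitfieldFormat::Unknown;   // reject or fall back -- don't decode as 565
}
```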
Where things get really fun, however, is when you take into account that Windows NT GDI is considerably more powerful and will accept arbitrary non-overlapping, contiguous bit masks. Current versions of VirtualDub fall back to GDI's BitBlt() when encountering a BI_BITFIELDS video stream, and thus if you are running on Windows NT4/2000/XP/Vista, you'll be able to open most BI_BITFIELDS encoded videos. I wrote a test program to convert a BMP image sequence to BI_BITFIELDS encoded AVI files with user-specified bit masks, and GDI handled almost every mask combination I threw at it, including other popular formats such as BGR 555 and RGB 444, and even 2/10/10/10. I say almost, because GDI doesn't seem to like it if you use a zero bit mask for a channel, although interestingly it doesn't mind if you use a 0x10000 mask in a 16-bit format. That last one is ultimately useless, but I list it here for completeness and since it was trivial to get working in the rewritten bitfields code.
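For what it's worth, handling arbitrary contiguous masks generically is not much code. A minimal sketch of the idea (my own illustration, not the actual rewritten VirtualDub path): derive the shift and width from each mask, then expand every channel to 8 bits.

```cpp
// Generic decode of one channel given an arbitrary contiguous bit mask.
#include <cstdint>

struct Channel { int shift; int bits; };

Channel AnalyzeMask(uint32_t mask) {
    Channel ch = { 0, 0 };
    if (!mask) return ch;                 // zero mask: GDI rejects these anyway
    while (!(mask & 1)) { mask >>= 1; ++ch.shift; }
    while (mask & 1)    { mask >>= 1; ++ch.bits;  }
    return ch;                            // assumes the mask is contiguous
}

uint8_t ExpandTo8(uint32_t px, const Channel& ch) {
    if (!ch.bits) return 0;
    uint32_t max = (1u << ch.bits) - 1;
    uint32_t v   = (px >> ch.shift) & max;
    // Scale an N-bit value to 0..255 with rounding.
    return uint8_t((v * 255 + max / 2) / max);
}
```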
I didn't have much of a point in writing this other than sharing what I'd found during testing, but if you were looking for a conclusion here, I'd say: make sure you validate the bit masks when processing BI_BITFIELDS, and if you write bitmaps or video with that format, only do so for 565 and otherwise use BI_RGB whenever possible.