2009-03-18, 00:21
After reading a bit about OpenGL and endianness (here is a good link: Apple's OpenGL Universal Binary Programming Guidelines), my understanding is that when using the GL_BGRA format, one should use the GL_UNSIGNED_INT_8_8_8_8_REV type (and not GL_UNSIGNED_BYTE).
With the GL_BGRA format, GL_UNSIGNED_BYTE gives the same result as GL_UNSIGNED_INT_8_8_8_8_REV on little-endian systems.
But that's not the case on big-endian systems!
I tried this fix on my Mac mini and the colors are fine when playing video :-)
Can someone who really knows about OpenGL confirm that it's the right thing to do?