I have an algorithm for converting YUV to RGB (it is part of a JPEG decoder). The algorithm was originally written for some hardware. Now, when I run the same algorithm on my PC under Windows XP, the decoded image looks inverted.

Now my question: are YUV-to-RGB algorithms platform dependent?

Did you check the endianness (little endian vs. big endian) of the two hardware setups? Or perhaps the output is stored as BGR instead of RGB…
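A quick way to test the BGR theory is to swap the first and third channel of each pixel and see whether the colors come out right. A minimal sketch in C (`swap_red_blue` is a hypothetical helper, not part of any library):

```c
#include <stddef.h>

/* Hypothetical helper: swap the R and B channels of an interleaved
 * 24-bit pixel buffer in place, turning RGB data into BGR (or back). */
static void swap_red_blue(unsigned char *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        unsigned char tmp  = pixels[3 * i];     /* first channel  */
        pixels[3 * i]      = pixels[3 * i + 2]; /* third channel  */
        pixels[3 * i + 2]  = tmp;
    }
}
```

If that fixes it, the decoder and the display code simply disagree on channel order.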


If you are using libjpeg, check the jmorecfg.h file.
Maybe it really is BGR :smiley:

#define RGB_RED		0	/* Offset of Red in an RGB scanline element */
#define RGB_GREEN	1	/* Offset of Green */
#define RGB_BLUE	2	/* Offset of Blue */
#define RGB_PIXELSIZE	3	/* JSAMPLEs per RGB scanline element */
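Those offsets are how libjpeg's color-conversion code decides where each channel lands in the output scanline; a build configured for BGR would use `RGB_RED 2` and `RGB_BLUE 0` instead. A small sketch of the idea (`put_pixel` is a made-up illustration, not a libjpeg function):

```c
/* Offsets as in libjpeg's jmorecfg.h, configured for RGB order.
 * A BGR build would swap the RED and BLUE offsets. */
#define RGB_RED       0
#define RGB_GREEN     1
#define RGB_BLUE      2
#define RGB_PIXELSIZE 3

/* Hypothetical sketch: store one decoded pixel into a scanline
 * using the configured channel offsets. */
static void put_pixel(unsigned char *row, int x,
                      unsigned char r, unsigned char g, unsigned char b)
{
    unsigned char *p = row + x * RGB_PIXELSIZE;
    p[RGB_RED]   = r;
    p[RGB_GREEN] = g;
    p[RGB_BLUE]  = b;
}
```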

I wrote a little YUV-to-RGB converter in GLSL for my company, and found that I needed a fair amount of math to turn the equations in the FourCC documentation into a YUV-vector-to-RGB-vector transform matrix. Unfortunately it doesn’t just “fall into the matrix”. Things looked awfully funny before I worked that out; perhaps it’s the same thing you’ve got. Can we see a screenshot?
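For reference, the usual full-range (JFIF) YCbCr-to-RGB equations with BT.601 coefficients, which any such matrix has to reproduce, look like this in C. Note the transform is affine, not a pure 3x3 matrix, because Cb and Cr are centered on 128 (a sketch; clamping and rounding choices vary between decoders):

```c
/* Clamp a value to the 0..255 range with rounding. */
static unsigned char clamp8(double v)
{
    if (v < 0.0)   return 0;
    if (v > 255.0) return 255;
    return (unsigned char)(v + 0.5);
}

/* Full-range (JFIF) YCbCr -> RGB, BT.601 coefficients.
 * The matrix is applied to (Y, Cb - 128, Cr - 128). */
static void ycbcr_to_rgb(unsigned char y, unsigned char cb, unsigned char cr,
                         unsigned char *r, unsigned char *g, unsigned char *b)
{
    double Y  = y;
    double Cb = cb - 128.0;
    double Cr = cr - 128.0;
    *r = clamp8(Y + 1.402    * Cr);
    *g = clamp8(Y - 0.344136 * Cb - 0.714136 * Cr);
    *b = clamp8(Y + 1.772    * Cb);
}
```

If your hardware version used studio-range (16–235) YCbCr or different coefficients, the PC output will look off until the constants match.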