bitmap animations

Ok, so I’m new at OpenGL. Seems like it should be pretty easy, but I’m stumped already.

I have raw RGB data. I know the width and height, and I can create a bitmap file (providing the DIB and palette) easily from the data.

What I would like to do is simply create a 2D surface, and display the image. I don’t need to map it onto a cube and rotate it in space! I just want to see if I can map the image to a simple surface.

Eventually I will animate the surface with a series of bitmaps, but for now, I just want to show one. Every example I see uses 24bpp, and I'd rather not convert (time is important).

Any information would be greatly appreciated.

Thanks,

mmm, to display something in OpenGL you should draw it on a polygon. In your case you have to put the bitmap into a texture and then map the texture onto a camera-aligned polygon (quad).

You will find tons of tutorials/examples on how to load a texture and draw it on a quad.
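The skeleton is roughly this (untested sketch, plain desktop GL with immediate mode; texWidth, texHeight and pixels stand in for whatever your data gives you, and it assumes an identity projection/modelview so the quad fills the viewport):

// one-time setup: create a texture object from raw RGB data
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texWidth, texHeight, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);

// each frame: draw a screen-aligned quad with the texture on it
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex);
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();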

You can send a 24bpp texture to the driver, but the video card needs 32bpp textures because of alignment, so the driver will convert the texture from 24 to 32 bits internally.
If speed is critical you need to specify your problem better (how fast and how big). But generally a modern computer has no problem uploading decent-sized (up to 1024x1024) video.
You can use a double-buffered PBO (pixel buffer object) to go a bit faster, but in that case it is a bit more complex.
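The double-buffered upload looks roughly like this (untested sketch, needs GL 2.1 or the pixel buffer object extension; pbo[2], tex, nextFrame and frameSize are placeholders you'd create yourself). The idea is that while the texture is being filled from one PBO, you write the next frame into the other, so the copy and your CPU work overlap:

// texture is already bound; upload from the "front" PBO
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo[frame % 2]);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RGB, GL_UNSIGNED_BYTE, 0);            // source = bound PBO, offset 0

// meanwhile fill the "back" PBO with the next frame
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo[(frame + 1) % 2]);
glBufferData(GL_PIXEL_UNPACK_BUFFER, frameSize, 0, GL_STREAM_DRAW); // orphan old storage
void *dst = glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
memcpy(dst, nextFrame, frameSize);                        // or decode straight into it
glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);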

Thanks Rosario,

More specifics: I’m trying to implement something that will work on Android, iPhone, Windows Mobile… fairly low-end devices, though they support OpenGL. My raw data is 16bpp, streamed from a server. All I really want to do is have GL manage the double buffering and draw the “frames” at some number of frames per second.

I’m stuck on the 16bpp. I have found numerous examples of loading 24bpp bitmaps from a file and mapping them onto a texture, but it’s not obvious what parameters to pass to glTexImage2D for my data.

Thanks,

glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGB8,                 // internal format: what will be stored/used on the hardware
             width, height, 0,
             GL_RGB,                  // or GL_BGR, depends on how your 16-bit *data is ordered
             GL_UNSIGNED_SHORT_5_6_5, // also depends on how your 16-bit *data is ordered (I admit I don’t know the actual difference with GL_UNSIGNED_SHORT_5_6_5_REV)
             data);
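And for the animation part, once the texture exists you can just overwrite its texels each frame with glTexSubImage2D using the same format/type, then redraw the quad (untested sketch; tex and nextFrameData are placeholders for your own names):

// per frame: replace the texel data in the existing texture, then redraw the quad
glBindTexture(GL_TEXTURE_2D, tex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RGB, GL_UNSIGNED_SHORT_5_6_5, nextFrameData);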

Hey, that works!! Thanks!

+_+