glDrawPixels help; glDrawPixels vs. Texture Mapping for bitmaps

I hate to have to ask this, because it seems like it ought to be quite easy. I just can't get glDrawPixels to display anything on the screen.
I use glDrawPixels(256,256,GL_COLOR_INDEX,GL_BITMAP,(unsigned char*)bitmapdata);
I get nothing. What should be equivalent code based on the MSDN help: glBitmap(256,256,0,0,0,0,(unsigned char*)bitmapdata); works… sort of. I can use it to display data with 1-bit color information (white pixels on a black background).
My data is RGBA format.
My question is, am I missing some kind of initialization? Does anyone have a tutorial for using glDrawPixels and/or glBitmap?
I can’t find this in NeHe or the Redbook… my best resources.
And… the second part of my question:
I can get texture mapping to work, and display a quad at the appropriate distance for viewing the bitmap… but it seems to me, though I could be wrong, that this wouldn't be nearly as fast as a block transfer of the pixels. Both should be supported by hardware. Also, other threads on this forum say that texturing to a quad is lossy… does anyone know the performance difference between glDrawPixels and texture mapping? (Assuming glDrawPixels uses hardware blitting.)

Thanks in advance.
Mike.

“My data is RGBA format.”
Ehm, then why do you specify GL_COLOR_INDEX and GL_BITMAP?
Try glDrawPixels(width, height, GL_RGBA, GL_UNSIGNED_BYTE, bitmapdata);
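If it still shows nothing, the raster position and unpack alignment are the usual suspects. A minimal sketch of the surrounding state (the glPixelStorei and glRasterPos2i calls are my guesses at what may be missing, not something your post confirms):

glPixelStorei(GL_UNPACK_ALIGNMENT, 1); /* assumes tightly packed rows */
glRasterPos2i(0, 0); /* if this maps outside the viewport, glDrawPixels draws nothing */
glDrawPixels(256, 256, GL_RGBA, GL_UNSIGNED_BYTE, bitmapdata);
/* then glFlush() or your buffer swap, as usual */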

If the data is not changing, downloading it as a texture and repeatedly displaying it as a textured quad should be faster. That's the most common approach for backdrops and such.
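Roughly like this, as a sketch using the fixed-function pipeline (texID and the quad coordinates are placeholders, not from your code):

/* once, at load time */
GLuint texID;
glGenTextures(1, &texID);
glBindTexture(GL_TEXTURE_2D, texID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, bitmapdata);

/* every frame */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texID);
glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
glEnd();
glDisable(GL_TEXTURE_2D);

With GL_NEAREST filtering and a quad that covers exactly 256x256 screen pixels, the texture path shouldn't be lossy, and on most consumer cards it's faster than glDrawPixels.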

Well, OK… I see your point. The very first thing I tried was just as you say, like so: glDrawPixels(256,256,GL_RGBA,GL_UNSIGNED_BYTE,(unsigned char*)bitmapdata);
Exactly like that. Nothing displayed on the screen, and then I tried glBitmap. glBitmap appeared to be reading the data, but it was reading it with 1-bit color information, and I couldn't figure out how to use color information with glBitmap. I looked in the MSDN documentation, and it claimed glBitmap should be almost identical to glDrawPixels with those options selected. So I was trying to use that as a starting point, and fix the color format later.
Either way, I can’t get glDrawPixels to actually display on the screen.
I’m having trouble following the documentation here… do I have to do anything with the frame buffer after calling glDrawPixels, as opposed to glBitmap, to actually get it to display?
Any help would be appreciated.
Thanks,
Mike.

No.
Just check the obvious…
Are you malloc'ing 256*256*4 GLubytes?
Just before the glDrawPixels call, loop through *bitmapdata to check you have correct values in there.
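Something like this, as a quick sanity check (the sizes assume your 256x256 RGBA image; the function name is just for illustration):

/* returns 1 if the buffer holds any non-zero byte,
   0 if it is all zeroes (i.e. the image probably never loaded) */
int buffer_has_data(const unsigned char *bitmapdata)
{
    int i;
    for (i = 0; i < 256 * 256 * 4; i++)
        if (bitmapdata[i] != 0)
            return 1;
    return 0;
}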

I just set up a gluOrtho2D display,
read in my bmp file, and
(just in case) set…
glRasterPos2i(0, 0);
glDrawPixels(256, 256, GL_RGB, GL_UNSIGNED_BYTE, myimage);

Try making a test image (not read-in data) like…

GLubyte myimage[256*256*3]; /* GL_RGB: three bytes per pixel */
int x, y, rgb, count = 0;

for (y = 0; y < 256; y++)
    for (x = 0; x < 256; x++)
        for (rgb = 0; rgb < 3; rgb++)
            myimage[count++] = (GLubyte)x; /* horizontal grey ramp */

gav

Got what I needed…
Like Gavin said… check the obvious.
It works now… but I still can’t figure out why it didn’t before… must have made a mistake somewhere.
Thanks for the help.
Mike.