glDrawPixels() Help!

I’m brand new to OpenGL, so of course I’m having issues.

I have a 24-bit bitmap file, which is an 8x8 checker pattern, black and white. I just want to draw it to the screen using glDrawPixels:

#include <iostream>
#include <fstream>
#include <gl/glut.h>

using namespace std;

void display(void)
{
ifstream inFile;
inFile.open("checker.bmp", ios::in | ios::binary);
char contents[246];
inFile.read(contents, sizeof(contents));

glClearColor(1.0, 1.0, 1.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);

glRasterPos2f(0.0, 0.0);
glDrawPixels(8, 8, GL_UNSIGNED_BYTE, GL_RGB, &contents[54]);

glFlush();
}

int main(int argc, char* argv[])
{
glutInit(&argc, argv);
glutInitDisplayMode (GLUT_SINGLE);
glutInitWindowSize (600, 600);
glutInitWindowPosition (200, 100);
glutCreateWindow ("Test Program…");
glutDisplayFunc(display);
glutMainLoop();
return EXIT_SUCCESS;
}

The window gets drawn white and my bitmap is nowhere to be found.

Also, I read that bitmaps are stored BGR style, but passing GL_BGR isn’t recognized by the compiler (???). For black and white it shouldn’t matter, though, since my pixels are stored like FFF000FFF000FFF… anyway. The pixel data starts at byte 54 in the file.

I’m sure there’s a more elegant way to read the file, etc, but I don’t see why this shouldn’t work.

http://www.opengl.org/documentation/specs/man_pages/hardcopy/GL/html/gl/drawpixels.html

Look at the signature for glDrawPixels(): the format parameter (GL_RGB) comes ahead of the type (GL_UNSIGNED_BYTE), so your two arguments are swapped.
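Also, since GL_BGR won't compile for you, you can swap the blue and red bytes yourself before handing the buffer to glDrawPixels — a rough sketch, untested against your file (it assumes tightly packed 24-bit pixels starting at the offset you mentioned):

```cpp
#include <cstddef>

// Swap the B and R components of `count` 24-bit BGR pixels in place,
// turning a BGR buffer into one glDrawPixels can take as GL_RGB.
void bgr_to_rgb(unsigned char* pixels, std::size_t count)
{
    for (std::size_t i = 0; i < count; ++i) {
        unsigned char* p = pixels + i * 3;
        unsigned char tmp = p[0];  // save blue
        p[0] = p[2];               // red into the first slot
        p[2] = tmp;                // blue into the last slot
    }
}
```

Then something like `bgr_to_rgb((unsigned char*)&contents[54], 64);` before the draw call should do it for your 8x8 image.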

I also think you're missing some setup steps.
I would create an init function that does the following things:
Set the viewport. Read the image into a global, then close the file — that way you only read it once, not on every redraw. Initialize the projection/modelview matrices to put the origin in the right spot.
The function would look something like this:


char contents[246];  // 54-byte BMP header + 8*8 pixels * 3 bytes (24-byte rows need no padding)

void init(void) {
	ifstream inFile;
	inFile.open("checker.bmp", ios::in | ios::binary);
	inFile.read(contents, sizeof(contents));
	inFile.close();

	glViewport(0, 0, 600, 600);  // x, y, width, height -- match your window

	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(0.0, 600.0, 0.0, 600.0, -1.0, 1.0);  // left, right, bottom, top, near, far
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	glClearColor(1.0, 1.0, 1.0, 0.0);
}

The redbook, even the old online version, has some good examples. If this doesn’t make sense, then look at those.
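One more aside: instead of hard-coding byte 54, the BMP file header itself stores the pixel-data offset as a little-endian 32-bit value at bytes 10–13, so you can read it out — a sketch, assuming a well-formed file:

```cpp
#include <cstdint>

// Pull the pixel-data offset out of a BMP file header.
// The offset lives at bytes 10-13, stored little-endian.
std::uint32_t bmpPixelOffset(const unsigned char* header)
{
    return static_cast<std::uint32_t>(header[10])
         | static_cast<std::uint32_t>(header[11]) << 8
         | static_cast<std::uint32_t>(header[12]) << 16
         | static_cast<std::uint32_t>(header[13]) << 24;
}
```

For a plain 24-bit BMP with no palette this comes out as 54, but reading it keeps you honest if the header ever grows.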

Thanks guys! I forgot about setting up matrices…

So… now I’m trying to draw with color and my B/R components are swapped. Like I mentioned, GL_BGR is giving me a compiler error (undeclared identifier)…

This leads me to yet another problem, more related to C++ language I would think - I read that GL_BGR isn’t recognized by versions older than OpenGL 1.2, so I was trying to print my version of OpenGL with:

GLubyte* version = glGetString(GL_VERSION);
cout.write(version, 16);

Which gives these errors for each respective line:

error C2440: cannot convert from ‘const GLubyte *’ to ‘GLubyte *’
error C2664: cannot convert parameter 1 from ‘GLubyte *’ to ‘const char *’

I’ve tried many different combinations of reinterpret_cast to no avail.

std::string m_strVersion;
m_strVersion = (const char *)glGetString(GL_VERSION);

Any good?

const GLubyte* version = glGetString(GL_VERSION);

No good!

Abdallah DIB’s code gives this:

cannot convert parameter 1 from ‘const GLubyte *’ to ‘const char *’

This is very frustrating, because by looking at the code it should be converting a ‘const GLubyte*’ (the return type) to a ‘const GLubyte*’… there are no char*'s!!!

And scratt’s code crashes at runtime, pulling up an error window that says:

Debug Assertion Failed! … invalid null pointer

Anyone?

glGetString won’t work until you call glutCreateWindow(…), so that may explain the null pointer.

Am I wrong or is this already the case?

glutCreateWindow ("Test Program…");
glutDisplayFunc(display);

glutCreateWindow is called first…

I think cout.write wants a const char*, so that’s where that error comes from. Putting these two lines either in the main function after glutCreateWindow(…) or in the display function works for me.

const GLubyte * m_strVersion = glGetString(GL_VERSION);
cout.write((const char*)m_strVersion, 16);

Okay! Great thanks for the help.

It works for me now (although it’s GLubyte, not GLuByte, right?). I realized that I was trying to call it from an init function that I was running before createWindow. My bad!

So… my version is 1.5.0 Build 7, yet GL_BGR still isn’t recognized… is there another restriction on this option I’m not seeing?

Here’s a total guess:
The headers don’t match the drivers that are actually running. I know I’ve seen a couple posts about not being able to get access to better than 1.1 on Windows, and this might be the same thing. Try searching for those posts and see what turns up.
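If it is the stale-header problem, one workaround is to define the constant yourself — GL_BGR and GL_BGR_EXT share the enum value 0x80E0, which is fixed by the spec, so this should at least let your code compile (the driver still has to support it at runtime):

```cpp
// The Windows gl.h ships with OpenGL 1.1 declarations only; GL_BGR was
// added in 1.2, but its enum value is fixed by the spec, so defining it
// manually is safe as long as the driver actually supports 1.2+.
#ifndef GL_BGR
#define GL_BGR 0x80E0
#endif
```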

So it’s GL_BGR_EXT, which isn’t mentioned in the reference guide for glDrawPixels, but whatever, it works and I’m happy.

Thanks to everyone!