OpenGL background gradient

Hi there!
I am developing an application that allows you to view the effects of different real time lighting models on different 3d meshes. I am trying to get a gradient background color like something you would see in Zbrush. Here is how I am currently trying to do this

void TestGLCanvas::Render()
{
	wxPaintDC dc(this);

#ifndef __WXMOTIF__
	if (!GetContext()) return;
#endif

	SetCurrent();

	// Init OpenGL once, but after SetCurrent
	if (!m_init)
	{
		InitGL();
		m_init = true;
	}
	glFrustum(-0.5f, 0.5f, -0.5f, 0.5f, 1.5f, 15.0f);//1

	/* clear color and depth buffers */

	//Gradient background
	glMatrixMode (GL_MODELVIEW); 
	glPushMatrix (); 
	glLoadIdentity ();
	glMatrixMode (GL_PROJECTION);
	glPushMatrix (); 
	glLoadIdentity ();


	glBegin (GL_QUADS);
	glVertex3f (-1.0f, -1.0f, -1.0f); 
	glVertex3f (1.0f, -1.0f, -1.0f); 

	glVertex3f (1.0f, 1.0f, -1.0f); 
	glVertex3f (-1.0f, 1.0f, -1.0f); 
	glEnd ();

	glPopMatrix (); 
	glMatrixMode (GL_MODELVIEW); 
	glPopMatrix ();

	//draw our mesh
	// ...
}

It just doesn’t seem to work, and the result ends up looking weird, like this:

I am using GLSL! Maybe that is messing things up?
Any help would be appreciated!

If shaders are active, they will override any fixed-function shading unless you reimplement those features in your shader. If you post your shader code (or the relevant portion), it would help.

If you’re using shaders, glEnable(GL_LIGHTING) doesn’t do anything.

Are you concerned about the black triangle in the background? If I were to guess: I notice you’re using the identity matrix for both the projection and modelview. That is perfectly fine, but it makes clip space a -1 to +1 box in every direction, so with z = -1.0 your quad sits exactly flush against a z clip plane, and it can get clipped away. This might be problematic.

Either move it slightly inside the range, to something like -0.9999, or, since you’re not writing to the depth buffer anyway, just set it to 0.
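For illustration, the background pass could look something like this (a fixed-function sketch, untested; the two gradient colors are placeholder values):

```cpp
// Sketch: draw the background quad at z = 0 in clip space so it
// cannot land exactly on a clip plane; keep it out of the depth buffer.
glDisable(GL_DEPTH_TEST);
glDepthMask(GL_FALSE);   // don't write depth for the background

glBegin(GL_QUADS);
glColor3f(0.1f, 0.1f, 0.3f);    // bottom color (placeholder)
glVertex3f(-1.0f, -1.0f, 0.0f);
glVertex3f( 1.0f, -1.0f, 0.0f);
glColor3f(0.6f, 0.6f, 0.8f);    // top color (placeholder)
glVertex3f( 1.0f,  1.0f, 0.0f);
glVertex3f(-1.0f,  1.0f, 0.0f);
glEnd();

glDepthMask(GL_TRUE);
glEnable(GL_DEPTH_TEST);
```

Since the matrices are identity at that point, the quad always covers the full viewport and the per-vertex colors are interpolated into the gradient.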

So what’s the best thing to do at this point?
Should I try to implement this gradient background in a shader?
Can you give me an idea of how I might do this in a shader?

Should I do something like this vertex shader?

uniform vec3 ambient[4];
uniform vec3 diffuse[4];
uniform vec3 specular[4];
uniform vec3 hhat[4];
uniform vec3 vp[4];
uniform vec3 sceneAmbient;

void main(void)
{
	vec3 mEm = vec3(gl_FrontMaterial.emission);
	vec3 mAmb = vec3(gl_FrontMaterial.ambient);
	vec3 mDif = vec3(gl_FrontMaterial.diffuse);
	vec3 mSpec = vec3(gl_FrontMaterial.specular);
	float mShine = gl_FrontMaterial.shininess;

	vec3 color = mEm + sceneAmbient * mAmb;
	// transform the normal into eye space to match the light vectors
	vec3 normal = normalize(gl_NormalMatrix * gl_Normal);

	for (int i = 0; i < 4; ++i) {
		color += ambient[i] * mAmb;

		float df = max(0.0, dot(vp[i], normal));
		color += diffuse[i] * mDif * df;

		float sf = df <= 0.0 ? 0.0 : max(0.0, dot(hhat[i], normal));
		sf = pow(sf, mShine);
		color += specular[i] * mSpec * sf;
	}

	gl_FrontColor = vec4(color, 1.0);
	gl_Position = ftransform();
}

Would I load a shader like this directly before drawing my gradient background, then switch to the shader I intended to use when I go to render my 3d mesh?

First of all, did you try moving the quad off the far plane to get rid of the black triangle in the corner?

Second, you have to decide how the lighting of the background should actually work. Looking at a ZBrush lighting tutorial, it appears that adding or moving lights in the scene doesn’t affect the background image at all. As such, I don’t think any lighting of consequence is going on there.

My advice is to make a vertex shader that just passes the color through and a fragment shader that just uses it. Specify the colors with glColor() and the GPU will interpolate the gradient across the quad. Or you don’t even need a shader: just turn shaders and GL_LIGHTING off and you’ll get the same effect.
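As a sketch, the pass-through pair could be as simple as this (legacy GLSL builtins, untested):

```glsl
// vertex shader: pass the per-vertex color straight through
void main(void)
{
	gl_FrontColor = gl_Color;     // color set with glColor()
	gl_Position = ftransform();   // matrices are identity, quad stays fullscreen
}

// fragment shader: use the interpolated color
void main(void)
{
	gl_FragColor = gl_Color;
}
```

Bind this program (or glUseProgram(0) to fall back to fixed function) before drawing the quad, then bind your lighting program before drawing the mesh.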