I don’t want it to be part of the OpenGL spec itself, but I do want an ARB_fragment_lighting extension! Does anyone know if the EXT_fragment_lighting extension is still being worked on? SGI seems to have stopped development on it three years ago. Now that we have hardware capable of doing per-pixel work, such an extension would be ideal!
Honestly, we really ought to have per-pixel lighting without having to jump through hoops (like programming Register Combiners directly). A glLightModel(GL_PHONG_SHADING) setting would be a good idea, and quite doable given current hardware.
Well, Phong shading can be done per-vertex, so it’s actually not what we want.
As I said, ARB_fragment_lighting is the way to go, IMO (take a look at the EXT_fragment_lighting spec; it’s very interesting). The only problem is that the number of supported lights will likely be very limited.
“Well, Phong shading can be done per-vertex, so it’s actually not what we want.”
You can’t really do Phong shading per-vertex (OK, you can hack it with a texture, but that’s not a real lighting computation). Gouraud shading is defined as computing the lighting per-vertex and interpolating the resulting vertex colors across the polygon. Phong shading is defined as interpolating the normals across the polygon and doing the lighting computation per-fragment.
I would also suggest that any per-pixel lighting extension provide a new type of texture map: a bump map. The extension should rigorously define the texture’s format. This form of bump mapping should not be tangent-space bump mapping (which can be somewhat incorrect in certain situations), but actual normal + offset bump mapping. There should only be one bump map active at any one time (to make it easier on implementations).
Given all of these hardware-specific extensions floating around, we developers have to test for each extension and code against each one, just to get an effect that all of the hardware can easily handle. It would be nice if the hardware vendors got together and created a set of widely supported extensions for common effects. They’ve given us lots of general-purpose tools, but what we really want are specific effects.
I was thinking of Phong shading as calculating the Phong lighting term per-vertex and then interpolating it linearly, as opposed to OpenGL’s current method: the lighting equation OpenGL uses is Blinn’s (well, actually only the specular terms differ between Phong and Blinn).
And yes, IMO normal maps can be easily integrated into a state machine such as OpenGL. It’s just the same as with everything else: vertex, color, normal, texture, …
I could even imagine a new texture target: GL_NORMAL_MAP_2D or something like that. This way no new functions are needed, and one could continue to use the old ones: glBindTexture(GL_NORMAL_MAP_2D, id), …
Just a small thought,
I think that Phong shading pretty much equals per-pixel lighting. If the colors are interpolated between the vertices instead, it becomes just like Gouraud shading.
One of the current techniques is to use register combiners, but that involves using pre-generated textures, which is nothing to get excited about IMO.
There are a lot of basic needs that OpenGL has not addressed yet … sadly.
There’s a difference between Phong shading and the Phong lighting model. Phong shading means you interpolate the normal and evaluate the lighting per pixel, using whatever lighting model you want. The Phong lighting equation is actually pretty bad; since it doesn’t contain any specular colour, everything tends to look plastic. It differs from OpenGL’s in the way it computes the specular term, though; OpenGL’s approach can be more efficient under certain circumstances.