I’ve heard a lot of talk lately about shaders. More specifically, I’ve heard a lot about shaders in general, and apart from that I’ve heard a lot about the new vertex and pixel shaders. But I’ve also seen screenshots and heard people talking about shaders they’ve already coded on existing hardware/APIs. So what are they talking about? How do you code these, and what for? Any info would be greatly appreciated!
Usually when people talk about shaders on current hardware, they mean a script interpreter that they have written to handle texture and multipass effects.
This can include multitexturing, lightmapping, and other special effects.
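To make that concrete, here is a minimal sketch of what such a script interpreter might look like, loosely in the spirit of Quake3’s .shader files. The grammar, keywords, and struct/field names here are my own assumptions for illustration, not the actual Quake3 format:

```cpp
// Minimal sketch of a shader-script interpreter: a shader has a name
// and a list of stages, where each stage is one rendering pass.
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

struct Stage {                 // one rendering pass
    std::string map;           // texture used in this pass
    std::string blendFunc;     // GL blend mode, e.g. "GL_ONE GL_ONE"
};

struct Shader {
    std::string name;
    std::vector<Stage> stages; // one entry per pass (multitexture/lightmap/etc.)
};

// Parse a tiny subset: name { { map X } { map Y blendFunc A B } ... }
Shader parseShader(const std::string& text) {
    std::istringstream in(text);
    Shader s;
    in >> s.name;
    std::string tok;
    in >> tok;                       // outer '{'
    while (in >> tok && tok != "}") {
        if (tok == "{") {            // start of a stage block
            Stage st;
            std::string key;
            while (in >> key && key != "}") {
                if (key == "map") {
                    in >> st.map;
                } else if (key == "blendFunc") {
                    std::string src, dst;
                    in >> src >> dst;
                    st.blendFunc = src + " " + dst;
                }
            }
            s.stages.push_back(st);
        }
    }
    return s;
}
```

A two-stage script such as `textures/wall { { map wall.tga } { map lightmap.tga blendFunc GL_DST_COLOR GL_ZERO } }` would then parse into a base-texture pass plus a modulated lightmap pass; the renderer walks the stage list, sets the GL blend state per stage, and draws the surface once per pass.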
How does Quake3 sort by textures? Does it sort by the first pass, then by the second, etc.? Is there a shader defined for every face?
Quake3 appears to use multiple sorting keys. First it sorts faces by the explicit or implicit sort parameter of the shader assigned to each face. Then, within each shader, it sorts by GL state (which involves texture, depth testing, alpha testing, lighting, etc.). Faces using transparent textures are instead sorted by depth (unless explicitly indicated otherwise).
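That multi-key ordering can be sketched as a single comparison function. This is just an illustration of the scheme described above, not Quake3’s actual code; the `Face` fields and key names are assumptions:

```cpp
// Sketch of multi-key face sorting: shader sort value first, then GL
// state; transparent faces fall back to back-to-front depth instead.
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical per-face draw record (field names are assumptions).
struct Face {
    int   shaderSort;   // explicit/implicit sort parameter from the shader
    int   glStateKey;   // key summarizing texture + depth/alpha/blend state
    bool  transparent;  // transparent surfaces are depth-sorted
    float viewDepth;    // distance from the camera
};

bool drawOrder(const Face& a, const Face& b) {
    // Primary key: the shader's sort parameter.
    if (a.shaderSort != b.shaderSort) return a.shaderSort < b.shaderSort;
    // Transparent faces: sort back-to-front by depth for correct blending.
    if (a.transparent && b.transparent) return a.viewDepth > b.viewDepth;
    // Opaque faces: group by GL state to minimize state changes.
    return a.glStateKey < b.glStateKey;
}

void sortFaces(std::vector<Face>& faces) {
    std::sort(faces.begin(), faces.end(), drawOrder);
}
```

Sorting opaque faces by state key batches surfaces that share textures and GL state, which cuts down on expensive state changes; the depth fallback for transparent faces trades that batching away for correct blending order.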
Where can one find information about how to code such shaders and script interpreters? Thanks for the info!
There’s a Quake 3 Arena shader manual if you want to know how a shader engine actually works.
[This message has been edited by holocaust (edited 02-06-2001).]
Thanks! I’ll check it out.
You can also find interesting stuff at Stanford: http://graphics.stanford.edu/projects/shading
and at SGI with OpenGL Shader.
Both aim at real-time shading for OpenGL and can compile shaders from the RenderMan shading language. SGI’s OpenGL Shader was IRIX-only, while RTSL (Stanford) is available for IRIX, Linux, and Win2K. Please try it. There are associated papers with the theory and details.