I’m relatively new to OpenGL programming, and most of my experience has come from DirectX. In DirectX, building a pixel shader is pretty simple to set up. I can’t seem to figure out how to get a pixel shader up and running in OpenGL. It’s okay for it to be nVIDIA-specific (GF3, GF4 only).
I can’t figure out how to get my pixel shader assembly read in; it looks like nvparse uses a proprietary language to accomplish the same thing. Is there an equivalent in OGL to the ps.1.1 language that I’m used to in DX, or should I try to port it over to the nVIDIA “texture shader” language?
Thanks for the help, guys.
The Direct3D pixel shader programs you used before cannot be used directly in OpenGL. You will need to use a combination of the following nVidia extensions to do the same thing:
NV_texture_shader, NV_texture_shader2, and, if you use a GeForce 4, NV_texture_shader3.
You will have to convert your shaders to use these APIs rather than a simple string.
Okay, I’m reading through all the docs on nVIDIA’s website trying to figure out how the texture shader stuff works. Are simple examples available anywhere? All that I can find out there are really complex examples, where I just need something simple enough to tell me how to get something up and running.
Also, do texture shaders have access to interpolated vertex data like pixel shaders in DirectX (i.e. v0, v1)? All the literature I can find talks about interpolated texture coordinates.
nvparse is like a scripting language, so if you’re used to that kind of code, then you should try to get that up and running.
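For a feel of what such a script looks like, here is a minimal register-combiner script in roughly the shape nvparse accepts — this is a sketch from memory (the !!RC1.0 profile header, a general-combiner stage block, and final output assignments), so check the nvparse documentation in the NVSDK for the exact syntax:

```
!!RC1.0
{
  rgb {
    spare0 = tex0 * col0;   # modulate texture 0 with the primary color
  }
}
out.rgb = spare0;
out.a = col0.a;
```

You hand a string like this to nvparse at runtime and it programs the combiners for you.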
I pretty much like sticking close to the API as much as possible.
You have the option of being nvidia-specific with the extensions mentioned above, but then there is GL_ARB_texture_env_combine, which is a generic interface. Not as powerful, however…
For interpolation, there is GL_ARB_vertex_weighting, and you can do many neat tricks with vertex programs. Vertex weighting is very easy to understand, in contrast.
I recommend just reading the documents for the most part: http://oss.sgi.com/projects/ogl-sample/registry/
and the docs at nvidia & ati
examples are all on developer.nvidia.com, as you may have found out already. they have TONS of stuff on there (hundreds of demos!)
Why do you want to use opengl?
There are only pixel/vertex shader equivalents for nvidia and ati hardware at the moment - both use different interfaces and syntax, whereas with DX it’s all unified into one (slightly less powerful) interface.
It really isn’t worth using opengl now - not until version 2 is released, anyway…(unless you’re working with linux/irix, in which case you’ll have to use opengl).
Thanks for your responses, guys. Well, the project that I’m working on has the requirement of using OpenGL. I’m not worried about compatibility, because it also dictates that a GF3 is the minimum spec.
I’m wondering why OpenGL doesn’t have as unified an implementation as DX8.x does (again, I’m an OGL newbie). The vertex programs look very familiar; however, I’m still completely lost on how texture shaders, combiners, etc. all fit into the picture.
The problem I keep running into on nVIDIA’s website is that I find it difficult to understand the sample stuff they have out there – it seems geared toward developers who already know how to use OGL well. I can’t find simple examples that show how all the pieces fit together. Does anybody know of any online tutorials or books on the subject?
i’ll suggest you go for the .ppt files. how to get the interfaces (meaning the extensions) is your own job, but the nvidia files should help there (the nvsdk, for example). or else, well, it’s just grabbing function pointers, so what?
for understanding how register combiners or texture shaders WORK, go for the .ppt files. those presentations explain colorfully and with sweet pics how they are designed. once you know that, you get the idea of how they work compared to pixel shaders. i’m just sorry i don’t remember the names of the files anymore (and as i have about 400 developer documents downloaded, well… you get the idea )
but i thought at least one of them is in the nvsdk…
register combiners and texture shaders are no easy task. but texture shaders at least you should understand easily… they’re basically the same as the texture-fetching part at the beginning of each pixel shader (with the embm in there, or simple sampling (tex t0, i think)).
the register combiners are then the part where you calculate with the sampled colors to produce the final result.
when I looked into the source code of nvparse, there is a code path that maps DX pixel shaders to texture shaders and register combiners (!!)
You can also parse a DX vertex shader there, which is mapped to NV_vertex_program.
You should look at it. I don’t know what DX functionality is handled, but when you look at it you should be able to figure it out.
PS: why not make DX an extension to OpenGL: WGL_MS_directx8_1