Deferred shading with GLSL

hi,

how do you do deferred shading with GLSL? I need to store 7 components in a 64-bit color: the first two in the R component, the next two in the G component, etc… But I don’t know how to pack them or how to unpack them. To get a 64-bit color buffer I use a simple pbuffer. I use ATI Radeon 9800 Pro and NVIDIA GeForce FX 5900 XT cards.
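One common trick for this is to quantize each value to 8 bits and combine two of them arithmetically into a single 16-bit channel. A minimal GLSL sketch, assuming each source value is already in [0,1]; the helper names `pack2`/`unpack2` are made up for illustration:

```glsl
// Hypothetical helpers: pack two [0,1] values, quantized to 8 bits each,
// into one channel of a 16-bit-per-component render target.
float pack2(float a, float b)
{
    // high byte from a, low byte from b
    return floor(a * 255.0) * 256.0 + floor(b * 255.0);
}

vec2 unpack2(float p)
{
    float hi = floor(p / 256.0);
    float lo = p - hi * 256.0;
    return vec2(hi, lo) / 255.0;
}
```

Caveat: if the 64-bit target is a fixed-point RGBA16 format, each channel is clamped to [0,1], so you would additionally divide the packed value by 65535.0 before writing and multiply it back when reading; on a float target the raw packed value can be stored directly. Packing like this also breaks hardware blending and filtering on that buffer.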

thx.

You can do packing on NV only: make a 128-bit float render target and pack eight 16-bit half-float numbers into it via the PK2H instruction.

For ATI, I assume MRT (multiple render targets) would solve this, but I think they haven’t exposed it yet.

MRT has been exposed for ages.

http://oss.sgi.com/projects/ogl-sample/registry/ATI/draw_buffers.txt

MRT has been exposed for ages.
And what about the glslang API?

This extension (ATI_draw_buffers) will be included in OpenGL 2.0 as an ARB extension, so it “isn’t vendor specific”.

Originally posted by Korval:
[quote]MRT has been exposed for ages.
And what about the glslang API?[/quote]
Take a look at the new GLSL 1.1 specification. There are new built-in outputs for the fragment shader: gl_FragData[i].
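For example, the G-buffer pass can write one vec4 per color attachment through gl_FragData. A sketch assuming two attachments have been selected on the C side with glDrawBuffersATI, and a driver that exposes gl_FragData; the varying names are invented:

```glsl
// G-buffer fragment shader: fill two render targets in one pass.
// Requires ATI_draw_buffers and gl_FragData support (GLSL 1.1).
varying vec3 eyeNormal;
varying vec3 eyePosition;

void main()
{
    // target 0: surface normal, remapped from [-1,1] to [0,1]
    gl_FragData[0] = vec4(normalize(eyeNormal) * 0.5 + 0.5, 1.0);
    // target 1: eye-space position
    gl_FragData[1] = vec4(eyePosition, 1.0);
}
```

The lighting pass then reads both targets back as textures, so no manual bit packing is needed when MRT is available.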

Sorry about the bad info on draw buffers. You can use it via ARB_fragment_program today, so it looks like it’s possible.

And as for the NV40, it supports ATI_draw_buffers (emulated on earlier hardware).

And when GLSL 1.1 support is added …

Originally posted by Corrail:
This extension (ATI_draw_buffers) will be included in OpenGL 2.0 as an ARB extension, so it “isn’t vendor specific”.
What GL 2.0?

http://www.opengl.org/about/arb/notes/meeting_note_2004-03-02.html
