I’m not entirely sure what I want so this is going to be a vague question…
I want to render a scene (to a texture) and then have the hardware sum all the pixel values to determine the overall brightness of the scene from that perspective.
Is it possible to do this entirely on the GPU? Or do I have to read the texture back to system memory and use the CPU?
Any ideas how this could be done?
Note: I’m not very experienced with much beyond OGL 1.3 & VAR.
It’d be pretty cool if there were a way to do this, but there’s not.
If all you need is the approximate sum, you could try this:
- render your scene
- copy the scene to a texture that has automatic mipmap generation turned on
- render something that has the smallest (1x1) mipmap of this texture applied to it.
- read back a pixel from this rendering
- multiply this pixel’s color value by the number of pixels in the original scene to get an approximation of the sum of the values in the original scene.
You should compare this method against the exact answer (read the entire texture back and sum the colors on the CPU) to see whether the error is acceptable given your typical scene and what you want the sum for.
I do something the same only different.
I render to a texture and then subsample that texture by a factor of two. If you make sure your texture coordinates are right, then with linear filtering every pixel in the result is the average of the four beneath it.
You keep doing this until you get down to a 1x1 texture and that should be the average of the whole scene.
The mipmapping approach sounds good, although I’m not sure how it produces the pixel values at the lower levels. I thought it did some filtering other than plain averaging to give nicer results. Maybe not.
This way may give more accurate results although the mipmapping is probably a lot faster.
I use brute’s method, and it works fine.
> This way may give more accurate results although the mipmapping is probably a lot faster
Well, both methods are going to be off by quite a bit. After all, using our methods will only give you 256 possible pixel sums.
To make it worse, adjacent possible sums will differ by the number of pixels in the scene. Doing the sum exactly would allow for roughly nPixels * 256 possible sums, which would differ by only 1.
This accuracy problem could be avoided using floating point textures, but I don’t think those support mipmaps right now.
Thanks for all the help guys!
You’ve all given me much to think about.
I’m sure it’s obvious that I asked this for radiosity reasons… I’m currently writing my renderer in software, with thoughts of implementing it in hardware…
Anyone know of a hardware demo around?
It would be nice to see what others have accomplished.
The only radiosity demos I have seen can be found here: http://www.volny.cz/redox/
I am working on radiosity in hardware. I haven’t released a demo but here is a short animation.
Here’s another radiosity thread that you might find interesting. http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/008703.html