I modified the “Render to a Texture” tutorial at GameTutorials.com to render the depth buffer to the texture instead of the colour buffer. For good measure, I decided to incorporate the ARB_depth_texture extension. Normally the program ran at more than 700 fps, but after my modifications I got 8 fps. After a bunch of fooling around, I found that turning off FSAA restored the correct framerate of >700 fps. Does anyone know why this occurs?
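For context, the depth render-to-texture path I'm describing looks roughly like this. This is only a sketch assuming an already-created GL context; the function names, texture size, and the fallback `#define` are mine, not from the tutorial:

```c
/* Sketch of rendering the depth buffer into a texture via
 * ARB_depth_texture + glCopyTexSubImage2D. Requires a live GL context;
 * WIDTH/HEIGHT and function names are placeholders of my own. */
#include <GL/gl.h>
#include <stddef.h>

#define WIDTH  512
#define HEIGHT 512

/* GL_DEPTH_COMPONENT24_ARB comes from ARB_depth_texture; check the
 * extension string before relying on it. */
#ifndef GL_DEPTH_COMPONENT24_ARB
#define GL_DEPTH_COMPONENT24_ARB 0x81A6
#endif

GLuint CreateDepthTexture(void)
{
    GLuint depthTex;
    glGenTextures(1, &depthTex);
    glBindTexture(GL_TEXTURE_2D, depthTex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* Allocate a depth-format texture; no pixel data uploaded yet. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24_ARB,
                 WIDTH, HEIGHT, 0, GL_DEPTH_COMPONENT,
                 GL_UNSIGNED_INT, NULL);
    return depthTex;
}

void CopyDepthToTexture(GLuint depthTex)
{
    /* After rendering the scene, copy the depth buffer (instead of the
     * colour buffer) into the bound depth texture. */
    glBindTexture(GL_TEXTURE_2D, depthTex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, WIDTH, HEIGHT);
}
```

With FSAA on, the multisampled depth buffer may not match the texture's depth format, which is where I suspect the slow path kicks in.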
I’m using a Radeon 9700 Pro with the Catalyst 3.0a drivers, BTW. . .
Sounds like a driver bug. I don’t get that behaviour, though; I see only a small performance decrease from enabling FSAA, something like 156 fps -> 142 fps.
Which drivers are you using?
It could be the framebuffer format vs. the texture format. If they are incompatible, there might be an expensive swizzle going on, perhaps even in software back on the CPU.
Where’d you find that driver, Humus? I’m using 6255, which is the latest version on ATI’s site. . .
Well, it’s up on the developer site, but it has been leaked for almost as long as it’s been up there; a quick Google search for “ati 6275” gave me several links.
Ah. . . those. . . Thanks!