I'm currently implementing anti-aliasing in some legacy code. I render to an FBO with GL_RGBA16F_ARB as the internal format.
To the FBO I attach two renderbuffers, one for color and one for depth. Both are allocated with glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, numSamples, internalFormat, pixelsWide, pixelsHigh); where the internal format for the depth buffer is GL_DEPTH_COMPONENT24 instead of GL_RGBA16F_ARB.
All buffers have the same NPOT dimensions, and the texture target is GL_TEXTURE_2D.
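For reference, here's a condensed sketch of that setup (msFbo, colorRb, depthRb, numSamples, pixelsWide and pixelsHigh are placeholder names, and this assumes a live GL context with glewInit() already done):

// create the multisampled FBO
GLuint msFbo, colorRb, depthRb;
glGenFramebuffersEXT(1, &msFbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, msFbo);

// multisampled 16-bit float color renderbuffer
glGenRenderbuffersEXT(1, &colorRb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, colorRb);
glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, numSamples, GL_RGBA16F_ARB, pixelsWide, pixelsHigh);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, colorRb);

// multisampled depth renderbuffer, same sample count and dimensions
glGenRenderbuffersEXT(1, &depthRb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthRb);
glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, numSamples, GL_DEPTH_COMPONENT24, pixelsWide, pixelsHigh);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depthRb);

// completeness check before rendering
if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
    printf("multisampled FBO incomplete\n");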
The multisampled FBO is then blitted to a single-sampled FBO that uses a texture as its color attachment. This texture is a regular GL_RGBA8.
That texture is then saved to a file using DevIL.
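The resolve step looks roughly like this (ssFbo is a placeholder for the single-sampled FBO; for a multisample resolve the source and destination rectangles must match exactly and the filter must be GL_NEAREST):

glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, msFbo);   // multisampled source
glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, ssFbo);   // single-sampled destination with the RGBA8 texture
glBlitFramebufferEXT(0, 0, pixelsWide, pixelsHigh,
                     0, 0, pixelsWide, pixelsHigh,
                     GL_COLOR_BUFFER_BIT, GL_NEAREST);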
I don't get ANY anti-aliasing unless numSamples is 16:
glRenderbufferStorageMultisampleEXT(GL_RENDERBUFFER_EXT, 16, internalFormat, pixelsWide, pixelsHigh);
I find this extremely weird and I really don’t get it.
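In case it helps anyone diagnose this, here's the sanity check I figure I could add (a sketch; GL_MAX_SAMPLES_EXT and GL_RENDERBUFFER_SAMPLES_EXT come from EXT_framebuffer_multisample, and the spec allows the driver to allocate more samples than requested):

GLint maxSamples = 0, actualSamples = 0;
glGetIntegerv(GL_MAX_SAMPLES_EXT, &maxSamples);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, colorRb);
glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT, GL_RENDERBUFFER_SAMPLES_EXT, &actualSamples);
printf("requested %d samples, driver allocated %d (max %d)\n", numSamples, actualSamples, maxSamples);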
Another issue is that the blit fails if the buffers' width and height approach 2k. I guess that is a memory issue, but I'm not really sure how to debug it.
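My rough plan for debugging it is something like the following (a sketch; note GL_MAX_RENDERBUFFER_SIZE_EXT is a dimension limit, so it wouldn't by itself catch an out-of-memory failure):

GLint maxRbSize = 0;
glGetIntegerv(GL_MAX_RENDERBUFFER_SIZE_EXT, &maxRbSize);
printf("GL_MAX_RENDERBUFFER_SIZE: %d\n", maxRbSize);

// flush any stale errors, then blit and see what comes back
while (glGetError() != GL_NO_ERROR) {}
glBlitFramebufferEXT(0, 0, pixelsWide, pixelsHigh,
                     0, 0, pixelsWide, pixelsHigh,
                     GL_COLOR_BUFFER_BIT, GL_NEAREST);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("blit error: 0x%x\n", err);   // GL_OUT_OF_MEMORY would confirm the memory theory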
I use the latest GLEW to handle the extensions, on an NVIDIA 8600GT. The OS is Vista with quite new drivers (a few weeks old), and the CPU is a quad-core Intel.
Hoping for help in these dark times