Dual src blending on AMD, driver issues?


I am trying to port a D3D application to OpenGL (Linux), but I have run into several issues with dual-source blending.

1/ Failed to set the correct index from GLSL (with separate shader objects). Reproduced on fglrx 12.3 and fglrx 12.4 (both Linux).
In my GLSL I set the index with a layout qualifier:

// Same buffer but 2 colors for dual source blending
layout(location = 0, index = 0) out vec4 Target0;
layout(location = 0, index = 1) out vec4 Target1;

When I check the index with glGetFragDataIndex, I get:

Frag0 index 0
Frag1 index 0

I tried glBindFragDataLocationIndexed instead. It seems to work, at least glGetFragDataIndex then returns the correct index. Unfortunately, blending is still broken.
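For reference, this is roughly the workaround sequence I mean; the `program` handle and output names are the ones from the shader above, and note the bindings only take effect after the next link:

```c
/* Sketch of the API-side workaround: bind the indices explicitly
 * instead of relying on the layout qualifiers.  `program` is assumed
 * to be an existing fragment program object; the bindings are only
 * applied at link time. */
glBindFragDataLocationIndexed(program, 0, 0, "Target0"); /* location 0, index 0 */
glBindFragDataLocationIndexed(program, 0, 1, "Target1"); /* location 0, index 1 */
glLinkProgram(program);

/* Verify what the driver actually stored */
GLint index0 = glGetFragDataIndex(program, "Target0"); /* expected: 0 */
GLint index1 = glGetFragDataIndex(program, "Target1"); /* expected: 1 */
```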

2/ It seems the index 1 value is not computed correctly. Reproduced on fglrx 12.3.
I reused a g-truc sample (330-blend-index), which I hacked to run several tests. I output a constant color in the shader, the same for both indices, for example a red square (1,0,0,0) or (1,0,0,1).

If I use a single program (not separate shaders), everything is fine.


If I use the separate shader infrastructure, it no longer works. For test purposes I replaced SRC1 by SRC in the blend function (remember, both colors output by the shader are the same) to validate the correctness of the output. So we can conclude that the second color seen by the blending unit is undefined, although both colors are set in the shader.
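To make the SRC1-vs-SRC comparison concrete, this is the kind of blend state I mean; the exact blend factors here are illustrative, not necessarily the ones from the g-truc sample:

```c
/* Since both shader outputs carry the same constant color, the two
 * glBlendFunc lines below should render identically.  On fglrx with
 * separate shaders, only the GL_SRC_COLOR path gives the expected
 * image, suggesting the index-1 color is undefined. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC1_COLOR, GL_ONE_MINUS_SRC1_COLOR);  /* dual-source path: broken   */
/* glBlendFunc(GL_SRC_COLOR, GL_ONE_MINUS_SRC_COLOR);    reference path: works      */
```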

I didn’t find any information about an interaction between the blend_func_extended and separate_shader_objects extensions. Maybe I missed something.
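For clarity, the failing combination boils down to something like the following; the shader string and names are illustrative, but this is the structure of the repro:

```c
/* A separate fragment program with dual-source outputs, bound through
 * a program pipeline.  Both outputs are the same constant red. */
static const char *fs_src =
    "#version 410 core\n"
    "layout(location = 0, index = 0) out vec4 Target0;\n"
    "layout(location = 0, index = 1) out vec4 Target1;\n"
    "void main() {\n"
    "    Target0 = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "    Target1 = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "}\n";

GLuint prog = glCreateShaderProgramv(GL_FRAGMENT_SHADER, 1, &fs_src);

GLuint pipeline;
glGenProgramPipelines(1, &pipeline);
glUseProgramStages(pipeline, GL_FRAGMENT_SHADER_BIT, prog);
glBindProgramPipeline(pipeline);

/* On fglrx 12.3/12.4 this returns 0 instead of the expected 1: */
GLint idx = glGetFragDataIndex(prog, "Target1");
```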

I can attach a standalone test program (Linux, GLUT) or an apitrace trace for issue 1/.
For issue 2/, I can send an apitrace trace. It would be too complicated to extract the hacked program from the g-truc sample, but I could still develop a new one.
