Shaders on ATI Radeon X600 SE

After we gave up on the ATI Radeon 9550, a customer of ours first complained about slow performance on the Radeon X600 SE; then, after upgrading the graphics driver (on Windows), the shader stopped working, just as it did on the Radeon 9550.

We are now considering swapping the blur shader for a software blur algorithm.

Why are shaders so DIFFICULT?!?!

What can we do to avoid these situations?!?!



If you post the full shader source code and the typical uniform values, maybe we can help fix it.

I guess it would really help to keep several PCs with different generations of ATi cards: X300 (SM2.0), X1950 (SM3.0, IIRC), HD2400 (SM4.0). And one PC with whatever nVidia card, plus nV_emulate, which lets you emulate all generations of nVidia cards.

Shaders on all nVidia cards are a piece of cake, btw :P.

Also, make sure to dump the shader compilation/linking error text into a text file that users can send you.
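A minimal sketch of that idea, assuming a valid GL context, a shader object that has just been through glCompileShader, and an illustrative log-file path (on Windows the glGetShader* entry points must be loaded as extension function pointers first):

```c
#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Append the compile info log of `shader` to a text file the user can
 * send back. `path` is just an example name, not a required location. */
void dump_shader_log(GLuint shader, const char *path)
{
    GLint ok = 0, len = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    if (!ok && len > 1) {
        char *log = malloc((size_t)len);
        if (!log) return;
        glGetShaderInfoLog(shader, len, NULL, log);
        FILE *f = fopen(path, "a");
        if (f) {
            fprintf(f, "shader %u failed to compile:\n%s\n",
                    (unsigned)shader, log);
            fclose(f);
        }
        free(log);
    }
}
```

The same pattern works for program linking with glGetProgramiv / glGetProgramInfoLog. ATi's compiler in particular reports errors that nVidia's more permissive compiler silently accepts, so this log is often the fastest way to diagnose a customer's machine remotely.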

… the shader doesn’t work anymore as it happens on the Radeon 9550.
What exactly is the problem?
I remember compiling my shaders fine on nVidia; then, trying them out on ATi, it turned out I had to specify floats like 0.0 instead of the integer version 0, for example.
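To illustrate that literal issue, here is a hedged GLSL 1.10-style fragment shader sketch (uniform and varying names are made up for the example):

```glsl
uniform sampler2D tex;
varying vec2 uv;

void main()
{
    // Often fails to compile on ATi's stricter GLSL compiler: '2' is an
    // int literal, and GLSL 1.10 has no implicit int -> float conversion.
    // gl_FragColor = texture2D(tex, uv) * 2;

    // Works everywhere: write the literal as a float.
    gl_FragColor = texture2D(tex, uv) * 2.0;
}
```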

What can we do to avoid this situations ?!?!

Under OpenGL? Nothing, short of testing every piece of hardware with several driver revisions, and being prepared to update your product every time ATi releases a new driver (roughly once a month).

Unless you’re using the exact codepath that major OpenGL games (idTech engines) use, you should consider your code “experimental” on any ATi product under Windows.

That’s the reality of OpenGL under Windows today. And it does not look likely to change.

Just remembered: write and check your shaders in RenderMonkey.

And a software blur algorithm will probably be even slower.
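To see why, here is a minimal sketch of a naive CPU fallback, assuming a single-channel 8-bit image and a 3x3 kernel (the function name and layout are illustrative, not the original poster's code):

```c
#include <stdlib.h>

/* Naive 3x3 box blur on a single-channel 8-bit image, clamping at the
 * edges. Every output pixel reads 9 inputs, so a WxH frame costs about
 * 9*W*H memory reads on the CPU -- which is why the software path is
 * usually slower than even a badly optimized GPU blur shader. */
unsigned char *box_blur_3x3(const unsigned char *src, int w, int h)
{
    unsigned char *dst = malloc((size_t)w * (size_t)h);
    if (!dst) return NULL;
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int sum = 0;
            for (int dy = -1; dy <= 1; dy++) {
                for (int dx = -1; dx <= 1; dx++) {
                    int sx = x + dx, sy = y + dy;
                    if (sx < 0) sx = 0;
                    if (sx >= w) sx = w - 1;
                    if (sy < 0) sy = 0;
                    if (sy >= h) sy = h - 1;
                    sum += src[sy * w + sx];
                }
            }
            dst[y * w + x] = (unsigned char)(sum / 9);
        }
    }
    return dst;
}
```

A separable two-pass version would cut the reads per pixel, but the per-frame texture upload/download still tends to dominate.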

Maybe, just maybe, precompiled ARB assembly shaders will work? (Compile them with cgc.exe.)
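For reference, an example invocation of cgc (which ships with the NVIDIA Cg toolkit); the file names here are illustrative:

```shell
# Compile a GLSL fragment shader offline into ARB_fragment_program
# assembly: -oglsl treats the input as GLSL, arbfp1 targets ARB asm.
cgc -oglsl -profile arbfp1 -o blur.fp blur.frag
```

You then load the resulting assembly with GL_ARB_fragment_program instead of the GLSL path, sidestepping the driver's GLSL compiler entirely.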
