Possibility to query a shader for resource usage?

(Sorry if this has come up before, but I couldn’t find anything on Google, nor via the (throttled) search function on the forum.)

I know I can query the hardware for the max number of interpolants (vp->fp varyings), but how do I verify that my shaders don’t overuse those resources, without actually running them on min-spec hardware?
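For reference, this is the half I already have. Querying the limits is straightforward; a minimal sketch, assuming a GL 2.0 context and a loader such as GLEW that exposes these enums:

```c
/* Minimal sketch: dump the resource limits the current context reports.
 * Assumes a GL 2.0 context and headers/loader exposing these enums
 * (e.g. GLEW). */
#include <GL/glew.h>
#include <stdio.h>

static void print_shader_limits(void)
{
    GLint varyings = 0, vs_uniforms = 0, fs_uniforms = 0, tex_units = 0;

    glGetIntegerv(GL_MAX_VARYING_FLOATS,              &varyings); /* vp->fp interpolants */
    glGetIntegerv(GL_MAX_VERTEX_UNIFORM_COMPONENTS,   &vs_uniforms);
    glGetIntegerv(GL_MAX_FRAGMENT_UNIFORM_COMPONENTS, &fs_uniforms);
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS,         &tex_units);

    printf("max varying floats:         %d\n", varyings);
    printf("max vertex uniform comps:   %d\n", vs_uniforms);
    printf("max fragment uniform comps: %d\n", fs_uniforms);
    printf("max texture image units:    %d\n", tex_units);
}
```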

We hit a bug today due to this, and my dev system is fancier than some of our targets, so I’d like to catch this kind of problem as soon as I write my shaders.

I realize that on a scalar architecture (e.g. G80) the results may differ vis-à-vis a “classic” vectorized GPU, but is there any way to get at these numbers at all? (Proprietary extensions or otherwise?)

I could of course write a GLSL parser myself, but …

Do you need to check at runtime, or just want to check how they turn out on the hardware side of things? If the latter, you can use a tool like GPU ShaderAnalyzer (for ATI cards).

Runtime, preferably.

The idea was to integrate it into a testing mode (together with GLSLValidate, which we already use) to make sure all shaders conform to our specs.

Manually checking via a GUI application is not really an option, since we can’t enforce such a policy, and also because we do some text preprocessing (#include handling and more), so the complete shader source is only available at runtime.
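The closest I’ve gotten with plain GL is summing up the active uniforms after linking and comparing against the limit. A rough sketch follows; the component counting is approximate by design, since it ignores driver packing, covers only a few GLSL types here, and can’t see how a shared uniform splits across stages:

```c
/* Rough sketch of a runtime check: after linking, sum the components of
 * the active uniforms and compare against the implementation limit.
 * Approximate: ignores driver packing, handles only a few GLSL types,
 * and can't see the per-stage split of a shared uniform. */
#include <GL/glew.h>
#include <stdio.h>

static GLint components_for_type(GLenum type)
{
    switch (type) {
    case GL_FLOAT:      return 1;
    case GL_FLOAT_VEC2: return 2;
    case GL_FLOAT_VEC3: return 3;
    case GL_FLOAT_VEC4: return 4;
    case GL_FLOAT_MAT4: return 16;
    /* ...remaining GLSL types elided... */
    default:            return 4;  /* conservative guess */
    }
}

static int check_uniform_budget(GLuint program)
{
    GLint count = 0, limit = 0, used = 0;

    glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);
    glGetIntegerv(GL_MAX_FRAGMENT_UNIFORM_COMPONENTS, &limit);

    for (GLint i = 0; i < count; ++i) {
        char   name[256];
        GLint  size = 0;
        GLenum type = 0;
        glGetActiveUniform(program, (GLuint)i, sizeof(name),
                           NULL, &size, &type, name);
        used += size * components_for_type(type);
    }

    if (used > limit)
        fprintf(stderr, "uniforms: %d components used, limit is %d\n",
                used, limit);
    return used <= limit;
}
```

The same pattern works for vertex attributes via GL_ACTIVE_ATTRIBUTES / glGetActiveAttrib, but the vp->fp interpolant count is exactly the part I can’t see this way, which is why I’m asking.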
