Tools/IDE for authoring GLSL shaders

Well, I was actually thinking more of third-party ISVs…

If there were money to be made on such tools, NVIDIA and ATI would already have provided them. Since no company sees a need to put any effort behind it, that leaves us with community-provided tools.

Those are generally prone to being abandoned. And most community developers are more interested in making applications that use OpenGL than in making the use of OpenGL easier.

One problem with implementing such tools is that they require very low-level access to the hardware.

So either the graphics manufacturers provide that access to ISVs (even if one has to pay for the specifications), or they implement the tools themselves.

True. Unless you implement shader debugging in software emulation mode. I do understand that this approach has its drawbacks,
but it has its (quite significant) advantages as well.

  1. No driver dependency (thus much more stable and not vulnerable to driver bugs).
  2. Debug shaders even on hardware that doesn’t support them natively.
  3. No need to have “low level” hardware access.
    And generally speaking - unlimited flexibility.

And if this software implementation sticks very closely to the GL/GLSL standard I believe
it has a lot of value.

These might well be the reasons why M$ implemented shader debugging in software,
but this is just my guess, of course.

glslDevil’s approach (http://www.vis.uni-stuttgart.de/ger/research/pub/pub2007/gh07-strengert.pdf)
is quite different though. And it seems to be much more complicated (maybe even fragile) too, as you have
to care about the syntactic correctness of the instrumented code, and the side effects such instrumentation
has on code execution might be driver dependent…
Of course - one big advantage that it has is that it actually lets you do “real hardware debugging” as
opposed to software emulation.

Are you sure about this? I was expecting that PIX works at the card/driver level.

Nexus and PerfHUD work at the driver/card level as well.

At least for Nexus I know that you even have to use two PCs, because the debugger can “stop the world”, which conflicts with graphics card usage on Vista/Windows 7.

I like OpenGL a lot and find the API much more programmer friendly than the DirectX one.

But I am only a hobbyist when it comes to graphics programming, so I don’t have any issue switching to the dark side, as many already did. But I would rather stay with OpenGL.

But the lack of proper tools in this day and age is a real problem for OpenGL.

Intel, for example, is really bad when it comes to OpenGL. We are all well aware how bad their integrated chips are, especially the quality of their graphics drivers.

Now they have started a PR campaign to get positive feedback about their offerings and to help developers make their games run properly on their chipsets. For that they created the Intel Graphics Performance Analyzers (GPA).

http://software.intel.com/en-us/articles/intel-gpa/

It only supports DirectX.

I attended the tool’s presentation at GDC Europe, and when I asked about OpenGL support, the answer was “Give us your card and we will get back to you”. Which they never did, because, hey, I am just a hobbyist.

I recall numerous sources stating that shader debugging is implemented as software
emulation in PIX, but at the moment I can only point you to the glslDevil whitepaper.
Here is an excerpt:
“The first group includes Microsoft’s Direct3D profiling and debugging tool PIX [Mic07]
and a number of OpenGL state machine debuggers, namely spyGLass [Mag02], BuGLe [Mer04],
GLIntercept [Tre04] and the commercial gDEBugger [Gra04]. While all of these tools
provide the ability of API call tracing and logging as well as breakpointing (PIX and
gDEBugger further allow to display various performance counters and other profiling
information), only the PIX tool provides the possibility of shader debugging. However,
for shader debugging PIX relies on the software emulation of the Direct3D reference
rasterizer, i.e. no actual hardware values are debugged. In case of OpenGL some of the
above mentioned solutions provide Edit&Continue shader editing, but currently no tool
that we know of features a full-fledged shader debugger…”

I like OpenGL a lot and find the API much more programmer friendly than the DirectX one.

So do I. I find the GL specs very detailed and complete, and the overall API very consistent from one revision to the next. That’s why I prefer to stick with OpenGL. As I don’t have strict deadlines, I can afford longer development times (due to the lack of tools and support), still hoping that the situation will turn around.

Anyway, the point I was trying to make is that it is possible to build a full-fledged GLSL development environment (even with the current state of support from hardware vendors), either in software emulation mode or in true hardware mode (as proved by glslDevil). Though it would probably be much
easier to do had the hardware vendors provided lower-level access to the graphics hardware.
And the community would seem to appreciate such an IDE very much…

About software debugging of GLSL, I think Mesa has tools, but I never investigated further.

So how do you do printf from a shader? The only debugging technique I have managed to employ is changing the color of a fragment or a vertex, which is enough to kill interest in any GLSL programming.
Needless to say, my productivity debugging shaders is literally hundreds of times slower than debugging any code that is executed on the main CPU.

Well, printf-debugging does not really mean calling printf() to output anything anywhere :slight_smile: (there isn’t even a printf() function, and there are no output streams to connect to). printf-debugging means that you output the value of interest through the gl_FragColor fragment shader output variable. And that (as I understand it) is what you are already doing…

Though the process of printf-debugging is quite tedious, it has its strong side: it gives you a view of how the target value changes across the surface of the rendered primitives, helping to quickly pinpoint any incorrectness in the calculation. Something that is hard to notice when performing isolated per-pixel debugging only (using a debugger)…
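To illustrate, here is a minimal sketch of such a printf-debugging fragment shader (the varying name is hypothetical; the idea is just to replace the real output with the value of interest, remapped into the visible [0,1] range):

```glsl
// Printf-debugging sketch: instead of the real shading result, "print" the
// value under inspection by writing it to gl_FragColor.
varying vec3 vNormal;   // hypothetical varying we want to inspect

void main()
{
    // Map the [-1,1] normal into the visible [0,1] color range.
    gl_FragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);
}
```

Incorrect normals then show up immediately as discontinuities or wrong hues across the whole primitive, which is exactly the “perspective view” advantage mentioned above.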

Here’s the “printf”-style I use:
http://dl.dropbox.com/u/1969613/openglForum/debug1.jpg

The debug-text is changed every N=100 frames.

You can debug per-vertex values by using the “flat” qualifier on varyings.
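A minimal sketch of that trick (hypothetical names; “flat” requires GLSL 1.30+, and deprecated built-ins are still available at that version unless you request a forward-compatible context):

```glsl
#version 130
// "flat" disables interpolation: every fragment of a primitive shows the
// provoking vertex's value as a solid color, so per-vertex data is readable.
flat out vec4 vDebug;   // hypothetical per-vertex debug value

void main()
{
    vDebug = gl_Color;  // capture whatever per-vertex value you want to inspect
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

The matching fragment shader declares `flat in vec4 vDebug;` and writes it to gl_FragColor, as in ordinary printf-debugging.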

Yeah - this is an even more powerful way of ‘printf’-debugging :slight_smile:

Breakpoints and such can be easily improvised with the MRT method.

And how do you improvise breakpoints? By capturing a snapshot of the value of the variable of interest into its MRT slot at a specific position in the shader?
Still, as I understand it, this way of debugging requires modifications to the GL setup code (you need to set up the MRTs). With gl_FragColor you only need to make very minimal changes to your shader and you are ready to go…
Or do you have some kind of framework (debug layer) that handles the debugging setup, making it all transparent?

Yes, I capture a snapshot of the value to improvise breakpoints. I don’t have a framework, the manual patchwork generally requires 3-10 seconds, and with F5 my code can recompile all modified shaders without exiting the app, to start monitoring another var or try other calculations.
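A minimal sketch of such an MRT “breakpoint” capture (hypothetical names; it assumes the FBO has at least two color attachments and glDrawBuffers routes them):

```glsl
// MRT breakpoint sketch: the regular result goes to attachment 0, while a
// snapshot of an intermediate value is captured into attachment 1.
varying vec3 vNormal;   // hypothetical varying

void main()
{
    vec3 n = normalize(vNormal);
    float diffuse = max(dot(n, vec3(0.0, 0.0, 1.0)), 0.0);

    gl_FragData[0] = vec4(vec3(diffuse), 1.0); // the shader's normal output
    gl_FragData[1] = vec4(n, 1.0);             // "breakpoint" snapshot
}
```

Reading back or visualizing attachment 1 then shows the intermediate value at that exact point in the shader without disturbing the rendered image in attachment 0.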

Well, I agree: this is probably the most comprehensive way of GLSL debugging (especially combined with the varyings “flattening”) - given the lack of actual debugger.
Thank you for the good ideas!
1 question: GLSL does not allow outputting values to both the MRTs (gl_FragData) and gl_FragColor in the same shader. But it seems to me (from the screenshot you posted above) that the framebuffer contains the actual end fragments (not the debugged values). Do you always render to N render targets (FBOs), outputting, say, the
end fragment color to RT#1 and the debugged values to RT#2, …, RT#N, and visualize RT#1 in the end?

With my GL-wrapping libs, changing between e.g. one RGBA8 MSAAx4 target and e.g. three different-format non-AA targets is as simple as:


// one RGBA8 color target, 4x MSAA:
IL_TEX_FORMAT formats[] = { ILTEXFMT_RGBA8 };
fbo = ilCreateFBO_MSAA(SCREEN_WID, SCREEN_HEI, 1, formats, ILTEXFMT_DEPTH24, 4);

// three float color targets (MRT), no MSAA:
IL_TEX_FORMAT formats[] = { ILTEXFMT_RGBA32F, ILTEXFMT_RGBA32F, ILTEXFMT_RGBA16F };
fbo = ilCreateFBO_MSAA(SCREEN_WID, SCREEN_HEI, 3, formats, ILTEXFMT_DEPTH24, 1);

Btw,
http://developer.amd.com/gpu/PerfStudio/Pages/default.aspx
I don’t have a Vista or Win7 PC around to test it, plus it’s only GL3.0 for now.

I am searching for a shader debugging/development tool too. This thread seems helpful, but I can’t figure out what the best choice is for now. RenderMonkey?

Ilian, since I am running Windows 7, I downloaded and checked that one. It is not bad, since it offers frame profiling and debugging. But unfortunately the shader debugger is only for HLSL and DirectX :frowning:

Sorry for double posting but I really feel the need to praise two of the pieces of software mentioned in this topic.

The first is RenderMonkey, which, even though it does not allow you to debug shaders, lets you visualize them on various models and through various passes. RenderMonkey allowed me to confirm that there was nothing wrong with my shader, since it did have the intended effect on the models there.

The second one is glslDevil. That’s one nasty piece of software. Thanks to it I managed to run my shaders line by line and figure out that all the vec4 uniform variables had not been initialized correctly, due to my incorrect usage of glUniform4fv in my main OpenGL program.

The bottom line is: thanks to the people who posted in this thread, I found 2 great tools.

P.S.: Thankfully I have not come across any problems with glslDevil yet. Everything ran smoothly on my machine, from GLTrace to shader debugging. My machine runs Windows 7 Ultimate edition, equipped with an NVIDIA GTX 260.

glslDevil just doesn’t work for me.

The best we could do so far with it was to see vertex and fragment source code.

The debugger buttons are always greyed out.

We tried various projects, and multiple system configurations.

It just doesn’t work.

  1. Make sure that your system meets the system requirements for debugging shaders: your video card must support NV_transform_feedback for vertex shader debugging.
  2. Check that there are no OpenGL errors during the rendering.
  3. Make sure you render your geometry using the debuggable calls.

Also make sure that you use the latest version (1.1.5); it is much more stable than the version I described in the original post (1.1.3b).

An update to my original post (at the start of this thread):

glslDevil works very well for me now (all the problems I described in the original post seem to be fixed) and it has become a critical tool in my toolset.
I decided to give it another try recently as I was struggling with one of my shaders, and it helped me understand and fix a very hard-to-detect bug that I had been helplessly trying to fix with printf-debugging for weeks!

Frankly: it saves me a lot of time, and I can hardly imagine developing my shaders without it any more…

Hi,

I did manage to make it work, and everything works as expected.

One thing disappoints me though: it doesn’t seem to work on Radeon HD hardware.

I understand Geometry shader debugging is supported just the same way as Fragment shaders. But when I click the debug button with a Geometry shader on a Radeon HD, I get a GL_INVALID_OPERATION error, and glslDevil displays a dialog box saying “Could not debug shader, an error occured”.

I’m disappointed because I expected Geometry shader debugging to work fine with ATI cards. I don’t see nVidia anywhere in glslDevil requirement list for this purpose.

Cheers
Fred
