How exactly should I go about doing this? Everything I've tried seems to result in leakage.
As long as you delete the program object and the shader objects attached to it, you should not get any leaks. (Also make sure the render context the objects were created on is current when you delete them.)
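For reference, the teardown sequence being described looks roughly like this — a sketch assuming `program`, `vertShader`, and `fragShader` are the handles from shader creation (at the time of this thread the equivalent ARB-extension calls were `glDetachObjectARB`/`glDeleteObjectARB`):

```c
/* Make the context the objects were created on current before deleting. */

/* Detach first: deleting a shader while it is still attached only
 * flags it for deletion; it isn't freed until it is detached. */
glDetachShader(program, vertShader);
glDetachShader(program, fragShader);

glDeleteShader(vertShader);
glDeleteShader(fragShader);
glDeleteProgram(program);
```

Deleting the program also implicitly detaches its shaders, so detach-then-delete and delete-program-last both end with everything freed.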
Also, how do you know you are “leaking” shaders?
Well, I am doing what you're suggesting. I know (I think?) that I'm leaking something because I'm recompiling the shaders in the application (recompile, reattach, relink) after detaching/deleting, and every time this happens memory usage increases a bit. I'm definitely not ruling out user error here, but it seems more and more unlikely given the apparent simplicity of the process suggested.
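For what it's worth, a recompile pass that reuses the existing handles (rather than deleting and creating new objects each cycle) sidesteps most of these lifetime questions. A minimal sketch, assuming `shader` and `program` are the handles from the original compile and `newSource` is the updated source string:

```c
/* Upload new source into the SAME shader object and recompile.
 * No new GL objects are created, so nothing new can leak. */
glShaderSource(shader, 1, &newSource, NULL);
glCompileShader(shader);

GLint compiled = GL_FALSE;
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
/* ...on failure, fetch and print the info log... */

/* The shader is still attached from the first compile; just relink. */
glLinkProgram(program);
```

If memory still climbs with this approach, that points more strongly at the driver than at object bookkeeping in the application.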
Shameless plug here people, look away:
Perhaps you could run your app with GLIntercept running? (Download from here: http://glintercept.nutty.org/)
Then, when copying the .ini file across, edit it and turn on:
Enabled = True;
(so you are not actively logging every call)
Then on shutdown, any shader leaks that are detected are written to gliLog.txt, or if you are in a debug environment (like Visual C++) the memory leaks detected are also dumped to your output log.
The glslang spec requires that drivers allocate lots of client memory for shaders and linked programs. They have to keep copies of all the strings you pass in.
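You can see one piece of that client-side storage directly: the driver has to be able to hand your source strings back on request. A sketch, assuming `shader` is a compiled shader handle:

```c
/* The driver kept a concatenated copy of every string passed to
 * glShaderSource; SHADER_SOURCE_LENGTH includes the null terminator. */
GLint len = 0;
glGetShaderiv(shader, GL_SHADER_SOURCE_LENGTH, &len);

char *src = malloc(len);
glGetShaderSource(shader, len, NULL, src);
/* ...inspect or dump src... */
free(src);
```

So some growth in client memory per live shader object is expected; the question is whether it is ever released.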
As such, it is entirely possible that the driver you are using has forgotten to delete something. It could be a driver bug.
I just wanted to second the plug for GLIntercept. It’s many things GLTrace never turned out to be!
Thanks for all the replies; I'll definitely try GLIntercept.
Speaking of GLIntercept … sqrt[-1], when is 0.4 due?
Little thing called “Doom3” chewed up all my free computer time. Back working on it now…
I tracked down MY error on my first use of GLIntercept, so I can easily recommend it.