Is it possible for GLSL to have some language constructs for render-state modifications? For example, rasterization state in the geometry shader (or in the VS in the absence of a GS/tessellation), sample-coverage parameters in the FS, blending parameters in the FS, etc.
I am just trying to imagine what HLSL-style state modifications might look like in GLSL. I don’t know if it is possible in the GL pipeline.
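For illustration only, here is a hypothetical syntax for the idea, loosely modeled on the state blocks found in D3D effect files. To be clear: no `state` block or anything like it exists in any version of GLSL; every qualifier below is invented.

```glsl
// HYPOTHETICAL syntax -- not valid GLSL in any released version.
// Imagines render-state declarations attached to a fragment shader,
// roughly analogous to state blocks in D3D effect files.
#version 330 core

state {                       // invented "state" block
    blend      = on;
    blend_func = src_alpha, one_minus_src_alpha;
    depth_test = on;
};

in vec2 vUV;
uniform sampler2D uTex;
out vec4 fragColor;

void main()
{
    fragColor = texture(uTex, vUV);
}
```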
If HLSL can do it, then GLSL could do it as well. The hardware ultimately determines what’s possible and what’s not. Don’t know how much effort it would take though.
I don’t see any particular need for this. It doesn’t give you any power you couldn’t get yourself. And it doesn’t even make the code that much more convenient. While some shaders do rely on being used with certain state, there are many that don’t care if blending is on or off.
This is something that would better be done as a layered system, like D3D’s FX files and such.
Neither does stuff already in GLSL (attribute locations declared within GLSL). Additionally, it would likely make life a little easier, since the above lets the GL state be specified in the shader code, avoiding the need to create such a system oneself. On the other hand, Alfonse does have a point that lots of shaders are used under different GL states… but still, lots of shaders are used in very particular states. Also, and this is important, some GL implementations do peculiar things when GL state changes (for example, in embedded land, TEGRA’s blending is handled not by a RasterOp but by extra code appended to the fragment shader), so the above might make some things less icky for such shaders on such hardware.
HLSL doesn’t actually do it. This is managed by the Effects Framework in D3D, which is a wrapper around HLSL and just makes regular state change calls behind the scenes (you can verify this yourself by running a program under PIX). No reason why something similar couldn’t be written for OpenGL, but it wouldn’t be part of GLSL.
[QUOTE]HLSL doesn’t actually do it. This is managed by the Effects Framework in D3D, which is a wrapper around HLSL and just makes regular state change calls behind the scenes (you can verify this yourself by running a program under PIX).[/QUOTE]
Never used D3D up until now, but I will. If it’s simply an API-call wrapper, then it seems in-language state setting isn’t deemed feasible after all.
I disagree. Explicit binding of attributes, fragment outputs, texture units, and buffer object binding points is very convenient (indeed, this is exactly why I put that “much more convenient” clause in there). It makes it easier to establish a convention based on the actual OpenGL resources in question (ie: indices) instead of string names. That way, each shader can use the string name that it feels is appropriate, all while adhering to an established convention for interfacing.
The biggest difference in terms of convenience is one of how often it would be used. One can imagine every shader in many applications using explicit binding. It would be difficult, though not impossible, to imagine circumstances where every shader in an application was directly associated with some OpenGL state.
Also, don’t forget that explicit attribute and fragment output setting is basically required if you want to use glCreateShaderProgram.
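As a concrete example of the explicit binding being discussed, `layout(location = ...)` qualifiers (core since GLSL 3.30; `binding` qualifiers for samplers and UBOs came later, in 4.20) pin attributes and fragment outputs to fixed indices, so no glBindAttribLocation / glBindFragDataLocation calls are needed and the interface convention lives in the shader itself:

```glsl
// --- vertex shader ---
#version 330 core

// Explicit attribute locations: the convention is established by index,
// not by string name, so linking code can rely on 0 = position, 1 = uv.
layout(location = 0) in vec3 aPosition;
layout(location = 1) in vec2 aTexCoord;

out vec2 vUV;

void main()
{
    vUV = aTexCoord;
    gl_Position = vec4(aPosition, 1.0);
}

// --- fragment shader ---
#version 330 core

in vec2 vUV;
uniform sampler2D uTex;

// Explicit fragment output location:
layout(location = 0) out vec4 fragColor;

void main()
{
    fragColor = texture(uTex, vUV);
}
```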
Yes, but in-shader state setting does not preclude out-of-shader state setting too. So the implementation would still have to deal with it if the user doesn’t want in-shader state setting.
Since that would be a hardware-specific issue, it would be better for it to be implemented as a hardware-specific extension. Indeed, the extension would be better, because if the blending is handled by the fragment shader, then theoretically the fragment shader could do anything, not just the limited blend func operations that GL defines.
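As a sketch of that last point: on hardware exposing a framebuffer-fetch extension (assuming here EXT_shader_framebuffer_fetch, GLSL ES 1.00 style), the fragment shader can read the destination color and implement an arbitrary blend, rather than being limited to the fixed glBlendFunc combinations:

```glsl
#extension GL_EXT_shader_framebuffer_fetch : require
precision mediump float;

uniform sampler2D uTex;
varying vec2 vUV;

void main()
{
    vec4 src = texture2D(uTex, vUV);
    vec4 dst = gl_LastFragData[0];   // current framebuffer color

    // An arbitrary, programmable "blend" -- e.g. an alpha blend with a
    // destination-luminance-dependent weight, which fixed-function
    // blending cannot express.
    float lum   = dot(dst.rgb, vec3(0.299, 0.587, 0.114));
    vec3  mixed = mix(dst.rgb, src.rgb, src.a * (1.0 - 0.5 * lum));
    gl_FragColor = vec4(mixed, 1.0);
}
```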
Thing with D3D effects is that in practice they quite quickly become a total pain in the ass to manage. They’re fine for a simplified wrapper if you’re writing trivial stuff, but as soon as you want to do something as basic as use different states with the same sets of shaders you hit a wall - think of it as being like the old linked programs model in GLSL but with further constraints and awkwardness added. Most people move to using shaders and states directly, with explicit registers being set for textures, samplers and constants in HLSL code.
It’s worth noting that with D3D11 the Effects Framework is no longer supplied in pre-compiled .lib form with the SDK. You do get it as source (with an incredibly liberal license), so you can compile it yourself if you really want to use it, but this seems a pretty good sign that it’s moving to deprecation-land. State objects and cbuffers make its use quite redundant anyway.
GLSL should definitely not go down that route - even in wrapper form. It would be just repeating the old linked program object mistake; it’s an opportunity for the ARB to look at a similar mistake made by D3D and learn from it. Far better to be thinking about exposing programmability in some of the remaining fixed pipeline stages (assuming hardware support) instead.
The bottom line is that this is like someone saying Direct3D can load jpeg, png, tga, dds files.
Then someone comes along and says Direct3D doesn’t do that. It is the D3D Utility library.
For GL, you would use DevIL or FreeImage or some such 3rd party library to load your images.
The same thing applies here. Is there a 3rd-party library that does “graphics states + shaders”? Yes, there is: CgFX from nVidia.
Except that’s not cross-vendor.
mhagain: Last time I skimmed through tech demos by NVidia they were using effect files. Do you know if the big ones already abandoned them altogether by now? Just curious.
Effects files make perfect sense for tech demos. Those are supposed to be rather simple and short bits of code, just enough to show off something snazzy. Even if they still use them, that doesn’t mean that effects files are something that should be incorporated into GLSL.
[QUOTE]Even if they still use them, that doesn’t mean that effects files are something that should be incorporated into GLSL.[/QUOTE]
Agreed. I wasn’t suggesting otherwise.
[QUOTE=thokra;1237818]Except that’s not cross-vendor.
mhagain: Last time I skimmed through tech demos by NVidia they were using effect files. Do you know if the big ones already abandoned them altogether by now? Just curious.[/QUOTE]
The DirectX SDK doesn’t use effect files in its D3D11 samples, for sure; I can’t speak for NVIDIA’s reasons.