My general feeling with GL is that this is actually intended behavior, but as you mentioned, I can't find any text that would allow an implementation to return garbage (or zero) from textureSize when the texture is incomplete. This seems counterintuitive to me, as it essentially requires the driver to pass an array of uniforms to the shader (i.e. this information probably can't be taken from the surface setup that the hardware needs).
If this is a bug, then I'd be willing to bet it's a spec bug instead.
If a texture is incomplete, you cannot get any information out of it. This applies to texels, size, and everything else.
Well, that's just your assumption. The hardware is free to get the size information from anywhere. Yes, it might be an internal array of uniforms, but in practice it's very likely that the hardware actually gets this information from the internal data structure that corresponds to a sampler uniform.
Yes, this is all based on the assumption that textureSize has to work with incomplete textures, and frankly I would be very surprised if anyone implemented it that way.
It's highly improbable that hardware would support specifying misbehaved textures (as allowed by GL). Say you set up a surface whose first level is 1x1 RGBA8, whose second level is empty, and whose third level is 100x1 in a depth format. The hardware could be made to 'support' textureSize on such an abomination, but it would probably not be a very good idea; I assume serious hardware vendors don't waste silicon supporting the pathological cases permitted by the GL spec. Given that assumption, pretty much the only way to do it would be with uniform arrays, but that is wasteful and, again, not very bright.
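For concreteness, a mismatched setup like the one described can be produced with plain glTexImage2D calls. A minimal sketch, assuming a current GL context; I've used GL_DEPTH_COMPONENT16 here since core GL has no 8-bit depth format:

```c
/* Sketch only: assumes a current GL context; not runnable standalone. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* Level 0: 1x1 RGBA8 */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1, 1, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
/* Level 1: deliberately left unspecified. */
/* Level 2: 100x1 depth format -- inconsistent with level 0 in both
   size progression and internal format, so the texture fails the
   mipmap-completeness rules. */
glTexImage2D(GL_TEXTURE_2D, 2, GL_DEPTH_COMPONENT16, 100, 1, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, NULL);
/* With the default MIN_FILTER (which requires mipmaps), sampling this
   texture in a shader falls under the incomplete-texture rules. */
```

GL accepts every call above without error; incompleteness only manifests when the texture is actually used.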
By what wording? I see only this in the core spec:
Incomplete textures (see section 3.9.14) are considered to return a texture source color of (0, 0, 0, 1) for all four source texels.
Which covers all the sampling functions. There are two other functions, though, that take samplers: textureSize and textureQueryLod. textureQueryLod is explicitly allowed to return garbage on incomplete textures (as per the GLSL spec); textureSize is not.
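For reference, this is how the two query functions look from the shader side; a minimal sketch (the uniform name is made up, and the version requirement comes from textureQueryLod, which is a GLSL 4.00 built-in):

```glsl
#version 400 core
uniform sampler2D u_tex;   // may reference an incomplete texture
out vec4 fragColor;

void main() {
    // Explicitly undefined for incomplete textures per the GLSL spec:
    vec2 lod = textureQueryLod(u_tex, vec2(0.5));
    // No such allowance is spelled out for textureSize:
    ivec2 size = textureSize(u_tex, 0);
    fragColor = vec4(vec2(size), lod);
}
```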
There is nothing more regarding incomplete textures in either spec, afaik. Do you see anything else relevant to this case?
Then I think it's most likely a specification bug. Actually, this is not the only specification bug in the "Texture Size Query" section of chapter 2.11.7. If you look closely, the last sentence says that the "layer index" is returned for array textures, but it should say "number of layers" instead.
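To illustrate the wording issue: for an array texture, the last component of the textureSize result is the layer count, not an index into the array. A sketch (the uniform name is made up):

```glsl
#version 330 core
uniform sampler2DArray u_array;
out vec4 fragColor;

void main() {
    ivec3 s = textureSize(u_array, 0);
    // s.x, s.y: dimensions of the requested level
    // s.z: the number of layers in the array, not a "layer index"
    fragColor = vec4(s, 1.0);
}
```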
I'm just saying, I wouldn't expect any well-defined behavior from the shader's point of view if you try to do anything with a sampler uniform that references an incomplete texture. It may be a specification bug, but I don't think it's a driver bug.