Does the Mali GPU support GL_R16?

Hi

I have an offline shadow map pre-stored in a texture with format GL_R16, and I want to use sampler2DShadow to implement PCF. It works fine on Adreno, but goes wrong on Mali. Can anyone tell me the reason, and which texture formats support sampler2DShadow? I tried GL_R16F, but it doesn't work even on Adreno.

Not sure about the specifics on Adreno, Mali, or OpenGL ES in general.

But traditionally in OpenGL, IIRC, sampler2DShadow and hardware depth-texture compares (GL_TEXTURE_COMPARE_MODE = GL_COMPARE_R_TO_TEXTURE / GL_COMPARE_REF_TO_TEXTURE) for texture lookups were only supported for a texture that has type GL_DEPTH_COMPONENT or GL_DEPTH_STENCIL (so a texture with genuine depth). Browsing the OpenGL 4.6 spec, I think that's still the case (see section 11.1.3.5 Texture Access).

UPDATE: Yep. Browsing the OpenGL ES 3.2 spec, it appears that the same is true there. See section 8.20.1 Depth Texture Comparison Mode. You're not even in "undefined behavior" land there: it explicitly says you won't get depth texture compares with an incompatible texture format.
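
For reference, the compare state being discussed is set per texture. A minimal sketch (shadowTex is a placeholder name and is assumed to already be a depth-format texture):

glBindTexture(GL_TEXTURE_2D, shadowTex);
/* Turn on HW depth compares: sampler2DShadow lookups now return the
   compare result rather than the raw depth value. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_REF_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
/* LINEAR filtering on a depth-compare texture typically gives 2x2 PCF. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);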

Hi,

If so, can I store texture data in the GL_DEPTH_COMPONENT or GL_DEPTH_STENCIL format? I thought those two were formats for the framebuffer. Is it possible?

For depth? Sure. Any internal texture format with a base internal format of GL_DEPTH_COMPONENT or GL_DEPTH_STENCIL that is supported by your hardware will do. For instance, see the ones listed in Table 8.11 in the OpenGL ES 3.2 spec (a minimal allocation sketch follows the list):

  • GL_DEPTH_COMPONENT16
  • GL_DEPTH_COMPONENT24
  • GL_DEPTH_COMPONENT32F
  • GL_DEPTH24_STENCIL8
  • GL_DEPTH32F_STENCIL8
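
For instance, allocating an immutable 16-bit depth texture under GLES 3.x might look like this (just a sketch; depthTex and the 800x600 size are placeholders):

GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
/* Immutable storage; fill it later with glTexSubImage2D() or by
   rendering into it via an FBO. */
glTexStorage2D(GL_TEXTURE_2D, 1, GL_DEPTH_COMPONENT16, 800, 600);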

Hi,

I checked the table, but I'm still confused about how to store depth in a texture rather than in the framebuffer. Maybe I should store the data as unsigned short and use the following API?

glTexImage2D(
    GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT16, 800, 600, 0,
    GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, TexData);

Just look for one of the many online or print tutorials on shadow mapping with OpenGL. They show how to create a depth texture, bind it as the depth buffer of a framebuffer object (FBO), draw a scene into it, and then bind that same depth texture to a shader sampler so that a subsequent render pass can read the depth values and apply the generated depth/shadow map to the scene.

It should basically be exactly what you have now – but just plugging in a proper depth texture in place of the GL_R16 texture.
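
In outline, the depth-pass plumbing those tutorials set up looks something like this (an untested GLES 3.x sketch, with depthTex a depth texture as allocated above):

GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
/* Attach the depth texture as the FBO's depth buffer. */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);
const GLenum none = GL_NONE;   /* depth-only pass: no color attachment */
glDrawBuffers(1, &none);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle incomplete framebuffer */
}
/* ... render the shadow casters here ... */
glBindFramebuffer(GL_FRAMEBUFFER, 0);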

For instance, there's one on LearnOpenGL.com.

If you read a depth texture value into the shader via a standard sampler2D (with HW depth compares disabled), you get normalized 0…1 depth values back. Of course, you can instead do what you're trying to do and let GL perform the depth compares (and possibly additional PCF filtering) behind the scenes, by using a sampler2DShadow and enabling HW depth compares.
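
To make the contrast concrete, here's a sketch of the sampler2DShadow side as a GLSL ES 3.00 fragment shader stored in a C string (the uniform and varying names are made up for illustration):

static const char *kShadowFrag =
    "#version 300 es\n"
    "precision highp float;\n"
    "uniform highp sampler2DShadow uShadowMap;\n"
    "in highp vec4 vShadowCoord;   /* projective shadow-map coordinate */\n"
    "out vec4 oColor;\n"
    "void main() {\n"
    "    /* textureProj divides by .w, then compares .z against the depth\n"
    "       texel; the result is the (possibly PCF-filtered) visibility. */\n"
    "    float visibility = textureProj(uShadowMap, vShadowCoord);\n"
    "    oColor = vec4(vec3(visibility), 1.0);\n"
    "}\n";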

Hi,

The problem is that the shadow map is pre-generated once, offline, which means I want to store the depth in an image file like PNG or JPG and then read it back as GL_DEPTH_COMPONENT16. I'm not sure how to do that.

Once you get it in CPU memory, just upload it to your pre-created, pre-allocated depth texture using glTexSubImage2D().
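
Something like this, assuming the texture was allocated as GL_DEPTH_COMPONENT16 earlier (a sketch; depthTex and TexData are placeholders):

glBindTexture(GL_TEXTURE_2D, depthTex);
glPixelStorei(GL_UNPACK_ALIGNMENT, 2);   /* rows of tightly packed GLushort */
glTexSubImage2D(GL_TEXTURE_2D, 0,        /* target, mip level */
                0, 0, 800, 600,          /* x/y offset, size  */
                GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, TexData);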

Storing it on disk and reading it back isn't an OpenGL question. But… ditch JPEG; it's lossy. Consider KTX instead; it'll store any texture you can represent in OpenGL.

As far as other formats go… If you only need 16-bit greyscale, PNG will handle that. For 32-bit float or 32-bit integer textures, take a look at OpenEXR. Doubtless there are other predefined formats to choose from, but I'd lean toward KTX since it's oriented toward storing all OpenGL and Vulkan textures.
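
If you do go with KTX, libktx (from the KhronosGroup/KTX-Software repo) can load the file and create the GL texture for you. To the best of my recollection the calls look roughly like this, but treat it as a sketch and check the library docs:

#include <ktx.h>

GLuint LoadKtxTexture(const char *path)   /* hypothetical helper */
{
    ktxTexture *kTex = NULL;
    GLuint tex = 0;
    GLenum target = GL_NONE, glerr = GL_NO_ERROR;

    if (ktxTexture_CreateFromNamedFile(path,
            KTX_TEXTURE_CREATE_LOAD_IMAGE_DATA_BIT, &kTex) != KTX_SUCCESS)
        return 0;
    /* Creates and uploads the GL texture object for us. */
    if (ktxTexture_GLUpload(kTex, &tex, &target, &glerr) != KTX_SUCCESS)
        tex = 0;
    ktxTexture_Destroy(kTex);
    return tex;
}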

Initially, though, just roll your own format for testing and keep it simple.
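
For example, the simplest roll-your-own format is just the raw 16-bit values dumped to disk, with the dimensions agreed on out of band (a sketch; all names are hypothetical):

#include <stdio.h>
#include <stdlib.h>

static unsigned short *LoadRawDepth(const char *path, int w, int h)
{
    size_t count = (size_t)w * h;
    unsigned short *data = malloc(count * sizeof *data);
    FILE *f = fopen(path, "rb");
    if (!f || !data || fread(data, sizeof *data, count, f) != count) {
        if (f) fclose(f);
        free(data);
        return NULL;
    }
    fclose(f);
    return data;   /* feed this to glTexSubImage2D() as above */
}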

Hi,

I will try as you said! Thank you for your help~
