[AMD] GL_DEPTH32F_STENCIL8 stencil texture garbage

I have the exact same problem as reported in this thread, only 7 years later (two problems, actually):
https://www.opengl.org/discussion_boards/showthread.php/173325-Reading-Depth32F-Stencil8-combined-buffer-in-shade

It only shows on AMD hardware (RX 560).
We have a doom3-based 2-stage deferred renderer.
It works more or less OK with basic lighting. Now we want to bind two extra textures in the fragment shader: depth and stencil.

With GL_DEPTH24_STENCIL8 it works, but FPS is halved.
With GL_DEPTH32F_STENCIL8 FPS is back to normal, but the stencil texture contains garbage.
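Roughly, our setup looks like this (a sketch with names simplified; we read the stencil bits via GL 4.3 stencil texturing, i.e. GL_DEPTH_STENCIL_TEXTURE_MODE, with a usampler2D on the shader side):

```c
/* Packed depth+stencil texture attached to the G-buffer FBO. */
GLuint depthStencilTex;
glGenTextures( 1, &depthStencilTex );
glBindTexture( GL_TEXTURE_2D, depthStencilTex );
glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH32F_STENCIL8, width, height, 0,
              GL_DEPTH_STENCIL, GL_FLOAT_32_UNSIGNED_INT_24_8_REV, 0 );

/* For the lighting pass: sample the stencil index instead of depth. */
glTexParameteri( GL_TEXTURE_2D, GL_DEPTH_STENCIL_TEXTURE_MODE, GL_STENCIL_INDEX );
```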

It looks very much like a driver bug, but hopefully there is a workaround for it.

Please advise.

Maybe try separate depth and stencil textures via GL_DEPTH_COMPONENT32F and GL_STENCIL_INDEX8? Just note that GL_STENCIL_INDEX8 as a texture format only became core fairly late (ARB_texture_stencil8, GL 4.4).
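The two textures would then go on the same FBO as separate attachment points, something like this (a sketch; a driver is still allowed to reject the combination as unsupported):

```c
/* Separate depth and stencil attachments on one framebuffer. */
glFramebufferTexture2D( GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                        GL_TEXTURE_2D, depthTex, 0 );
glFramebufferTexture2D( GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT,
                        GL_TEXTURE_2D, stencilTex, 0 );
```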

Separate 24-bit depth and 8-bit stencil aren't supported anywhere but Intel. Why do you think separate 32-bit depth and stencil would be supported?

I get GL_FRAMEBUFFER_UNSUPPORTED with:
glTexImage2D( GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT32F, width, height, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0 );

glTexImage2D( GL_TEXTURE_2D, 0, GL_STENCIL_INDEX8, width, height, 0, GL_STENCIL_INDEX, GL_UNSIGNED_BYTE, 0 );
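The completeness check that reports it (a sketch; fbo is our G-buffer framebuffer):

```c
glBindFramebuffer( GL_FRAMEBUFFER, fbo );
GLenum status = glCheckFramebufferStatus( GL_FRAMEBUFFER );
if ( status != GL_FRAMEBUFFER_COMPLETE ) {
    /* GL_FRAMEBUFFER_UNSUPPORTED is 0x8CDD */
    printf( "FBO incomplete: 0x%x\n", status );
}
```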