Depth testing when z is `EQUAL` to the clear value

I’ve run into inconsistent depth-testing behaviour across devices, and I don’t know whether this behaviour is specified anywhere.

My question is: when using `EQUAL`, is there an expected/specified behaviour if a primitive’s depth is explicitly set to the same value as the depth buffer’s clear value? Or is `EQUAL` intended solely for comparing against previously rendered fragments (not “empty/clear” areas)?

If you create a WebGL program with the following configuration:

gl.enable(gl.DEPTH_TEST);
gl.clearDepth(1.0);     // depth buffer is set to 1.0 on gl.clear()
gl.depthFunc(gl.EQUAL); // pass only fragments whose depth equals the stored value

…and then in your vertex shader specify:

gl_Position.z = 1.0;

…would you expect the program to render anything?
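For concreteness, here’s a minimal self-contained sketch of the kind of program I mean (WebGL 1; the canvas id and helper names are just illustrative):

const gl = document.getElementById('c').getContext('webgl');

function compile(type, src) {
  const s = gl.createShader(type);
  gl.shaderSource(s, src);
  gl.compileShader(s);
  return s;
}

const prog = gl.createProgram();
gl.attachShader(prog, compile(gl.VERTEX_SHADER, `
  attribute vec2 aPos;
  void main() {
    gl_Position = vec4(aPos, 0.0, 1.0);
    gl_Position.z = 1.0; // z == w, so NDC depth (and thus window depth) is exactly 1.0
  }`));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, `
  precision mediump float;
  void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }`));
gl.linkProgram(prog);
gl.useProgram(prog);

// A single triangle covering the whole viewport.
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
const loc = gl.getAttribLocation(prog, 'aPos');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

gl.enable(gl.DEPTH_TEST);
gl.clearDepth(1.0);
gl.depthFunc(gl.EQUAL);
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
gl.drawArrays(gl.TRIANGLES, 0, 3);
// On some GPUs this draws a red triangle (fragment depth 1.0 == cleared depth 1.0);
// on others, nothing.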

On my M3 Mac and a Mali-G52 GPU, primitives are rendered, but on a Snapdragon 685 they are not. On further investigation with a floating-point calculator, I found that gl.clearDepth(0.9999828338623046875); does work on the Snapdragon, suggesting that on that device clip-space depth is clamped or scaled to something less than 1.0 during depth testing. I’ve confirmed in the fragment shader that the fragment depth reaching the depth test is indeed 1.0.
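For reference, that check was along these lines (the shader body here is illustrative): bypass the depth test and colour-code gl_FragCoord.z, which is the window-space depth the test compares against the buffer.

gl.depthFunc(gl.ALWAYS); // pass everything so the result is visible either way
const fsCheck = `
  precision highp float; // highp is optional in WebGL 1 fragment shaders, but an
                         // exact compare needs all the precision available
  void main() {
    // green = incoming fragment depth is exactly 1.0, red = anything else
    gl_FragColor = (gl_FragCoord.z == 1.0) ? vec4(0.0, 1.0, 0.0, 1.0)
                                           : vec4(1.0, 0.0, 0.0, 1.0);
  }`;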

That odd value (0.9999828338623046875) requires 18 bits of mantissa, which is more precision than either depth16unorm or a half-precision float offers (admittedly, I have no idea how double-precision JS numbers get converted to normalized integers in WebGL).
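The decomposition is easy to sanity-check in plain JS (nothing WebGL-specific here):

const v = 0.9999828338623046875;
console.log(v === (2 ** 19 - 9) / 2 ** 19);  // true: v = 524279 / 524288
console.log(Math.fround(v) === v);           // true: exactly representable as a 32-bit float
// Normalized, v = 1.111111111111110111₂ × 2⁻¹, i.e. 18 explicit mantissa bits,
// so a half float (10 fraction bits) can't hold it, and neither can depth16unorm:
console.log(Math.round(v * 65535) / 65535);  // 0.9999847412…, the nearest 16-bit unorm step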