# Water depth changes with camera position/rotation

Hello all,

I am trying to get the water depth for a water plane situated at y = 0. So far I'm doing the following:

1. Render the reflection and refraction maps, storing the depth of the refraction pass in a texture.
2. Render the water plane using those textures as input. This is how I calculate the depth in the fragment shader:
```glsl
vec2 texCoordsReal = clipSpaceToTexCoords(textureCoordsReal);
vec2 texCoordsGrid = clipSpaceToTexCoords(textureCoordsFlat);

vec2 refractTexCoords = texCoordsGrid;
vec2 reflectTexCoords = vec2(texCoordsGrid.x, 1.0 - texCoordsGrid.y);

float waterDepth = calculateWaterDepth(texCoordsReal);

```

with the following functions:

```glsl
vec2 clipSpaceToTexCoords(vec4 clipSpace)
{
    // Perspective divide to NDC, then map [-1, 1] to [0, 1]
    vec2 ndc = clipSpace.xy / clipSpace.w;
    vec2 texCoords = ndc / 2.0 + 0.5;
    return clamp(texCoords, 0.001, 0.999);
}

float calculateWaterDepth(vec2 texCoords)
{
    // Depth of the terrain below the water, sampled from the refraction pass
    float floorDepth = texture(depthMap, texCoords).r;
    float floorDistance = toLinearDepth(floorDepth);

    // Depth of the water-surface fragment itself
    float waterDepth = gl_FragCoord.z;
    float waterDistance = toLinearDepth(waterDepth);

    return floorDistance - waterDistance;
}
```
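For completeness, `toLinearDepth` converts a non-linear [0, 1] depth-buffer value back to an eye-space distance. A typical implementation (assuming `nearPlane`/`farPlane` uniforms that match the values used to build the projection matrix) looks like this:

```glsl
// Convert a [0, 1] depth-buffer value to eye-space distance.
// nearPlane and farPlane are assumed uniforms matching the projection matrix.
float toLinearDepth(float depth)
{
    float ndcZ = depth * 2.0 - 1.0; // back to NDC z in [-1, 1]
    return 2.0 * nearPlane * farPlane
         / (farPlane + nearPlane - ndcZ * (farPlane - nearPlane));
}
```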

and the input coordinates are calculated in the vertex shader as:

```glsl
textureCoordsReal = uMVPMatrix * vec4(position, 1.0);
textureCoordsFlat = uMVPMatrix * vec4(position.x, 0.0, position.z, 1.0);
```
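For context, here is a sketch of how those outputs sit in the vertex shader: I pass both the vertex's actual clip-space position and a version flattened to y = 0 (the `gl_Position` line is how I'd expect this to be wired up):

```glsl
// Vertex shader: pass both clip-space positions to the fragment shader.
uniform mat4 uMVPMatrix;
in vec3 position;

out vec4 textureCoordsReal; // actual vertex position in clip space
out vec4 textureCoordsFlat; // same vertex flattened to y = 0

void main()
{
    textureCoordsReal = uMVPMatrix * vec4(position, 1.0);
    textureCoordsFlat = uMVPMatrix * vec4(position.x, 0.0, position.z, 1.0);
    gl_Position = textureCoordsReal;
}
```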

The problem is that the water depth changes as soon as the camera moves or rotates: it decreases as I move the camera away from the water, and it also shifts when I rotate the camera sideways.

Given how this is calculated, it seems to me that `floorDistance - waterDistance` should stay the same regardless of the camera position/rotation. Am I wrong in this assumption?