I need to calculate the exact distance at which a sphere will be rendered as a predetermined number of pixels. I figured I would need to project a circle aligned with the optical axis onto the screen and compute the world-space distance from the diameter of the projection. But this would have to be repeated iteratively until the projected diameter fell below a certain threshold. What would be a more elegant, mathematical approach to my problem?

Convert the “amount of pixels” (an area A) to a radius:

A = pi * r * r

r = sqrt(A / pi)

This is the projected radius.
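That area-to-radius conversion is trivial to put in code (a minimal sketch; the function name is mine):

```python
import math

def pixel_area_to_radius(area_px):
    """Radius (in pixels) of a circle covering `area_px` pixels of area."""
    return math.sqrt(area_px / math.pi)

# A circle covering 1 pixel of area has a radius of ~0.564 px.
print(pixel_area_to_radius(1.0))
```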

Solve it with the sphere's real radius R and real distance Z (this r is in normalized screen units, i.e. it assumes a focal length of 1):

r = R / Z

Z = R / r

There it is. Adjust depending on the screen scaling.
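The whole answer can be sketched as follows (an assumption-laden sketch: `pixels_per_unit` stands in for the unspecified "screen scaling" factor, which in a real renderer depends on the FOV and viewport size; the function name is mine):

```python
import math

def distance_for_pixel_area(world_radius, area_px, pixels_per_unit):
    """Distance at which a sphere of `world_radius` projects to roughly
    `area_px` pixels, following the r = R / Z similarity argument above."""
    r_px = math.sqrt(area_px / math.pi)   # projected radius in pixels
    r = r_px / pixels_per_unit            # projected radius in normalized units
    return world_radius / r               # Z = R / r
```

With a scaling factor of 1 this reproduces the Z = 8.86 figure from the example below, which is exactly why the reply questions it: the factor cannot simply be 1.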

I don’t really understand how this is supposed to help. Let’s take a practical example:

viewport width = 800

viewport height = 600

FOV = 60

near = 0

far = 1000

sphere radius in world space = 5

projected sphere surface = 1

r = sqrt(1 / PI) = 0.5641895

Z = R / r = 5 / 0.5641895 = 8.86

So this basically calculates a coefficient I could use to assess the relative screen-space sizes of objects that are all at the same distance from the camera. But what I am looking for is the distance from the camera at which my sphere will be projected as ‘A’ pixels.

s = viewportHeight / A

distance = (worldSpaceRadius * s) / (2.0 * tan(fov * 0.5 * PI / 180))

This yields the distance at which the sphere's projected radius is A pixels, with fov taken as the vertical field of view in degrees.
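The formula above can be sanity-checked by also writing its inverse and confirming a round trip (a minimal sketch; function names are mine, A is taken to be the target projected radius in pixels, and fov is the vertical FOV in degrees):

```python
import math

def distance_for_pixel_radius(world_radius, pixel_radius,
                              viewport_height, fov_deg):
    """Distance at which a sphere of `world_radius` projects to a circle
    of `pixel_radius` pixels, per the formula above."""
    s = viewport_height / pixel_radius
    return (world_radius * s) / (2.0 * math.tan(math.radians(fov_deg * 0.5)))

def projected_pixel_radius(world_radius, distance,
                           viewport_height, fov_deg):
    """Inverse: projected radius in pixels of a sphere at `distance`."""
    return (world_radius * viewport_height) / (
        2.0 * distance * math.tan(math.radians(fov_deg * 0.5)))

# With the example's numbers (R = 5, viewport 600 px high, 60° vertical FOV),
# a 1-pixel projected radius is reached at roughly 2598 world units.
d = distance_for_pixel_radius(5.0, 1.0, 600, 60)
print(d, projected_pixel_radius(5.0, d, 600, 60))
```

Projecting back at the computed distance returns the requested pixel radius, which confirms the two formulas are consistent.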