Does anyone know what FOV (horizontal or vertical) should be used in OpenGL (e.g. gluPerspective) to show the equivalent field of view that the human eye uses?
Or should I not use that, and instead stick to a "typical" FOV of around 45 degrees as suggested to me?
The application is a simple 3d model loader and viewer
The field of view for the eye can be as high as 180 degrees. Not that you will see much at the edges; you can tell that something is happening there (movement, for example), but maybe not identify it. Using a near-180-degree FOV in gluPerspective/glFrustum will give you horrible perspective distortion, since the near plane is, well, a plane.
To simulate on screen the view we see with our eyes, the FOV can be calculated from the physical size of the window and the distance between your face and the monitor (the angle between the left edge of the window, your face, and the right edge). That gives you the FOV at which we would see the scene if it really were behind the monitor and the window were a hole in the monitor, so to speak.
In my case, the FOV for a fullscreen window is about 25 degrees. A 90-degree FOV, for example, would mean I literally have to press my nose to the monitor to get the correct angle between the window edges and my face. In my opinion, a good angle without much perspective distortion is about 60 degrees, and anything less than 60 works too, although at some point the FOV gets a little too small to get a good overview of the scene.
You have the wrong approach. For correct model viewing the field of view is determined by the viewer’s relationship to the screen.
Look at the screen in front of you. Now imagine a frustum defined by the bridge of your nose and the edges of the screen. If you consider a full screen application it is easy to see that there is only one correct frustum for each head position in front of your screen.
Now, almost nobody actually does this kind of thing for desktop apps but they don’t try to use very wide fields of view either because you’re so far away from a correct projection (unless your nose is touching the screen) that things will start to look distorted. Something like 60 degrees is probably appropriate, but experiment or even make it flexible.
^^ exactly what Bob said, no ?
Uh… yea sorry Bob
I read his first paragraph and stopped, but he’s spot on.
The only thing I would add is that the eye and the screen really define a frustum, not merely a FOV. (FOVs also tend to be specified as the full symmetric horizontal angle, not the half-angle, although you need to compute with the half-angle and double it to get the atan right.) A frustum may be asymmetric, which can give you better projection geometry; that's not important for most desktop apps, but it can be significant for more advanced screen setups.
For a simple application, I think taking it as far as using the physical distances to calculate the correct FOV is a bit too much, just as you imply at the end of your post. If it really matters, one can take it a step further as well and reshape the frustum as the head moves (using some tracking device). :rolleyes:
But it's always good to know why perspective distortion can occur. Spooky said it was "a simple 3d model loader and viewer", so I doubt real frustum shape calculations are important here.