# Displaying objects like you would see them in real-life

Hi,

I am developing an application where you can move around in a room like you would in real life.

My problem is that it doesn't look “real”. I need some help with setting the opening angle for the view (more specifically, the parameters of gluPerspective).

At the moment it is possible to move around in the scene. The problem is: I position the camera at, let's say, y = 2.0 (someone being 2 meters tall) and look at some objects which are 5 meters away and 1.5 meters high, and it doesn't look real. I think the problem is with the opening angle of gluPerspective. Does anyone have experience with setting it to a value so the scene looks real?

Well, making things look real is a really difficult task.
But your problem is probably wrong values for fovy and aspect.
Fovy for the human eye is about 41 degrees, and aspect depends on the window's width-to-height ratio.

[QUOTE=Aleksandar;1249291]Well, making things look real is a really difficult task.
But your problem is probably wrong values for fovy and aspect.
Fovy for the human eye is about 41 degrees, and aspect depends on the window's width-to-height ratio.[/QUOTE]

Thank you for the hint about the fovy of 41 degrees.

Still it doesn't look good. Do you have any experience with how “large” one meter has to be in OpenGL to look “real”?

For example, one could make 1 meter equal to 1 OpenGL unit or 2 OpenGL units, but this would make a big difference in how it looks.
I think the value of fovy depends on the meter-to-unit ratio, or am I wrong?

Just choose a unit and stick with it. And yes, 1 unit to 1 meter is usually a good idea. For the perfect fov, measure your viewing distance to your monitor and take its physical size into account. Then you should get “real”-looking proportions/size ratios. To keep this ratio while moving your head you need head-tracking, which then also takes the angle at which you look at your monitor into account…

It all depends on your definition of “real”…

[QUOTE=Chris Lux;1249311]Just choose a unit and stick with it. And yes, 1 unit to 1 meter is usually a good idea. For the perfect fov, measure your viewing distance to your monitor and take its physical size into account. Then you should get “real”-looking proportions/size ratios. To keep this ratio while moving your head you need head-tracking, which then also takes the angle at which you look at your monitor into account…

It all depends on your definition of “real”…[/QUOTE]

So I can calculate the fovy from the size of my monitor and the distance to it? I already guessed so, but I was not sure, because decreasing the monitor size would decrease the fovy and so create a “zoom-out” effect.

My definition of “real” is: looking at the display and watching the scene should feel like looking at a real scene. So if I model a really existing house, moving around in it should feel like really moving around in the existing house.

The other issues with different viewing angles are already taken into account by the hardware we are using.

[QUOTE=fp13__;1249319]So I can calculate the fovy from the size of my monitor and the distance to it?[/QUOTE]Theoretically yes, but don't lose your time. You probably have some other problem. Maybe it would be useful to post some pictures to help us figure out what's wrong. But, before that, please read chapter 3 from the Red Book.

Also, “OpenGL units” have no physical meaning. They could be nanometers but also light years. It is up to you how they'll be interpreted. And it does not imply “realism” of the scene.

Considering FOV, it should be calculated according to the physical characteristics of the lens you want to emulate: FOV = 2.0 * atan( SensorDim / ( 2.0 * FocalLengthMin * Zoom ) ). The human eye's equivalent focal length is 48.24 mm.

You need to consider the position of the viewer relative to the display, and the display's size and position, if you want to incorporate your scene into some augmented reality system. I assume you have a very basic problem.