Hello
I am writing a simple CAD viewer with OpenGL.
It draws the graphics objects in a “perspective” scene.
How do I calculate the current map scale (1:1.000, 1:2.000, …) of the view?
Where can I read more about this?
Thanks in advance
That would be totally up to how you design your app. There is no one set scale in OpenGL. There are a couple of ways I can see you defining a scale.
You specify a scale up front and then draw your models accordingly. For instance, 1 unit of length in the display = 1 foot of real-life measurement. Then to display a 2-foot line you could just draw it from, say, (-1,0,0) to (1,0,0).
You have a model of known real-life size, and you want to calculate the scale it’s displayed at. Say the model represents a real-life object that is 2 feet long, and the distance between the end vertices is 1. Simple division gives you the scale.
The above methods give you a scale relating coordinate length to real-life length. If what you really want is the scale of actual on-screen display length to real-life length, you have a lot more to worry about, some of which would be difficult to get in software. In that case you need to consider the size of the monitor, the current resolution, and then the OpenGL elements like the projection matrix, viewport, etc. It could get ugly.
Hi,
I think what I really want is the scale of actual screen display length to real-life length, as you said.
If I use “gluProject” to get the xy-window values of two 3D points on the zFar plane whose real-life coordinate values I know, and then calculate the ratio of xy-window length to real length, is that correct?
Many thanks for your reply.
You could do that to get the number of pixels between two points, but the problem would then lie in converting the pixel length into real-life units. If you only need the app to run on one computer, you could just plug in numbers for the size of your monitor, display mode, etc.
If you wanted it to be more flexible you could provide some sort of calibration mechanism for people, such as showing a ruler, and providing ways to change the size of the ruler so that people could compare that to a real-life ruler in order to get the proper measurements.
One problem with doing the measurement just on the pixels returned by gluProject is that if you turn your object so that its length isn’t aligned with the axis you measure on, you need to take depth into account as well. If you used a calibration method like the one I suggested above, you could just correlate coordinate lengths to real-life lengths and not have to worry about lengths in pixels. Then you run into the problem of the size changing if you resize the window… you could either not allow resizing of the window, or make sure that when you do, you set up your projection matrix so that the objects don’t resize.