I am working on a project that plots 3D point cloud data, which can run to several million points. (It is 3D contour data captured with a laser, for anyone who is curious.) I have a main display region for viewing, rotating, panning and zooming the data. I also have a small mapping window (about 100x100 pixels) that is supposed to show the area the user is currently viewing. The mapping window is a very low-res 2D view of the entire data area as seen from above, and its purpose is to draw a red rectangle indicating which part of the map the viewer is looking at. Think of it as a 2D map of the US with a red rectangle around the Chicago area while the user is viewing Chicago in the larger 3D viewing area.
I hope that makes sense. My problem is finding a way to determine from OpenGL what the current (x, y, z) limits are within the view window so I can draw the rectangle. In other words, if the data runs from 1-1000 in x and y, but the user is zoomed in and is only seeing 100-200 in x and 350-450 in y, that is what I need to find out.
I've tried two approaches. The first was to call gluUnProject four times, once for each corner of the screen, but that doesn't seem to work.
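From what I understand, a single gluUnProject call at one window z gives a point on the near plane (winZ = 0) or far plane (winZ = 1), not the spot where the view ray actually meets the data, which may be why my four-corner version fails. The usual fix seems to be unprojecting each corner at both depths and intersecting the resulting ray with the data plane. Here is a GL-free sketch of that math in plain Python; the row-major matrix convention and the z = 0 ground plane are my own assumptions (glGetDoublev returns matrices column-major, so they would need a transpose first):

```python
# Plain-Python sketch of the math behind gluUnProject, plus a
# ray / ground-plane intersection.  No GL context needed.
# Assumptions: 4x4 matrices are row-major lists of rows, and the
# point cloud sits on the z = 0 plane.

def solve4(a, b):
    """Solve the 4x4 linear system a*x = b by Gauss-Jordan elimination."""
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(4):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [m[r][k] - f * m[col][k] for k in range(5)]
    return [m[i][4] / m[i][i] for i in range(4)]

def unproject(winx, winy, winz, modelview, projection, viewport):
    """Window coords -> object coords, the way gluUnProject does it."""
    x0, y0, w, h = viewport
    ndc = [2.0 * (winx - x0) / w - 1.0,   # normalized device coords
           2.0 * (winy - y0) / h - 1.0,
           2.0 * winz - 1.0,
           1.0]
    # obj = inverse(projection * modelview) * ndc, done as a solve
    pm = [[sum(projection[i][k] * modelview[k][j] for k in range(4))
           for j in range(4)] for i in range(4)]
    obj = solve4(pm, ndc)
    return [obj[i] / obj[3] for i in range(3)]   # perspective divide

def ground_hit(winx, winy, modelview, projection, viewport):
    """Unproject a pixel at the near and far planes, then intersect
    the resulting ray with the z = 0 data plane."""
    near = unproject(winx, winy, 0.0, modelview, projection, viewport)
    far = unproject(winx, winy, 1.0, modelview, projection, viewport)
    t = -near[2] / (far[2] - near[2])
    return [near[i] + t * (far[i] - near[i]) for i in range(2)]
```

Calling ground_hit for the four window corners would give the four (x, y) points bounding the visible area; in a tilted view they form a trapezoid, so the red rectangle would be their axis-aligned hull.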
My second approach was to break the data into pieces, assign each piece a number, and then render in GL_SELECT mode with a name list. Since the data is already divided into frames, that isn't a big deal. The hit records from the select buffer tell me which quadrants were drawn, and from that I can update the mapping region. This works, but it feels like a kludge and it slows things down.
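If I stick with the quadrant idea, I gather the same test can be done on the CPU without GL_SELECT, by extracting the six frustum planes from projection * modelview (the Gribb/Hartmann method) and testing each tile's bounding box against them. A rough sketch, again with row-major matrices and the tile layout as my own assumptions:

```python
# CPU replacement for the GL_SELECT quadrant test: extract the view
# frustum planes from projection * modelview (Gribb/Hartmann) and
# test each tile's axis-aligned bounding box against them.
# Assumption: pm is the row-major 4x4 projection * modelview matrix.

def frustum_planes(pm):
    """Return six planes (a, b, c, d); a point is inside where
    a*x + b*y + c*z + d >= 0 for all six."""
    planes = []
    # left/right, bottom/top, near/far: last row +/- rows 0, 1, 2
    for row, sign in ((0, 1), (0, -1), (1, 1), (1, -1), (2, 1), (2, -1)):
        planes.append([pm[3][j] + sign * pm[row][j] for j in range(4)])
    return planes

def box_visible(planes, lo, hi):
    """False if the box with corners lo..hi lies wholly outside any plane."""
    for a, b, c, d in planes:
        # pick the box corner farthest along the plane normal
        px = hi[0] if a > 0 else lo[0]
        py = hi[1] if b > 0 else lo[1]
        pz = hi[2] if c > 0 else lo[2]
        if a * px + b * py + c * pz + d < 0:
            return False
    return True
```

Running box_visible over the tile grid whenever the view changes would give the set of visible tiles directly, with no extra render pass or select buffer.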
I can make up for the speed problem by adding a bounding box, but before I go to that trouble I suspect I am doing this the hard way and that there is a simpler solution. Any ideas?
Thank you for your help.