I have an image that is somewhere near 7500 x 6000 and I need to load it as a texture. I am aware that no card can handle a texture this big. I have explored texture tiling options and implemented some to see if they work, but none of them seem to do it in real time. I would like to know if anyone has been successful in loading images of this size in real time as textures, or I would like some advice on how to get around this problem. I am starting to think that it is physically impossible with today's computers and cards. There are two other things that also need to be taken into consideration:
- Eventually I will need to load about 40 of these 50 MB images at once.
- Editing (scaling, tiling) these in Photoshop or another editor is not an option. -thanks
[This message has been edited by mdog1234 (edited 10-03-2003).]
Try taking a look at gluScaleImage.
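gluScaleImage resamples a pixel rectangle to a new size, which is handy for producing a card-friendly version of a huge image before upload. As a rough illustration of what it does, here is a minimal plain-C nearest-neighbour downscale for tightly packed RGBA8 data (the function name and the filtering choice are mine; the real GLU routine does filtered scaling and supports other formats and types):

```c
#include <stdlib.h>

/* Nearest-neighbour rescale of a tightly packed RGBA8 image.
   A stand-in sketch for what gluScaleImage does internally. */
unsigned char *scale_rgba8(const unsigned char *src, int sw, int sh,
                           int dw, int dh)
{
    unsigned char *dst = malloc((size_t)dw * dh * 4);
    if (!dst) return NULL;
    for (int y = 0; y < dh; ++y) {
        int sy = y * sh / dh;          /* nearest source row    */
        for (int x = 0; x < dw; ++x) {
            int sx = x * sw / dw;      /* nearest source column */
            const unsigned char *s = src + ((size_t)sy * sw + sx) * 4;
            unsigned char *d = dst + ((size_t)y * dw + x) * 4;
            d[0] = s[0]; d[1] = s[1]; d[2] = s[2]; d[3] = s[3];
        }
    }
    return dst;
}
```

With GLU itself the equivalent call would be roughly gluScaleImage(GL_RGBA, wIn, hIn, GL_UNSIGNED_BYTE, src, wOut, hOut, GL_UNSIGNED_BYTE, dst), once you have a GL context.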
So you have 2GB of textures you want to display concurrently? Might I suggest re-evaluating your design to confirm there really is no way of relaxing these requirements?
There aren’t any graphics cards today that have that much RAM on board and there won’t be any for quite some time. Even if you can tile them, you’ll have to store a large proportion of the textures in system memory, and copy them across the AGP bridge every frame. This will not be a performant system.
I’d be fascinated to know what you’re writing that has such high requirements!
Yes, I thought it would be impossible. Keep in mind I'm working with people who don't know a lot about computer graphics or computers in general. I work for NCSA and I am creating a program for geologists to view their data in 3D. Basically they collect data and in return they get a model of the land and the layers in that area, which they can manipulate or whatever. It's basically a tool for geologists. I am using OpenGL to do the rendering. One of the other things the program is supposed to do is load hi-res satellite photos of the area and basically paste them over the land mass. Hence the 50 MB images. I pretty much knew that what I am trying to do is impossible, but I thought I would ask others to make sure.
[This message has been edited by mdog1234 (edited 10-04-2003).]
That sounds like an interesting project!
If they’re viewing the whole map at once there’ll be a huge drop in resolution anyway to fit on a monitor. You may be able to pre-process the data into more manageable resolutions.
If they’re zooming right in they’ll only be looking at small sections of the high-res maps at any one time. You may be able to cut them up into a number of smaller blocks.
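The cutting-up step is mostly arithmetic: cover the image with fixed-size tiles and let the last row and column come up short. A small sketch (function names are mine), using the 7500 x 6000 image with a common 512-texel tile edge:

```c
/* Number of fixed-size tiles needed to cover one image axis. */
int tiles_along(int image_extent, int tile_extent)
{
    return (image_extent + tile_extent - 1) / tile_extent; /* ceiling division */
}

/* Extent of tile column (or row) `i`; the last one may be smaller. */
int tile_extent_at(int image_extent, int tile_extent, int i)
{
    int remaining = image_extent - i * tile_extent;
    return remaining < tile_extent ? remaining : tile_extent;
}
```

For 7500 x 6000 and 512-texel tiles that gives a 15 x 12 grid, with 332-texel-wide edge tiles along the right side. Only the tiles intersecting the current view need to be resident on the card.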
The best system would be a conjunction of the two. When they zoom out you’ll need a full-frame low-resolution image, but as they zoom in you can gradually shift to a cut-down high-resolution image. You’d need to be clever about when you jump resolutions and size, and design a data structure to store all your image information intelligently to allow you to traverse it painlessly.
It’s an interesting problem to be sure. Good luck, and let us know your final solution!
Thanks, I will let you know
Few things are impossible. You “just” have to develop a system that is as flexible with resources as possible. Make the program tuneable for tile sizes, panning speed, viewable area, etc. Obvious resources to check for: video card/memory, system bus speed, system RAM, CPU speed.
It is of course a real challenge to make the program work on all systems, or at least on those that meet the minimum specs. These suggestions might seem obvious, but the problem doesn’t have a simple solution.
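Those tuneable knobs could be as simple as a struct that gets clamped against whatever the startup checks detected. A purely illustrative sketch (struct and field names are not from any real API):

```c
/* Hypothetical tuning knobs for the tiled-texture viewer. */
typedef struct {
    int tile_size;      /* texels per tile edge          */
    int cache_mb;       /* tile cache budget in MB       */
    int prefetch_tiles; /* tiles fetched ahead of the view */
} Tuning;

/* Clamp user settings against detected resources, e.g. never let
   the tile cache exceed half of the detected video memory. */
Tuning clamp_tuning(Tuning t, int video_mem_mb)
{
    int cap = video_mem_mb / 2;
    if (t.cache_mb > cap) t.cache_mb = cap;
    if (t.tile_size < 64) t.tile_size = 64;
    return t;
}
```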