Help, this problem is driving me crazy:
I have a framebuffer+texture sized to the window size.
I get 60fps (vsync) as long as the window is small.
Then if I resize the window past a certain size, the framerate suddenly drops from 60fps to 5fps! It’s not a progressive slowdown; past a certain framebuffer size (around 1400x600) the performance just collapses. The problem is somewhat random, and it happens more often when I have two windows (each with its own window-sized framebuffer) sharing the same OpenGL context.
Any ideas? I’m going mad.
I get some random GL_OUT_OF_MEMORY errors, but I’ve checked with GPU-Z: only 200MB of VRAM is in use (and only 40MB by my application!), which matches the size of my textures.
System: Win7 x64, NVIDIA GeForce GTX 260 (1.7GB), driver 266.58
If you got GL_OUT_OF_MEMORY, then you ran out of memory. Trust GL errors over your own calculations, which almost certainly don’t account for many hidden allocations.
Instead of using GPU-Z, use NVX_gpu_memory_info to determine the real memory allocation. But even that can be misleading, because the driver may fail to perform a certain action and reallocate/free used memory behind your back. So it is possible to have a peak in memory allocation that gets filtered out by the sampling period, whether in GPU-Z or in your own code.
I’m quite sure I have no memory leak:
-I use very few textures, and I triple-checked that each is created and destroyed properly.
-I checked with NVX_gpu_memory_info and it reports I still have 1.5GB of free memory to play with, so GPU memory is definitely not full.
I really think something is messed up, but not because VRAM is full. And hell, you don’t fill 1.7GB of memory right at program start with only 4-5 textures loaded.
Okay… I think I have something.
I started a fresh session this morning, and the bug was gone; no matter how hard I tried, everything worked perfectly.
Then I ran TV (Windows Media Center) to watch the news, and BAM, the bug struck back. On the previous days I also had the TV running in parallel while working.
So this has something to do with either Windows Media Center, GPU video decoding, or the way the video driver handles it.
We’ve had a couple of customer complaints that our application was periodically hanging for several seconds, and it turned out they had a web browser running hardware-accelerated Flash video (a video tutorial). So I’m tempted to say GPU video decoding is doing something odd in the hardware (perhaps reserving a ton of memory?).