Very odd texture memory heap usage on OSX

Something is going on that is making zero sense to me after over a week of investigation.

I have two versions of the same graphics engine. One uses Objective-C and a Cocoa GUI alongside OpenGL 3.2 (core) on OSX. It fundamentally takes the exact same steps to load textures (leading to calls to glTexImage2D).

The second system uses SDL and C++ and also loads a lot of textures via glTexImage2D.

However, when loading the exact same scene files with the exact same textures, the Cocoa/Obj-C program takes up far less memory in the heap (roughly 150 MB versus the C++ program’s 1.2 GB). This is extremely concerning because the C++ program’s heap also grows slowly over time, despite me properly deleting the textures when I’m done with them. I’d rather have the 150 MB behavior, which I’m assuming leaves most of the textures in memory on the GPU.
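For context, texture cleanup on the C++ side is shaped roughly like this (a minimal sketch, not the actual production wrapper; the function name is made up):

#include <OpenGL/gl3.h>  // OSX core profile header
#include <memory>

// Sketch: tie glDeleteTextures to the last shared_ptr reference via a
// custom deleter, so "deleting the texture" can't be forgotten.
std::shared_ptr<GLuint> makeTexture()
{
    GLuint id = 0;
    glGenTextures(1, &id);
    return std::shared_ptr<GLuint>(new GLuint(id), [](GLuint* p) {
        glDeleteTextures(1, p);  // requires a current GL context
        delete p;
    });
}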

I meticulously checked the SDL OpenGL init routines on Mac over and over to make sure they line up with the context init I run through in my Cocoa app, and they are damn near identical. I also made sure I’m not doing anything that would cause a software fallback, which would explain large heap allocations.
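For reference, the SDL side requests its context like this (assuming SDL2; the window title and size are placeholders):

#include <SDL.h>

SDL_Window* createGLWindow()
{
    SDL_Init(SDL_INIT_VIDEO);
    // Request a 3.2 core profile context -- required on OSX, which only
    // exposes GL versions above 2.1 through the core profile.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_Window* window = SDL_CreateWindow("engine",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1280, 720, SDL_WINDOW_OPENGL);
    SDL_GL_CreateContext(window);  // also makes the context current
    return window;
}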

Anybody know what could be causing these two different behaviors?

It does sound like you have a leak on the C++ side. Run a memory-debugging tool such as Valgrind (Linux) or Dr. Memory (Windows) to help nail it down.

Question: Are you using garbage collection or Automatic Reference Counting (ARC)? If you have a leak on the C++ side, ARC could be masking the equivalent leak on the Obj-C side.

A few other thoughts (a sketch of what I mean follows the list):
- Are you only calling glTexImage2D once per texture?
- Are you deleting (or reusing) the memory block you provide to glTexImage2D?
- Are you having the GL driver compress your textures for you (i.e. providing uncompressed data but requesting a compressed internal format)?
- Are you having GL generate mipmaps for you?
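To make those concrete, an upload path that avoids all four pitfalls looks roughly like this (a sketch; the pixel buffer and formats are placeholder assumptions):

#include <OpenGL/gl3.h>
#include <cstdlib>

// Sketch: upload once, keep the internal format uncompressed, let GL
// build mipmaps, then release the host copy -- GL copies the pixels
// during glTexImage2D, so keeping them around only grows the heap.
void uploadTexture(GLuint tex, int width, int height,
                   unsigned char* pixels /* assumed malloc'd */)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    // Uncompressed internal format; requesting e.g. GL_COMPRESSED_RGB
    // with raw input would make the driver compress on the CPU.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
    glGenerateMipmap(GL_TEXTURE_2D);  // driver-generated mipmaps
    std::free(pixels);  // safe: the driver now owns its own copy
}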

It’s definitely not from malloc or new: Instruments (Apple’s profiler) says most of the memory is in a special section called “gfx texture level”. ARC is on and I’m very careful about memory usage in Obj-C; the C++ side uses shared pointers and I’m equally careful. I’m doing “something” that causes the context to cache all texture memory on the host side, or else the Cocoa app is lying about its memory usage for some reason. GL generates the mipmaps, but that is done on both sides.

Edit: still taking up a GB of memory, compared to ~100 MB on Cocoa, despite trying GL_UNSIGNED_SHORT_5_6_5.

My Cocoa app was doing
glTexImage2D(_type, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, data);  // _type is the GL texture target

and obviously GL_UNSIGNED_SHORT_5_6_5 is going to take up less client memory than GL_UNSIGNED_BYTE.
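To put numbers on it: for example, a 2048×2048 RGB texture is 2048 × 2048 × 2 bytes = 8 MB of client data as GL_UNSIGNED_SHORT_5_6_5 versus 2048 × 2048 × 3 bytes = 12 MB as GL_UNSIGNED_BYTE, so the switch can only ever save about a third, never an order of magnitude.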

So it looks like there will be no magic bullet to fix the heap growth on the production game side.

FINALLY figured this out. Posting here for other unlucky bastards. This was a bug in the OSX OpenGL driver (big surprise; Apple’s year of awful software releases continues).

Thank god for the Google Chromium people posting this: Chromium bug 374677 (on the Monorail tracker).

The solution: apply the mipmap texture filter parameters BEFORE glTexImage2D and the call to glGenerateMipmap.
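In code, the working order looks something like this (a sketch; the texture id, dimensions, formats, and data pointer are placeholders):

#include <OpenGL/gl3.h>

void uploadTextureWorkaround(GLuint tex, int width, int height,
                             const void* data)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    // Workaround: set the mipmap filter state FIRST. Setting it after
    // the upload is what triggered the giant host-side
    // "gfx texture level" allocations in the OSX driver.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, data);
    glGenerateMipmap(GL_TEXTURE_2D);
}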