Are "modern" system OpenGL textures persistent?

Hello,

  1. If I create an OpenGL texture (using OpenGL 3.3+), can I be reasonably confident that it won’t get “lost” by Alt-Tabbing, pressing Ctrl-Alt-Delete, minimizing the window, or any other similar operation? I am talking about a borderless fullscreen application.

  2. Without going into specifics, is there any danger with textures in a console application (Xbox, Nintendo Switch, etc.)? For example, when the application is suspended, can OpenGL textures go poof?

  3. If that is the case, what is a good way to check whether a texture still exists after such an unfortunate incident? Is there an OpenGL function for this? I could always keep a bitmap copy in RAM and, if the texture is no longer valid, recreate it from that.

  4. I know that this is an OpenGL forum, but does anyone know what the situation is with modern DirectX? In older versions of DX, Direct3D surfaces did get lost when alt-tabbing or anything similar. Do modern DirectX textures suffer the same fate?

Thanks in advance.

Could you explain the background here? That is, where is this question coming from? I think that’d help the folks here give you the most useful response.

I can tell you I’ve been working on NVIDIA GPUs (and some AMD occasionally) on “desktop” PCs/GPUs on both Linux and Windows for 20+ years, and I’ve never experienced what you’re talking about.

The only time anything close to this has happened is when you blow past the full amount of VRAM on the GPU and the driver (of course) starts swapping things off the GPU to try to make everything fit. Those textures aren’t “lost” though. They’ll be paged back onto the GPU if/when needed. Nowadays you also have bindless textures, which let you make a texture resident and effectively pin it in GPU-accessible memory.
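As a rough sketch of what that pinning looks like with ARB_bindless_texture (assuming the extension is present and a loader such as glad has been initialized; sizes and formats are just placeholders):

```cpp
#include <glad/glad.h>  // or any loader that exposes ARB_bindless_texture

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, 256, 256);  // immutable storage
// ... upload texels with glTexSubImage2D ...

// While the handle is resident, the driver must keep the texture
// accessible to the GPU; it can't silently demand-page it away.
GLuint64 handle = glGetTextureHandleARB(tex);
glMakeTextureHandleResidentARB(handle);
```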

Your description above suggests desktop/PC, not mobile. So you’re probably not talking about that “context lost” stuff that I’ve only seen happen in EGL/OpenGL ES on mobile/embedded GPU drivers (Qualcomm Adreno drivers, I’m looking at you).

Per the standard, GL_CONTEXT_LOST should only occur if the video hardware is reset by the driver due to a crash. It shouldn’t occur just because of Alt-Tab or even Ctrl-Alt-Del. Although if the hardware is taking an excessive amount of time to process a command, Ctrl-Alt-Del may trigger a reset; I have a system where this is a common issue with games using Id Tech 5 (Rage engine).
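If you want to detect that case in code, GL 4.5 (or the KHR_robustness extension) exposes glGetGraphicsResetStatus. A minimal sketch, assuming the context was created with the LOSE_CONTEXT_ON_RESET notification strategy (otherwise the call always returns GL_NO_ERROR):

```cpp
GLenum status = glGetGraphicsResetStatus();
if (status != GL_NO_ERROR) {
    // GL_GUILTY_CONTEXT_RESET, GL_INNOCENT_CONTEXT_RESET or
    // GL_UNKNOWN_CONTEXT_RESET: the context and everything in it is dead.
    // Hypothetical application function: destroy the context, create a
    // fresh one, and re-upload all resources from CPU-side copies.
    recreateContextAndResources();
}
```

Note that this also covers question 3: after a reset there’s no point querying individual textures with glIsTexture, because the entire context is invalid, not just single objects.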

I don’t know if the historical behaviour of destroying a context on window size change remains in modern versions of Windows. The usual solution was to create two contexts which share data: resizing the window would destroy the attached context, but you could then create a new context sharing data with the non-attached one and avoid re-uploading everything. You’d still have to restore the context state, including any container objects (FBOs, VAOs, transform feedback objects, program pipeline objects), as those aren’t shared between contexts.
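For the record, the two-context trick looked roughly like this with WGL (a sketch; hdc is assumed to be your window’s device context):

```cpp
#include <windows.h>

// A hidden "keeper" context that is never tied to the resizable window.
HGLRC keeper  = wglCreateContext(hdc);
HGLRC visible = wglCreateContext(hdc);
wglShareLists(keeper, visible);      // share textures, buffers, shaders

// ... the window is resized and 'visible' is destroyed ...

HGLRC replacement = wglCreateContext(hdc);
wglShareLists(keeper, replacement);  // shared data objects are still alive
wglMakeCurrent(hdc, replacement);
// Container objects (FBOs, VAOs, etc.) are per-context and must be rebuilt.
```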

Interesting. I’ve never heard of this, nor personally seen it happen on Windows since I started doing OpenGL development there (8 years ago). Window resizing, at least in my experience, is completely independent of GL context lifetime. No need for second-context tricks. Going back 20+ years, I’ve not seen this behavior with NVIDIA on Linux either.

Got a link to someone describing this behavior by chance? Sounds like someone trying to compensate for an old GPU vendor-specific driver bug decades ago.

It was fairly standard behaviour prior to XP (i.e. 95/98/ME and NT4/2K). So around 25 years ago.

I can’t find any references now, so it’s probably been fixed since.

That was a time when Alt-Tab on a game was more likely than not to crash it. Particularly on NT/2K as they would process the Alt-Tab immediately whereas 95/98 would wait for the next GetMessage (essentially pre-emptive multi-tasking versus cooperative multi-tasking), so NT/2K would change the video mode in the middle of rendering a frame.

This was true in D3D, but it was basically never true in OpenGL. Even in the late 90s, most OpenGL implementations that were bug-free enough to rely on wouldn’t straight-up lose your data on an Alt-Tab.

But this also meant that GL implementations were more complicated and kept backup copies of textures in regular RAM. D3D forced developers to do that bookkeeping manually, merely informing them when the system had lost all of their data.
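For anyone who never had the pleasure, the classic D3D9 dance looked something like this (a sketch; the release/recreate helpers stand in for application code):

```cpp
#include <d3d9.h>

// 'device' and the saved D3DPRESENT_PARAMETERS 'pp' are assumed to exist.
HRESULT hr = device->TestCooperativeLevel();
if (hr == D3DERR_DEVICELOST) {
    // Lost but not yet restorable (e.g. still Alt-Tabbed away): wait.
    Sleep(50);
} else if (hr == D3DERR_DEVICENOTRESET) {
    releaseDefaultPoolResources();   // free everything in D3DPOOL_DEFAULT
    device->Reset(&pp);              // restore the device
    recreateDefaultPoolResources();  // re-upload from CPU-side copies
}
```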

To my knowledge, OpenGL is not available on those platforms, so your question is moot. You’ll have to look up the documentation for the platform in question.

While D3D12 continues to recognize the possibility of a lost context resulting in a loss of GPU storage, it doesn’t happen through normal activity like switching active processes. It usually happens when an application behaves pathologically with regard to the GPU (shaders running too long, etc.) and the OS tells the GPU to stop them.
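In code you typically notice it as an HRESULT from Present. A sketch, assuming existing device and swapChain COM pointers:

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>

HRESULT hr = swapChain->Present(1, 0);
if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET) {
    // E.g. DXGI_ERROR_DEVICE_HUNG after a TDR (a shader ran too long).
    HRESULT reason = device->GetDeviceRemovedReason();
    (void)reason;  // log it, then release everything, recreate the
                   // device and swap chain, and re-upload all resources.
}
```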