Weird texture data in compatibility mode

Hey guys,

I’m developing a plugin for a third-party application.
The plugin’s render callback is invoked with an active OpenGL context available.
The host runs OpenGL in compatibility mode, and on macOS I’m running into some weird issues.

The currently bound texture is the current image frame of the host application, and I need it for further processing.
The problem is that on macOS it looks like this:


On the left is the original image in the host application; on the right is my plugin rendering a quad with the currently bound texture.
Any ideas why it looks like this? Does it suggest something I might be missing? It almost looks as if the color values have been clamped multiple times.
Everything works fine on Windows.

In the screenshot you can also see the fetched texture details (size and internal format, which is RGBA8).
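For reference, the query I’m doing looks roughly like this (a sketch, assuming the host binds the frame to the GL_TEXTURE_2D target on the active texture unit):

```c
/* Query the size and internal format of the currently bound 2D texture
   (mip level 0). These require an active OpenGL context. */
GLint width = 0, height = 0, internalFormat = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &height);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT,
                         &internalFormat);
/* In my case internalFormat comes back as GL_RGBA8. */
```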

Any ideas?

Are you sure that it’s related to the texture? What happens if you disable texturing and just render solid white?
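Something like this, in compatibility-mode immediate mode, should rule the texture out (just a sketch; coordinates assume an identity projection covering the viewport):

```c
/* Draw an untextured solid white quad. If this also looks wrong,
   the problem is in the render state, not the texture. */
glDisable(GL_TEXTURE_2D);
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
glBegin(GL_QUADS);
glVertex2f(-1.0f, -1.0f);
glVertex2f( 1.0f, -1.0f);
glVertex2f( 1.0f,  1.0f);
glVertex2f(-1.0f,  1.0f);
glEnd();
```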

Yes, I’m sure.
Just as a test I checked the other host-allocated textures, going from texture ID 1 upward, and all of them rendered fine. Only this particular one looks different.
For a moment I thought it might be some compressed internal format… but that should still render fine, and querying the format reports RGBA8 anyway.
When I copy the pixels to a local buffer, the data looks the same.
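For completeness, the readback I’m doing is roughly this (a sketch, assuming GL_TEXTURE_2D and a tightly packed RGBA byte buffer):

```c
/* Copy the bound texture's level-0 pixels into a local buffer. */
GLint w = 0, h = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &w);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &h);

unsigned char *pixels = (unsigned char *)malloc((size_t)w * (size_t)h * 4);
if (pixels) {
    glPixelStorei(GL_PACK_ALIGNMENT, 1);  /* no row padding */
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    /* ...inspect the buffer: it shows the same wrong values... */
    free(pixels);
}
```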
