Fragment Programs and Compressed Textures


Has anyone tried using them both on ATI cards?
On my system the compressed textures get corrupted the second I enable a fragment program… has anyone had this experience?

I’m using Catalyst 6.14.10


Never heard of any such program, and it works just fine for me.

I’m not talking about a specific program. All I need to do is enable fragment programs, without even binding anything, and the compressed textures mess up.
If the textures are created at loading time with a compressed format after enabling fragment programs, they get messed up.

Explain “messed up”.
Uncompressed textures work okay?

That’s what I mean by messed up… if I don’t compress them, they look OK.

If you don’t use fragment programs, does it work OK? If so, it’s a driver problem. However, if it still doesn’t work, you’re probably uploading the texture incorrectly.

Originally posted by Bruno:
I’m not talking about a specific program.
Sorry, late night spelling … should be “problem”.

Anyway, nice screenshot. :slight_smile: I got a similar bug once, but it was quite some time ago. The hardware seems to be reading the memory at the wrong address, or interpreting it as another format. In any case, it looks more like a driver bug than anything else. Though there’s probably more to it than just fragment programs and compressed textures in general, because I’ve got that combination working in all my apps on the same driver, so I guess we’re doing something different.

Anyway, could you please send your application with source (or just make a small repro case if you don’t want to send the full app) to me at epersson ‘at’, and I’ll take a look at it.

Yeah, it looks OK either if I don’t use fragment programs, or if I use fragment programs with uncompressed textures.

OK Humus, give me some time, I’ll make a separate application.

We had some problems with compression on ATi cards. The texture was dirty (real dirty), but not messed up as you show.

We worked around this by setting the driver quality to “best” (otherwise textures were compressed even when we asked them not to be). Our actual problem was that we wanted some textures to be compressed and some others not.

Depending on the hardware and driver version, the problem was solved.

It just points out that they override the compression format to get better performance at high resolutions. At least they used to.



When you were uploading the textures, did you give the driver any options when choosing the internal format of the texture? A while ago on ATI cards, if you specified RGBA/RGB as the internal format, the driver would often pick a 16-bit format (this is legitimate according to the OpenGL spec). Only when you requested RGBA8/RGB8 as the internal format did you get a 32-bit format.

This differed from NVIDIA, which seemed to always use 32-bit textures unless you requested a 16-bit texture (this is also legitimate).
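To make the distinction concrete, here is a minimal sketch (assuming an already-created GL context and bound texture object) of a generic versus a sized internal format request; the function name and parameters are just for illustration:

```c
/* Sketch: generic vs. sized internal formats.
 * With a generic format (GL_RGBA) the driver is free to choose the
 * actual storage, e.g. a 16-bit format on the ATI drivers described
 * above. A sized format (GL_RGBA8) explicitly requests 8 bits per
 * channel. Requires a current GL context. */
#include <GL/gl.h>

void upload_texture(const unsigned char *pixels, int w, int h)
{
    /* Generic: driver picks the storage format. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* Sized: explicitly ask for a 32-bit (8888) format. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,
                 w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}
```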

On this project we specified RGB8/RGBA8, and yet we got the same result when we asked for compression as when we didn’t.
It was very visible because it was used on a detail texture, and even more so because the texture output was scaled by 2.0.


Regarding the original poster’s problem: I verified it today. It’s a bit of an esoteric combination of things that somehow messes things up. For it to appear you have to first use 3 as the internal format, then reupload a texture to the same texture object with a compressed format, and use fragment programs.
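The sequence above can be sketched roughly as follows (a hypothetical repro, not the poster’s actual code; it assumes a current GL context plus the ARB_fragment_program and EXT_texture_compression_s3tc extensions, and the function name is made up):

```c
/* Hypothetical repro of the described trigger:
 * 1) create the texture with the legacy component count "3" as the
 *    internal format,
 * 2) reupload to the SAME texture object with a compressed format,
 * 3) render with fragment programs enabled.
 * On the affected drivers the compressed texture then shows corruption. */
#include <GL/gl.h>
#include <GL/glext.h>

void repro(const unsigned char *pixels, int w, int h)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Step 1: legacy-style upload, internal format = 3 (components). */
    glTexImage2D(GL_TEXTURE_2D, 0, 3,
                 w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);

    /* Step 2: reupload to the same texture object, now compressed. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                 w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);

    /* Step 3: enable fragment programs before drawing. */
    glEnable(GL_FRAGMENT_PROGRAM_ARB);
    /* ... bind a program and draw ... */
}
```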

I (often) get similar texture artefacts for a cube map texture when I switch my application to (windowed) full-screen mode (no change of resolution, etc.) on an “ATI Mobility 9600 Pro Turbo”-equipped notebook; no fragment programs, only the fixed-function pipeline… :frowning:

Hampel, do you have a sample app of that? If so, send it to me at epersson ‘at’ and I’ll take a look at it.