Texture compression issues

It’s quite easy to write a loader for .dds files, which can store textures compressed in the S3TC format. There’s an example on nVidia’s site, plus a great plugin for saving compressed textures directly from Photoshop (it also works in Paint Shop Pro). If anyone’s interested, I also have some code for decoding S3TC textures into raw RGBA for compatibility with older cards.
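For reference, decoding a single DXT1 color block into RGBA8 can be sketched like this (a minimal sketch of mine, not code from any nVidia sample; the four palette entries are the two stored RGB565 endpoints plus either two interpolated colors, when c0 > c1, or a midpoint and transparent black otherwise):

```c
#include <stdint.h>
#include <string.h>

/* Expand a 5- or 6-bit channel to 8 bits by replicating the top bits. */
static uint8_t expand5(uint8_t v) { return (uint8_t)((v << 3) | (v >> 2)); }
static uint8_t expand6(uint8_t v) { return (uint8_t)((v << 2) | (v >> 4)); }

/* Decode one 8-byte DXT1 block into a 4x4 RGBA8 tile (16 pixels). */
void decode_dxt1_block(const uint8_t *block, uint8_t out[16][4])
{
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));

    uint8_t palette[4][4];
    palette[0][0] = expand5((c0 >> 11) & 0x1F);   /* endpoint 0, RGB565 */
    palette[0][1] = expand6((c0 >> 5) & 0x3F);
    palette[0][2] = expand5(c0 & 0x1F);
    palette[0][3] = 255;
    palette[1][0] = expand5((c1 >> 11) & 0x1F);   /* endpoint 1, RGB565 */
    palette[1][1] = expand6((c1 >> 5) & 0x3F);
    palette[1][2] = expand5(c1 & 0x1F);
    palette[1][3] = 255;

    if (c0 > c1) {         /* four-color mode: two interpolated colors */
        for (int ch = 0; ch < 3; ch++) {
            palette[2][ch] = (uint8_t)((2 * palette[0][ch] + palette[1][ch]) / 3);
            palette[3][ch] = (uint8_t)((palette[0][ch] + 2 * palette[1][ch]) / 3);
        }
        palette[2][3] = palette[3][3] = 255;
    } else {               /* three-color mode: midpoint + transparent black */
        for (int ch = 0; ch < 3; ch++) {
            palette[2][ch] = (uint8_t)((palette[0][ch] + palette[1][ch]) / 2);
            palette[3][ch] = 0;
        }
        palette[2][3] = 255;
        palette[3][3] = 0;
    }

    /* 2-bit indices, one byte per row of the 4x4 tile, LSB first. */
    for (int i = 0; i < 16; i++) {
        int idx = (block[4 + i / 4] >> (2 * (i % 4))) & 0x3;
        memcpy(out[i], palette[idx], 4);
    }
}
```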

Originally posted by schelter:
But even if the normal surface is smoothly varying, with only minimal curvature, each 4x4 block of the texture can only represent 4 discrete values, and they are linear and evenly spaced along the line in RGB565 space.

I wasn’t talking about the compressed normal maps here, just plain filtered ones (that’s why Moz “refocused” the discussion !).

Thanks for the info anyway !

Regards.

Eric

The problem is that the quality/compression ratio of S3TC is pretty poor. I use it for many textures where it’s acceptable – however, I still want to use JPEG in 10:1 compression mode for actually storing the textures on disk, as that gives smaller files and higher quality. Smaller files are really important to me.

Transparency is stored as a separate bitmap: either a grayscale JPEG (for 8 bits of transparency) or a 1-bit PNG (for cut-outs). Thus, actually loading a texture consists of:

  1. Decompress the colormap JPEG
  2. (maybe) Decompress the alpha JPEG or PNG
  3. (maybe) Combine them into BGRA data
  4. Tell the driver to re-compress as S3TC
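Steps 3 and 4 above can be sketched like this (the helper name is mine, not from the poster's code; the upload in step 4 is a single glTexImage2D call with a compressed internal format, shown in a comment since it needs a GL context):

```c
#include <stdint.h>
#include <stddef.h>

/* Step 3: interleave decoded RGB color data with a separately decoded
 * 8-bit alpha plane into BGRA, a layout most drivers upload quickly.
 * Pass alpha == NULL for fully opaque textures. */
void combine_bgra(const uint8_t *rgb, const uint8_t *alpha,
                  uint8_t *bgra, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        bgra[4*i + 0] = rgb[3*i + 2];            /* B */
        bgra[4*i + 1] = rgb[3*i + 1];            /* G */
        bgra[4*i + 2] = rgb[3*i + 0];            /* R */
        bgra[4*i + 3] = alpha ? alpha[i] : 255;  /* A */
    }
}

/* Step 4 is then a single upload where the driver compresses on the fly:
 *   glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
 *                w, h, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, bgra);
 */
```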

Note that I use the compression so that I can fit on a 32 MB card, and possibly to increase my texturing rate; I do NOT view hardware texture compression as a reasonable delivery format.

Yes, this is really slow, so we prefer to upload textures incrementally rather than take one big, brief frame-rate hit.

JPEG is block-based. When will we see JPEG-based hardware texture decompression? :slight_smile:

Please take into account that S3TC is not an easy format to compress into. The tradeoff between compression quality and compression performance can be brutal. So you can expect superior results with an offline compressor.

Quake 3 gets away with this because Q3 uses such small textures. (Q3 also misuses S3TC by using it for some textures where it shouldn’t have been used, such as lightmaps.)

Even though I understand the desire to get higher compression ratios for storage than S3TC can provide, I recommend you investigate alternative approaches. I’m sure you could roll your own lossless compression format that sits on top of S3TC that would accomplish your goal in a much more efficient way overall.

The fundamental design goal of schemes like S3TC is easy decompression.

You will be punishing yourself if your app compresses S3TC textures at runtime. A few 2Kx2K textures would probably already take a prohibitive time to load. Alternative approaches will give you both better quality and performance.

  • Matt

Originally posted by jwatte:
[b]Thus, actually loading a texture consists of:

  1. Decompress the colormap JPEG
  2. (maybe) Decompress the alpha JPEG or PNG
  3. (maybe) Combine them into BGRA data
  4. Tell the driver to re-compress as S3TC

Note that I use the compression so that I can fit on a 32 MB card, and possibly to increase my texturing rate; I do NOT view hardware texture compression as a reasonable delivery format.
[/b]

I don’t see why DXT isn’t a reasonable delivery format for you. It doesn’t seem like quality is your issue. DXT1 gives you 6:1 compression over RGB8. Are you really that tight for space?

I’ll say it again. If you ask the driver to compress for you, you don’t really know what quality or performance you will get, or whether you will actually get any compression at all. It is perfectly legal for a driver to completely ignore your request for load-time compression, even when the hardware can decode compressed textures.

JPEG can introduce some nasty artifacts into textured primitives. Re-compressing a JPEG to DXT doesn’t seem like a good quality decision to me. YMMV.

I wholeheartedly agree with Matt: Compress with your own tools, and examine the quality. Driver compression is useful during development, but don’t use it when you ship.

J

While on the subject of texture compression:
Why does DXT1 decompress with less quality than DXT5 on images without an actual alpha channel? (I am talking about nVidia boards.) The spec says all the DXT formats treat the RGB part in the same way.
This is not a compression issue, but a DEcompression issue.
I tested the problem like this:
I define a user-memory RGBA8 image and give it over to the driver to compress into a DXT5 texture. It displays nicely. Then I read back the compressed data using glGetCompressedTexImageARB, remove the alpha blocks (the first 8 bytes of each 16-byte 4x4 pixel block), and pass this data to an RGB_DXT1 texture. I expected the visual rendering result to be of the same quality.
It is not.
Any ideas?
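That readback experiment boils down to a few lines of plain C (the helper name is mine). One caveat worth noting: DXT5 color blocks always decode in four-color mode, while DXT1 switches to three-color-plus-transparent mode when c0 <= c1, so any block the DXT5 compressor happened to emit with c0 <= c1 is not guaranteed to decode identically as DXT1.

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Each 4x4 block is 16 bytes in DXT5 (8 alpha + 8 color) but only
 * 8 bytes in DXT1 (color only).  Dropping the leading alpha half of
 * every block yields data bit-compatible with the DXT1 color layout.
 * Returns the size of the resulting DXT1 data in bytes. */
size_t strip_dxt5_alpha(const uint8_t *dxt5, size_t dxt5_size, uint8_t *dxt1)
{
    size_t blocks = dxt5_size / 16;
    for (size_t b = 0; b < blocks; b++)
        memcpy(dxt1 + b * 8, dxt5 + b * 16 + 8, 8);  /* keep color half */
    return blocks * 8;
}
```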

It’s a hardware issue. This problem is finally fixed with the GF4, according to the review at Tom’s Hardware.

Regarding distribution of textures, I’ve found compressed .dds files to be much more suitable than anything else (for me at least). For normal maps, though, I usually use a grayscale heightmap in PNG format which I convert to a normal map at load time.

Originally posted by Humus:
It’s a hardware issue. This problem is finally fixed with the GF4, according to the review at Tom’s Hardware.

All the web references to this (including Tom’s) describe it as a “16-bit color problem”. But all the DXT formats store each of the (two) base colors in 16 bits, and DXT5 is no different. But I hope he got the fact that it’s fixed in the GF4 right…

mcraighead, tell us they’ve improved the DXT1 format support on the GF4, so we can finally use it when we don’t need Alpha… Please tell us!!!

AFAIK, the GeForce DXT1 problem was due to the fact that the interpolation (= DXT decompression) was done in 16 bits for DXT1. NVIDIA said they were strictly following some DXTC spec which required 16-bit interpolation for DXT1.

Now, I hope mcraighead will tell us the real reasons but I doubt he will. (NVIDIA never really did)

I just hope that the GF4 really fixes the problem, and that it’s not the drivers forcing DXT1 to DXT3.

DXT1 quality is greatly improved on both GF4 MX and GF4 Ti.

  • Matt

> Even though I understand the desire to get
> higher compression ratios for storage than
> S3TC can provide, I recommend you
> investigate alternative approaches. I’m
> sure you could roll your own lossless
> compression format that sits on top of

Perhaps, but the investigation done so far hasn’t been able to beat JPEG when it comes to the compression × quality factor.

> You will be punishing yourself if your app
> compresses S3TC textures at runtime. A few
> 2Kx2K textures would probably already take
> a prohibitive time to load. Alternative

Texturing over AGP is better? I don’t think so. Let me set the stage:

The application may run for days.

The application will see textures that didn’t even exist when it started, that need to be transmitted over a network.

The “network” is a 56 kbit modem (which means 30 kbit for all intents and purposes).

Everything the application draws, plus framebuffers, will not fit on a 32 MB card uncompressed.

As funneling in new textures is already slow, we’ve chosen to upload newly downloaded textures a little bit per frame, so a 1024x1024 texture may take a second to fully upload. That’s OK, as it probably took longer than that to arrive over the modem anyway.

Meanwhile, the frame rate stays predictable, which is more important than fast loading of new textures for this application.
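The per-frame budgeting described above can be sketched like this (a hypothetical helper, not the poster's actual code; the returned row range would then be sent with glTexSubImage2D). With a 64 KB/frame budget, a 1024x1024 BGRA texture (4 MB) takes 64 frames, roughly a second at 60 fps:

```c
/* How many scanlines of a partially uploaded texture to send this
 * frame, given a per-frame byte budget.  Caller advances next_row by
 * the return value and calls glTexSubImage2D on that row range. */
int rows_this_frame(int total_rows, int row_bytes, int budget, int next_row)
{
    int rows = budget / row_bytes;
    if (rows < 1) rows = 1;                             /* always make progress */
    if (next_row + rows > total_rows)                   /* clamp final slice */
        rows = total_rows - next_row;
    return rows;
}
```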