GL_S3_s3tc

Hi there!

I wondered if any of you had the specifications for the GL_S3_s3tc extension.
I know they were on the S3 site a long time ago, but since S3 has been bought by Diamond, it seems the document has disappeared…

I can’t find that information anywhere…

Anyone got a clue?

Regards.

Eric

Umm, try again. Diamond was bought out by S3. You should try searching www.s3.com thoroughly.

Siwko

ADDENDUM - I just looked at the S3 site. You’re right, the S3TC stuff is gone. My suggestion: email the tech staff at NVIDIA about it (they’re supporting S3TC now, so they should be able to help), and/or check their website.

Cheers!


You might try looking at VIA’s site, since they recently bought S3’s graphics division.

I think this extension is not very standard (probably other hardware vendors don’t like it).

Looks like there is a new one:

GL_EXT_texture_compression_s3tc
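
As an aside, a quick way to see which of the two names a driver actually advertises is to scan the extension string. A minimal sketch (the helper name is mine, and it assumes a GL context is already current):

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Print which of the two S3TC extension names the current context exposes. */
void checkS3tcExtensions(void)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    if (ext == NULL)
        return;
    printf("GL_S3_s3tc: %s\n",
           strstr(ext, "GL_S3_s3tc") ? "yes" : "no");
    printf("GL_EXT_texture_compression_s3tc: %s\n",
           strstr(ext, "GL_EXT_texture_compression_s3tc") ? "yes" : "no");
}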


Well, I have tried all of that!

Concerning nVidia, they support GL_S3_s3tc only (well, at least in my Detonator 5.14 for NT 4.0). Anyway, those drivers are beta and I do not think nVidia would answer this one (well, have you EVER had an answer when posting to nVidia?!).

Thanks anyway, and I’ll come back here if I can find any information.

Regards.

Eric

Nvidia’s tech docs on the GeForce2 GTS specify that it supports all 5 DirectX texture compression modes (DXT1-DXT5), and they also state that DXT1 IS the same as S3TC. So… with that in mind, Nvidia has to be able to give some sort of reference to S3TC somewhere.

I just haven’t found it.

I HAVE an S3 card, so if you write some code and it doesn’t work on your card, maybe you could send it to me and I’ll see if your drivers have a bad implementation or something. BTW, the Voodoo4/5 supports S3TC, too…

Thanks Kaeto, but at the moment I do not have the specs of the extension to use it!

I guess it is some kind of a different texture type when using glTexImage2D (something like GL_S3TC instead of GL_RGB) but I do not know the value of GL_S3TC…

I might try to hack the drivers and see against which values the texture type is tested… (the one I do not know will be GL_S3TC… well, there might be more than one!).

I think I’ll wait until these specs are officially released or until nVidia implements the documented extension in its drivers (by the way, has any of you tried Detonator 5.22? Does it have this extension? I can’t test it ’cos it’s only for Win9x…).

Regards.

Eric

On NVidia’s website, in the whitepapers section, there is an article, and also source code, on how to use S3TC under GL…

Well, I have read the article about S3TC on the nVidia site (actually, I got in touch with the author to ask some questions).

The problem is that my drivers (Detonator 5.18 NT) do not have the extensions he describes for using S3TC (the only one I have is GL_S3_s3tc): I will have to wait for the next release of the NT Detonator…

Thanks a lot everybody !

Eric

Originally posted by Eric:
[b]The problem is that my drivers (Detonator 5.18 NT) do not have the extensions he describes for using S3TC (the only one I have is GL_S3_s3tc): I will have to wait for the next release of the NT Detonator…[/b]

Use one of the following defines for the internalformat parameter of glTexImage2D():

#define GL_RGB_S3TC 0x83A0
#define GL_RGB4_S3TC 0x83A1
#define GL_RGBA_S3TC 0x83A2
#define GL_RGBA4_S3TC 0x83A3

These should work for the GL_S3_s3tc extension you already have!
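
For example, here is a minimal sketch of uploading a texture through this extension (assuming the defines above, a driver that exposes GL_S3_s3tc and compresses the data itself, and hypothetical RGBA source data in pixels):

#include <string.h>
#include <GL/gl.h>

#define GL_RGBA_S3TC 0x83A2

/* Upload an RGBA texture, asking the driver to store it S3TC-compressed
   when GL_S3_s3tc is available, and falling back to plain GL_RGBA otherwise. */
void uploadTexture(GLsizei width, GLsizei height, const GLubyte *pixels)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    GLint internal = (ext && strstr(ext, "GL_S3_s3tc")) ? GL_RGBA_S3TC : GL_RGBA;

    glTexImage2D(GL_TEXTURE_2D, 0, internal, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

The source data stays ordinary uncompressed RGBA; only the internal format asks the driver to compress it.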

Kosta

I’ve never understood the texture compression extensions, so maybe someone could clear this up for me? I mean, wouldn’t compressing and decompressing cause a performance hit for the sake of saving texture memory? Does anyone have any benchmarks on using compressed textures vs. normal textures?

Thanks Kosta, I’ll have a go with that.

Actually, the 5.22 for NT should be released any time soon, so perhaps I won’t have time for that! But thanks anyway!!!

To fenris: since the decompression is done by the hardware, we can just hope that there will be no performance hit. Moreover, is it better to lose time decompressing or to lose time uploading textures each frame?! I do not have the answer, but I’d go for the first choice!

Well, perhaps I say that because I am running a Geforce256 and planning to jump to a Geforce2GTS…

Regards!

Eric

Actually, S3TC will give a performance INCREASE. It’s quite simple to decode, and at least on S3 cards there’s a dedicated decoder for it. The performance increase comes from the fact that less time is spent reading the texture (from main memory OR video card memory).

S3TC compression has much in common with palettized textures; the difference is that you split the texture into 4x4 blocks, each block stores a small number of colors (4, I think), and a couple of bits per pixel (2 if there are 4 colors) index between those colors. This is easily implemented in hardware and speed is no problem (it’s not like decoding a JPEG). With memory bandwidth becoming more and more of a problem, this will be a very important feature, as it reduces texture memory bandwidth needs to 1/4 (or gives you 4x as large a texture in the same amount of memory), thus increasing performance.

Yeah, S3TC stores two 16-bit colors, interpolates 2 more in between those, and then stores sixteen 2-bit indices (one per texel of the 4x4 block).

Yeah, I forgot that it interpolated between the colors (it’s a long time since I read about it)… I couldn’t get it to an appropriate number of bits when I counted it up, but with the 2 interpolated colors it comes to 64 bits per block. Thanks for clarifying.
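
To make the arithmetic concrete: 2 x 16-bit colors plus 16 x 2-bit indices is exactly the 64 bits per 4x4 block mentioned above, i.e. 4 bits per texel. Here is a rough decoder sketch for the simple four-color mode (the struct layout and names are my own; a real decoder also has to handle the mode where one palette entry is transparent):

#include <stdint.h>

/* One 4x4 S3TC block: two 16-bit colors + sixteen 2-bit indices = 64 bits. */
typedef struct {
    uint16_t color0;   /* RGB 5:6:5 */
    uint16_t color1;   /* RGB 5:6:5 */
    uint32_t indices;  /* 2 bits per texel, 16 texels */
} S3tcBlock;

/* Expand a 5:6:5 color to 8 bits per channel. */
static void unpack565(uint16_t c, uint8_t rgb[3])
{
    rgb[0] = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    rgb[1] = (uint8_t)(((c >> 5) & 0x3F) * 255 / 63);
    rgb[2] = (uint8_t)((c & 0x1F) * 255 / 31);
}

/* Decode one block into 16 RGB texels: palette entries 2 and 3 are
   interpolated 1/3 and 2/3 of the way between color0 and color1. */
void decodeBlock(const S3tcBlock *b, uint8_t out[16][3])
{
    uint8_t palette[4][3];
    int ch, i;

    unpack565(b->color0, palette[0]);
    unpack565(b->color1, palette[1]);
    for (ch = 0; ch < 3; ch++) {
        palette[2][ch] = (uint8_t)((2 * palette[0][ch] + palette[1][ch]) / 3);
        palette[3][ch] = (uint8_t)((palette[0][ch] + 2 * palette[1][ch]) / 3);
    }
    for (i = 0; i < 16; i++) {
        int idx = (int)((b->indices >> (2 * i)) & 0x3);
        out[i][0] = palette[idx][0];
        out[i][1] = palette[idx][1];
        out[i][2] = palette[idx][2];
    }
}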

Just as an aside, when I first heard that explained I was like, Dang, why didn’t I think of that? It’s so simple, elegant and effective.

Have any of you bothered to check out a game that uses S3TC compression? At least when I use Detonator 5.22 and my GeForce DDR, it looks like crap in Quake 3: wrong colors and a blocky appearance.

Instead of using compression you should just use smaller textures.

That’s ’cuz your drivers are messed up. On my Savage2000, S3TC doesn’t produce a noticeable decrease in image quality. (I can force it on, and have played every game I have with it both on and off, and seen no problems because of it.)