RIVA128 & alpha channel

Hi!

I still have a Viper330 with the RIVA128 chip, and this chip doesn’t have automatic transparency on textures. This means that when I use a texture that has a “light round” in the middle and black all around, the black stays black and overwrites the picture in the background. I have to create an alpha channel myself, and only then will the card display this right.
Newer cards don’t have this, so a lot of games don’t display right on my card.

Now for the question:
Is there any way I can look up whether the video card supports this feature, so that I only have to create the alpha channel myself when it doesn’t? That would save memory for people who have automatic transparency.

thanks,
John

There is no such thing as “automatic transparency”.

If you want to use ColorKey transparency, forget about OpenGL and work with Direct3D.
(btw, the new Direct3D documentation doesn’t recommend ColorKey transparency; alpha-channel transparency is preferred now)
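For what it’s worth, emulating ColorKey transparency on top of an alpha channel is a mechanical conversion you can do at texture-load time. A minimal sketch in C — the function name and the exact-match key rule are my own illustration, not any API’s:

```c
#include <stddef.h>

/* Expand a 24-bit RGB image into 32-bit RGBA, making every pixel
 * that exactly matches the key color fully transparent. The result
 * can be rendered in OpenGL with ordinary alpha blending or with
 * alpha test (glAlphaFunc + GL_ALPHA_TEST). */
void colorkey_to_rgba(const unsigned char *rgb, unsigned char *rgba,
                      size_t pixels,
                      unsigned char kr, unsigned char kg, unsigned char kb)
{
    for (size_t i = 0; i < pixels; i++) {
        unsigned char r = rgb[i * 3 + 0];
        unsigned char g = rgb[i * 3 + 1];
        unsigned char b = rgb[i * 3 + 2];
        rgba[i * 4 + 0] = r;
        rgba[i * 4 + 1] = g;
        rgba[i * 4 + 2] = b;
        /* Key-colored pixels become transparent, everything else opaque. */
        rgba[i * 4 + 3] = (r == kr && g == kg && b == kb) ? 0 : 255;
    }
}
```

Note that exact matching inherits ColorKey’s weakness under texture filtering: pixels near the key boundary get half-blended fringes.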

> Is there any way I can look up whether the video card supports this feature, so that I only have to create the alpha channel myself when it doesn’t? That would save memory for people who have automatic transparency.

How can you save memory by doing this?
If you have a 16-bit texture, you convert it into 16-bit ARGB 1:5:5:5.
If you have a 24-bit texture, you convert it into 32-bit ARGB 8:8:8:8, but most 3D cards cannot use 24-bit textures and convert 24-bit into 32-bit anyway.

OK, maybe I chose the wrong words with “automatic transparency”, but I am not talking about DX. I chose OpenGL because it’s OS-independent, but let’s not make this an OpenGL/DX flame war…

When I want to correctly use textures that have transparent pieces, I have to create an alpha channel and tell OpenGL I have 4 components per pixel instead of 3, using GL_RGBA instead of GL_RGB.
When you have a newer card, the 3 components (RGB) suffice and the card makes the texture transparent for you, so you don’t have to create an alpha channel yourself.
This will save memory, since I load either an 8:8:8 texture with RGB values or an 8:8:8:8 texture with RGBA values. (I do understand this correctly, don’t I?)

Now, is there any way I can detect whether a card can make these textures transparent by itself (that’s why I call it automatic transparency) or not?
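OpenGL exposes no capability bit for “which blend modes does the driver really implement”, so the only practical (and admittedly ugly) workaround is to sniff the string returned by glGetString(GL_RENDERER) and special-case known-broken chips. A sketch of the idea — the substrings below are guesses at what a RIVA 128 driver might report, not verified values:

```c
#include <string.h>

/* Heuristic: decide from the GL_RENDERER string whether we should
 * build the alpha channel ourselves instead of trusting the driver's
 * blending. The chip names here are illustrative examples only. */
int needs_manual_alpha(const char *renderer)
{
    if (renderer == NULL)
        return 0;
    /* RIVA 128 drivers are reported to fake unsupported blend modes. */
    return strstr(renderer, "RIVA 128") != NULL ||
           strstr(renderer, "RIVA128") != NULL;
}
```

You would call this once after context creation, e.g. `needs_manual_alpha((const char *)glGetString(GL_RENDERER))`, and pick the texture format accordingly. It is a blacklist, so it silently misses broken drivers you have not listed.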

John

[This message has been edited by Sjonny (edited 04-28-2000).]

> When you have a newer card, the 3 components (RGB) suffice and the card makes the texture transparent for you, so you don’t have to create an alpha channel yourself.


Can you explain how exactly these newer cards manage to determine which pixels are transparent?

Direct3D has support for ColorKey transparency. One special pixel value may be used as the ColorKey; all texture pixels with that value are considered transparent. This method was very useful for software rendering, where blending was very slow. It can also be used for color-index textures. But for RGB it’s not flexible (and bad for texture filtering).

So now textures with an alpha channel are preferable.
(And OpenGL has no ColorKey support.)

Can you explain how exactly these newer cards manage to determine which pixels are transparent?

No, I can’t, because if I knew how they do it, I could probably simulate it better.
What happens now is that the picture doesn’t blend any better or worse when something is changed.
I’ll give you an example:
I downloaded the Disasteroids game and tried to fix the code. What happens on a G200 is that the logo “fades”. On my RIVA128 it does not, and the black in the BMP overwrites the space picture.
When I create my own alpha channel, the blackness in the picture is gone on the RIVA128 (can’t remember if it fades correctly), but the G200 doesn’t fade anymore.
So creating an alpha channel based on the bitmap data doesn’t work well.
A = (R+G+B) == 0 ? 0 : 255;
or
A = (R+G+B) / 3;
No matter how I create the alpha, it all gives the same (wrong) result.

Does anybody else have/understand this problem?

John

If I understand right what effect you’re trying to achieve, I would suggest that you use glBlendFunc(GL_ONE, GL_ONE) with an RGB texture.
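The suggestion above is additive blending: glBlendFunc(GL_ONE, GL_ONE) in OpenGL (the original post writes “glBlend”, but glBlendFunc is presumably what is meant). Per channel it simply adds texture and framebuffer, so a black texel adds nothing and the background shows through untouched, with no alpha channel and no dependence on which blend modes the chip fakes. A per-channel sketch of the math:

```c
/* What glBlendFunc(GL_ONE, GL_ONE) computes for each color channel:
 * result = clamp(dst + src). Black source texels (0) leave the
 * destination unchanged, which is exactly the "black is transparent"
 * effect wanted for a glowing logo on a space background. */
unsigned char additive_blend(unsigned char dst, unsigned char src)
{
    int sum = (int)dst + (int)src;
    return (unsigned char)(sum > 255 ? 255 : sum);
}
```

In the actual program you would just do `glEnable(GL_BLEND); glBlendFunc(GL_ONE, GL_ONE);` before drawing the logo quad. The trade-off is that bright backgrounds wash the logo out, since the sum saturates at white.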

I’ll give you an example:
I downloaded the Disasteroids game and tried to fix the code. What happens on a G200 is that the logo “fades”. On my RIVA128 it does not, and the black in the BMP overwrites the space picture.

NOW I can understand your question.

It happens because the chip has a limited set of blending operations. If an application wants to use an operation that is not supported in hardware, theoretically the driver should fall back to (slow) software rendering. Instead, the driver uses some “similar” but wrong method that is supported in hardware.
Actually, it is cheating.

btw, if you want to know who is the champion of cheaters, it’s the Permedia2. The drivers for this chip with “excellent OpenGL support” ignore all unsupported blending and texture environment modes.

What you are experiencing is a known bug with the RIVA128s. The problem is that they just do not support some of the blend modes. I don’t recall which ones don’t work, but I remember I had the same problem with my old 128.