3D Textures

The newest NVidia OpenGL extension specs say that the GeForce3 supports 3D textures in hardware. If I use 3D textures, they run very slowly in software (12.41 drivers).

Is there any way to activate hardware-accelerated 3D textures?
Which driver release is going to support 3D textures in hardware?
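For what it’s worth, whether the driver exposes 3D textures at all can be checked at runtime, either via the extension string or the GL version (3D textures are core in OpenGL 1.2). A minimal sketch, assuming a helper name of my own choosing; note that a plain `strstr()` on the extension string is a classic bug, since one extension name can be a prefix of another:

```c
#include <string.h>

/* Check for an extension name in a GL_EXTENSIONS-style string.
 * A plain strstr() is wrong: the name could match inside a longer
 * extension name, so we only accept whole space-separated tokens. */
static int has_gl_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == extensions) || (p[-1] == ' ');
        int ends   = (p[len] == '\0') || (p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

In a real program you would pass `(const char *)glGetString(GL_EXTENSIONS)`, and also look at `glGetString(GL_VERSION)` for “1.2”. Neither tells you whether the path is hardware-accelerated, though; that you can only find out by benchmarking.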

According to Cass Everitt, who is one of the nVIDIA guys who spend a lot of time helping people on these discussion forums (along with Matt Craighead), nVIDIA should issue an official statement about this.

The official word we have had up to now is that GF3 does not support 3D textures in hardware (although John Carmack said it did in one of his interviews/plan file).

I have tried to contact several people at nVIDIA (both in the USA and in the UK) regarding this problem but none of them can answer.

I guess we have to wait for this “official statement”.

By the way, Cass: didn’t they forget about that?



P.S.: from a personal point of view, I thought the GF3 (NV20) would not support 3D textures in HW but that the NV25 (the upcoming one!) would.

In my opinion, the GeForce3 has 3D texture support built in, but the drivers currently do not expose it because 3D textures are not important for current computer games. Maybe they really want to reserve them for the X-Box.

Another thing that is not clear: the OpenGL extension specs mark the texture_shader2 and vtc_compression extensions as not supported at all.

How does that make sense?

If you do a search on this forum, there are at least 2 other lengthy threads on this topic.

Unfortunately, we’ve had no final official word on it yet, so everything is just hearsay and speculation.

– Zeno

I was looking for a GF3, and this is what I found on this page http://www.teppro.com/impact5-2.htm
(That board is based on a GF3 GPU)

Looking at the features, I was very surprised when I saw this:

· Full hardware support for DirectX 8.0, D3D and OpenGL 1.2
· Volume textures support

Full hardware support for D3D (including 3D textures) and OpenGL 1.2? Volume textures support? What happens if I buy that board and then that’s not true? (This applies to all GF3 boards, of course, since they all have the same 3D capabilities.)

I think the GF3 is capable of HW 3D textures (if not, I don’t understand why there is so much confusion). I hope they’ll enable that capability soon…

  • Royconejo.

[This message has been edited by royconejo (edited 06-11-2001).]

As for VTC and Texture_Shader_2, both of them require 3D Textures. If the implementation doesn’t support 3D Textures, having those other two extensions isn’t very useful.

Much like NV10 supported register combiners before the NV15 drivers came out, NV20 likely supports 3D textures. That support will, in my opinion, continue to be hidden until X-Box launches.

Yes, right … both require 3D textures.

But 3D textures are marked as supported in hardware by the GeForce3 chip.
So why are these two useful extensions not supported at all?
Why does NVidia propose new extensions and then not support them?
That doesn’t make sense, does it?

Just to note it in every thread about 3D textures: I can run them in software on my GF2… and they look nice (although… 2 fpm…). It’s not the EXT extension that is supported but GL 1.2, where 3D textures are standard. Now I just wonder why they would implement a software emulator for something they can’t use? I mean, texture shaders are in software too, as are vertex programs, register combiners with up to 8 combinations, etc…
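2 fpm in software is no surprise: even before any filtering cost, a mipmapped volume is a lot of data for the CPU to chew through. A rough footprint calculation (RGBA8 with a full mip chain; helper name is mine):

```c
#include <stddef.h>

/* Total bytes of an RGBA8 volume texture including the full mipmap
 * chain. Each level halves every dimension (clamped at 1) until the
 * 1x1x1 level is reached. */
static size_t volume_bytes(size_t w, size_t h, size_t d)
{
    size_t total = 0;
    for (;;) {
        total += w * h * d * 4;     /* 4 bytes per RGBA8 texel */
        if (w == 1 && h == 1 && d == 1)
            break;
        if (w > 1) w /= 2;
        if (h > 1) h /= 2;
        if (d > 1) d /= 2;
    }
    return total;
}
```

A 128³ RGBA volume already comes to roughly 9.1 MB, and trilinear filtering reads 8 texels per fragment (16 when interpolating between mip levels), all done per pixel on the CPU in the software path.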

Can’t wait for the official statement…

What the hell have 3D textures on a PC accelerator card got to do with the release of the Xbox? People who want 3D textures the most on GF3s are probably not bothered that the Xbox wants to have them first. I can’t see the logic there, if that’s true. I honestly can’t believe nVidia would not expose 3D textures in OpenGL just because M$ wants hardware 3D textures in the Xbox first. Other PC cards have 3D textures already.

What is more likely is that perhaps the hardware was taken out, and they’re trying to figure out a fast, workable solution using the remaining silicon of the GF3 chip, now that they realise loads of people want 3D textures.


I have a feeling that Matt and Cass are sitting laughing at us mere mortals…

Perhaps they are covering up some major mistake made when trying to get the NV20 out on time. I read a statement somewhere, which I was sure was official, saying that they ran out of time to fully implement 3D textures with proper hardware support and all the extras, and so they are waiting for a future release like the NV25 before fully supporting them.

I figure nVidia is simply providing payment for services rendered. Keeping out 3D textures until the X-Box (which already exposes them) arrives ensures that no PC games (using nVidia’s hardware, at least) support 3D textures until a few months after the X-Box launches. This makes the feature set in the X-Box more advanced than most PC games, for a little while at least. Microsoft obviously benefits from this.

What services did Microsoft render for this payment? How about shafting every non-nVidia card developer on DirectX 8. After all, the vertex shader and pixel shader specs are written based on nVidia hardware (except for register combiners). In order to implement them, one either has to emulate them (low performance), re-work their hardware to conform to nVidia’s specs, or be nVidia. Card producers who can’t implement DX 8 fully will be left behind, even though their cards may have similar enough features.

There are just too many quirks in pixel shaders, like the entire texture setup stage which mimics the texture shader specification almost exactly, right down to the smallest quirk of nVidia hardware. Other developers have to somehow port their hardware into nVidia’s hardware, which is always going to induce some inefficiency.

I think you’ll find that’s not exactly true. In designing the PS1.0 instruction set, I believe MS spoke to all relevant card manufacturers and even made allowances for limitations in the hardware of certain people who are no longer with us (PS1.1 makes things slightly better). And if this were the case, don’t you think the whole functionality of register combiners would be exposed in the shader language? Certainly I believe nVidia did have a substantial say in the spec, but that’s fair considering the direction they are heading. I don’t think ATI are having problems producing DX8-capable hardware either…

I do hope that by some form of magic nVidia get 3D textures to work on the GF3, but I’d be amazed if the reason they don’t work at the moment is because MS asked them to hold it back. (Don’t you think the fact that the XBox is in a different market, that its graphics chip can push loads more triangles, that it’s UMA, etc., is enough to differentiate it?) More likely, due to time constraints a problem occurred when 3D textures were activated, and nVidia rightly chose not to expose them as they weren’t 100% right, and chose to wait and see if they could come up with a workaround.

Anyway, this is getting slightly OT…

I’ve just heard something - I think soon you will be happy people with your spangly GF3’s…

This appeared this morning in the DX List:


>> 1. My GeForce 3 does not have Volume Texture support? I
>> thought it did.
>It does not. My naive and/or badnatured assumption is that
>they didn’t put this in to make sure this ATI feature /
>selling point stays marginalized due to inadequate support.
>If both major vendors would support this feature,
>developers would take it more seriously. We then would have
>games taking advantage of it, and more Radeons would be
>sold. I don’t know if its the same deal with N-Patches
>(TruForm), which IMO is a killer tech due to ease of use.
>ATI may have secured exclusive rights to that or something
> – maybe somebody could shed some light on this?

Ahh, good, a conspiracy theory… :)

But, as usual, life’s simpler than that.

The GeForce3 does support volume textures (aka 3D textures). The current
driver does not. We did not add support to the recent release for the
simple reason that we were more interested in getting the drivers released
than in getting the fullest support possible for what are presently marginal
features.

We will add support for volume textures in a future driver, and that support
will cover the GeForce3s that have already shipped and all their
derivatives in the future. It will also include support for volume
texture compression.

As for N-Patches, NVIDIA’s position is that these are a feature which really
doesn’t fit well into the current technological thrust (as well as having a
few quite specific technical problems). They’re something of a dead-end and
we don’t plan to support them.

The decision about what you support in your game is up to you. We just want
to make sure that you understand our position when you make your decision.

>> 2. On my GeForce 3 I get an INFO: Driver failed to create
>> index buffer in our particle system when I call DIP. But
>> the DIP call succeeds.
>If you search the archives, this was answered earlier as an
>indication of geForce HW not putting index lists into video
>memory (which they traditionally don’t do), or something
>like that. So it’s not a problem. I am surprised this gets
>logged out at all.

Yes, it’s harmless, ignore it.


Richard “7 of 5” Huddy
Developer Relations, NVIDIA Corporation.

That’s funny, because I contacted Richard Huddy two weeks ago about this and he never came back to me (he said he had to ask people in the US)… I thought he was not allowed to comment on this. I guess they have decided to release the information, then!

Well, that’s good news anyway!



Hope that’s going to happen soon. And hopefully texture_shaders2 will also be supported.

How about giving us developers access to the newest beta drivers from NVidia with 3D texture support? Since the restructuring of NVidia’s registered developer site, my login is disabled and nobody at NVidia answers my emails.

I am going to present a new volume rendering approach at this year’s Siggraph/Eurographics Graphics Hardware Workshop in LA (http://www.graphicshardware.org). 3D textures and texture_shaders2 would make this approach more valuable: http://wwwvis.informatik.uni-stuttgart.de/~engel/pre-integrated/

Powerangel, there is nothing new in the drivers section of the Registered Developer web site (just in case you were wondering).

I do not fully understand why nVIDIA decided to close the whole thing and then re-authorize each access one by one, but I doubt this is really due to leaked drivers (as they said). Although I have been registered for some time, I have almost always found newer leaked drivers on the net than on the RD web site. As far as I understand, the leaks do not come from the RD web site but from the OEMs themselves…

Anyway, if the purpose of changing the RD web site was to make REAL beta drivers accessible to developers (i.e. with HW 3D texture support! :) ), that would be REALLY good news!

As for nobody answering your e-mails at nVIDIA, I suppose it is because they have received tons of them… It will probably take time to reactivate the accounts.



Quote : “As for N-Patches, NVIDIA’s position is that these are a feature which really
doesn’t fit well into the current technological thrust (as well as having a
few quite specific technical problems). They’re something of a dead-end and
we don’t plan to support them.”

I find this really funny!
And it’s not the first time these kinds of words have appeared on these forums.
Every feature not FIRST implemented by NVidia is considered by that same NVidia as marginal or useless!
I agree this is normal marketing war, but it has NO PLACE on the OpenGL developer board…
The customer has to be careful and not fall for dirty business tricks.
3D textures for per-pixel lighting simply rock. N-Patches require almost no work from the developer and don’t affect performance in a significant way. Another example: EMBM. It’s easier and faster to use the ATI extension than to do all this stuff with the combiners/shaders.
The only really useful thing for us programmers would be for the manufacturers to work closer together and give us devs more vendor-independent extensions.
My 1/100 cent
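As an aside, the per-pixel lighting use mentioned above usually means an attenuation volume: a small 3D texture holding a radial falloff such as 1 − d², indexed by the fragment’s light-space position. A sketch of building one (function name is illustrative; the upload itself would go through glTexImage3D):

```c
#include <stdlib.h>

/* Fill a size^3 luminance volume with the attenuation 1 - d^2, where
 * d is the distance from the cube center, with texel centers mapped
 * to [-1, 1] on each axis. Addressed with light-space coordinates
 * remapped to [0, 1]^3, it gives per-pixel distance attenuation. */
static unsigned char *make_attenuation_volume(int size)
{
    unsigned char *tex = malloc((size_t)size * size * size);
    for (int z = 0; z < size; ++z)
        for (int y = 0; y < size; ++y)
            for (int x = 0; x < size; ++x) {
                double dx = (x + 0.5) / size * 2.0 - 1.0;  /* [-1, 1] */
                double dy = (y + 0.5) / size * 2.0 - 1.0;
                double dz = (z + 0.5) / size * 2.0 - 1.0;
                double att = 1.0 - (dx * dx + dy * dy + dz * dz);
                if (att < 0.0) att = 0.0;                  /* clamp */
                tex[(z * size + y) * size + x] =
                    (unsigned char)(att * 255.0 + 0.5);
            }
    return tex;
}
```

A 32³ volume like this costs only 32 KB and replaces several combiner stages’ worth of per-pixel math with a single texture fetch.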

This is another reply from Richard that can help you:

>I really like conspiracy theories - did you have to dispel
>that one so soon?
>I’m really confused about the GF3-volume texture issue as
>I’ve been told flat out by many people from nVidia (possibly
>even yourself at the Gathering) that they were not
>supported, and to wait for a future card.

Agreed. At the time of The Gathering we didn’t expect to be able to enable
volume textures for GF3 in any reasonable timeframe. You can see from how
long it’s taken us to get our current DX8 driver out there that this isn’t a
totally unreasonable position.

Sometimes the things you’ll hear from us will seem to contradict something
which we’ve said at an earlier date. To a certain extent “that’s life”.
All we can do is give you the best information which is available at the
time - and things like this can change. The volume texture issue is one of
those. NVIDIA’s decisions can change in the light of new information that
we get about our chip yields, driver team load, market pressure, developer
demand etc.

>At the end of the day though I’ll just be happy for it to work
>- but I am worried now that there may be issues when they are
>used - like a texture stage disappears or what-not???

Don’t worry. A volume texture consumes a single texture stage. End of story.

>As for TruForm you have to admit it does look good from a
>first glance and could you say why exactly you see this as
>a ‘dead-end’? Do you see subdivision type surfaces in
>graphics cards as a no-go generally?

Well, we actually see subdivision surfaces as the main thing to aim for. I
can’t go into details about how long we expect it to take, but it’s a
serious goal of ours for consumer graphics. You can tell that from the
amount of research papers which we publish from the likes of Henry Moreton
and Dave Kirk.

This is a much more useful line of development.

And for subdivision surfaces to be really useful they should really support
all of the neat features [creases, darts, arbitrary valences etc.] Getting
that into hardware will take some serious work from some of the best
graphics theoreticians out there.

>Personally, so far I’m glad to see that patches have been the
>first to make it to hardware as it makes most sense due to
>their support in modelling programs.

Agreed. This is an important first step.

>Oh, and while I may have your attention - in future can we
>please not lose a texture stage if we turn on a clip plane -

Well, yes… Right now the reason we’ve removed the clip plane support is
precisely because of this. The API is at its most useful when it’s

[Strictly speaking the clip plane support is still present in this driver,
we just don’t expose the relevant caps. And sooner or later you can expect
this kind of support to be removed for exactly the reason you state.]

Richard “7 of 5” Huddy
Developer Relations, NVIDIA Corporation.

Zak, any word on texture_shaders2?

Maybe the community should make it clear that there is great demand for 3D textures.

BTW: I missed the discussion about clipping planes. I noticed some strange effects when using texture shaders and clipping planes at once.
So is there a texkill operation employed to implement clipping planes in the driver?
Is there any way to work around this, as our volume rendering approach requires 4 texture fetches?
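If the driver really does spend a texture stage on a texkill-style clip, one workaround from this era is to do the plane math yourself: compute the signed plane distance per vertex, feed it through an interpolant (a texture coordinate or vertex alpha), and reject fragments behind the plane with the alpha test. Whether the driver works this way internally is exactly the speculation above; the distance itself is just a dot product (sketch, helper name is mine):

```c
/* Signed distance from point p to the plane (a, b, c, d),
 * i.e. a*x + b*y + c*z + d. Positive means "in front of" the plane.
 * Interpolated across a triangle, this value can drive an alpha-test
 * reject of clipped fragments without touching a texture stage. */
static float plane_distance(const float plane[4], const float p[3])
{
    return plane[0] * p[0] + plane[1] * p[1] + plane[2] * p[2] + plane[3];
}
```

The cost is one interpolant instead of one texture stage, which may be the better trade when all 4 fetches are needed.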