Folks, nVidia’s dev relations group is unresponsive, so I’m posting the question here and hoping for useful replies!
Look at this old S3TC texture compression sample, available here from nVidia:
The file s3tc.c refuses to run unless the WGL_EXT_swap_control extension (which controls vsync) is present.
I think it’s a mistake. Does anyone know otherwise? Is there any reason at all that the texture compression extensions would fail on hardware without that vsync extension?
I can’t test this fully: all of our nVidia hardware has the vsync extension, so the check never fails for us.
My concern is that some legacy graphics chip, circa 1999 or so, shipped in multiple flavors, and that texture compression is broken on the flavors lacking the vsync extension… Ideally, I can just ignore the fact that the check appears in their public sample code.