render to texture extensions

Since the new NVIDIA driver release (29.42) I have had problems with the render-to-texture extensions. wglGetProcAddress(…) returns NULL for wglBindTexImageARB(…), wglReleaseTexImageARB(…) and wglSetPbufferAttribARB(…). Has anyone else seen the same problem, or does anybody have an idea where this comes from? Here is a code snippet from the initialization function:

glGetString(GL_EXTENSIONS);

wglGetExtensionsStringARB = (PFNWGLGETEXTENSIONSSTRINGARBPROC)wglGetProcAddress("wglGetExtensionsStringARB");

wglGetExtensionsStringARB(hdc);

wglBindTexImageARB = (PFNWGLBINDTEXIMAGEARBPROC)wglGetProcAddress("wglBindTexImageARB");

wglReleaseTexImageARB = (PFNWGLRELEASETEXIMAGEARBPROC)wglGetProcAddress("wglReleaseTexImageARB");

wglSetPbufferAttribARB = (PFNWGLSETPBUFFERATTRIBARBPROC)wglGetProcAddress("wglSetPbufferAttribARB");

[This message has been edited by gerogerber (edited 06-18-2002).]

Have you tried initializing extensions with glh_init()?

You need to include <glh_extensions.h>.

No, but these extensions ARE available on my card, because the corresponding NVIDIA demos work. And from my point of view, glh_init() does nothing other than what my init function does.

Are you saying that other extensions load fine, but these ones don't?
I'm assuming you have a current RC.

You shouldn’t need this
wglGetExtensionsStringARB(hdc);

V-man

Yes, all other extensions I need load fine; only the mentioned ones are NULL.
I also have a valid device context and rendering context.

The nVidia demos might have fall-back code that works even if the extensions aren't there. The do-all solution is to look at the result of glGetString(GL_EXTENSIONS). You have it there in your code, but you're ignoring the return value; make a char pointer, assign the return value of glGetString(GL_EXTENSIONS) to it, and print it out. Maybe even write a routine that parses it, looking for the extensions.
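As a sketch of such a parsing routine (plain C; the helper name `has_extension` is mine, not from any SDK): a plain strstr() is not safe on its own, because the name you search for may be a prefix of a longer extension name, so you have to check that the match is a complete space-delimited token.

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
 * the extension string (as returned by glGetString(GL_EXTENSIONS) or
 * wglGetExtensionsStringARB). A bare strstr() would also match when
 * `name` is merely a prefix of a longer extension name. */
static int has_extension(const char *ext_string, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_string;

    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == ext_string || p[-1] == ' ');
        int ends_token   = (p[len] == ' ' || p[len] == '\0');
        if (starts_token && ends_token)
            return 1;
        p += len;  /* partial match; keep searching further on */
    }
    return 0;
}
```

You would call this once per extension you need, before fetching any of its entry points.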

No, no, no. Before this driver release, hardware shadow mapping was working; since this new driver, hw shadow mapping doesn't work anymore. I know the result of glGetString() and wglGetExtensionsStringARB(), because I check them at another place in the code. The reason wglGetExtensionsStringARB() is there is that it seems to do some initialization which must be done before creating the pbuffer.

gerogerber,
what does GetLastError() report if the return value of wglGetProcAddress is NULL?

Good hint! GetLastError() returned ERROR_PROC_NOT_FOUND. Hmm, does anybody know what exactly this tells me?

Have you tried it without the "ARB"? Possibly they just started rewriting everything for GL 1.4.

only a joke…

Same error code.

I found the problem, if anybody is interested:

Before initializing the render-to-texture function pointers, wglGetExtensionsStringARB(HDC hdc) has to be called. The problem was that I passed the return value of wglGetCurrentDC() to wglGetExtensionsStringARB(HDC hdc). This didn't work. Now I use the device context MFC returns to me via CView->GetDC()->m_hDC. This works!
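For anyone hitting the same thing, here is a minimal sketch of the order that worked, under the assumptions from this thread: hdc is the window's own device context (e.g. from MFC's CView::GetDC()->m_hDC, not from wglGetCurrentDC()), and the PFNWGL… typedefs come from wglext.h. The function name `init_render_texture_entry_points` is mine, for illustration only. Windows-only, so it won't build elsewhere.

```c
#include <windows.h>
#include <string.h>
#include <GL/gl.h>
#include "wglext.h"   /* PFNWGL... typedefs */

PFNWGLGETEXTENSIONSSTRINGARBPROC wglGetExtensionsStringARB;
PFNWGLBINDTEXIMAGEARBPROC        wglBindTexImageARB;
PFNWGLRELEASETEXIMAGEARBPROC     wglReleaseTexImageARB;
PFNWGLSETPBUFFERATTRIBARBPROC    wglSetPbufferAttribARB;

/* Returns 1 on success, 0 if any entry point is missing. */
int init_render_texture_entry_points(HDC hdc)
{
    wglGetExtensionsStringARB = (PFNWGLGETEXTENSIONSSTRINGARBPROC)
        wglGetProcAddress("wglGetExtensionsStringARB");
    if (!wglGetExtensionsStringARB)
        return 0;

    /* Call this with the window DC *before* fetching the
     * render-to-texture entry points (the driver quirk
     * discussed in this thread). */
    const char *wgl_exts = wglGetExtensionsStringARB(hdc);
    if (!wgl_exts || !strstr(wgl_exts, "WGL_ARB_render_texture"))
        return 0;

    wglBindTexImageARB = (PFNWGLBINDTEXIMAGEARBPROC)
        wglGetProcAddress("wglBindTexImageARB");
    wglReleaseTexImageARB = (PFNWGLRELEASETEXIMAGEARBPROC)
        wglGetProcAddress("wglReleaseTexImageARB");
    wglSetPbufferAttribARB = (PFNWGLSETPBUFFERATTRIBARBPROC)
        wglGetProcAddress("wglSetPbufferAttribARB");

    return wglBindTexImageARB && wglReleaseTexImageARB
        && wglSetPbufferAttribARB;
}
```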

gerogerber,

The problem where wglGetExtensionsStringARB had to be called before getting wgl extension function pointers has been fixed and should be in the next driver release.

What do you think of Cg, jra101?

Originally posted by knackered:
What do you think of Cg, jra101?

I think it's great; it takes a lot of the pain out of writing vertex/pixel shaders. I'm a little biased though.

Originally posted by jra101:
[b]gerogerber,

The problem where wglGetExtensionsStringARB had to be called before getting wgl extension function pointers has been fixed and should be in the next driver release.[/b]

Jra101, as you are also active on the Linux side, do you plan to make render-to-texture GLX extensions too?

Yes, it's good to hear that it will be fixed in the next driver release. But currently nobody really tells you that you have to call wglGetExtensionsStringARB(…) before initializing the render-to-texture functions.

[This message has been edited by gerogerber (edited 06-21-2002).]