WGL_ARB_render_texture crash

http://tyrannen.starcraft3d.net/WorkingRenderTexture.zip

Here is a working demo.
From top left to bottom right:

a simple 2D texture, created with a for-loop, rotating (Bruno asked a question about this… here is proof that it really is that simple; see the sketch after this list)

next, GL_TEXTURE_2D not enabled => white (wow… that was a big topic, until I realized this!)

my render texture gets rendered and is presented on quad number 3 (it's rendered at only 64x64 so you can see the texels)

a simple frustum with a cube, with my texture on it…

green background

and yet it's nearly 64 KB… I'm not a good demo coder, am I?
(and if you want, take a look at the big, lovely log file)
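For anyone who wondered along with Bruno how simple quad 1 really is, here is a minimal sketch (not the demo's actual code; the pattern and helper name are made up) of filling a 64x64 texture in a plain for-loop, plus the glEnable(GL_TEXTURE_2D) call whose absence turns quad 2 white:

[code]
#include <GL/gl.h>

/* Hypothetical helper, not from the demo: fill a 64x64 RGB texture
 * in a plain for-loop and upload it with glTexImage2D. */
static GLuint createLoopTexture(void)
{
    static GLubyte pixels[64][64][3];
    GLuint id;
    int x, y;

    for (y = 0; y < 64; ++y) {
        for (x = 0; x < 64; ++x) {
            pixels[y][x][0] = ((x ^ y) & 8) ? 255 : 64; /* some pattern */
            pixels[y][x][1] = (GLubyte)(x * 4);
            pixels[y][x][2] = (GLubyte)(y * 4);
        }
    }

    glGenTextures(1, &id);
    glBindTexture(GL_TEXTURE_2D, id);
    /* GL_NEAREST so the individual texels stay visible, as on quad 3 */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 64, 64, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);

    /* forget this and the quad renders plain white (quad 2): */
    glEnable(GL_TEXTURE_2D);
    return id;
}
[/code]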

Hey, that log file was nice and juicy, lol. But it's not the size of the demo, it's the colors!

Heya, that’s odd:

LogFile by davepermen

thanks to these web-resources {
    nehe.gamedev.net
    www.flipcode.com
    www.libsdl.org
    www.opengl.org
    www.nvidia.com
    www.ati.com
}
in memories of {
    www.starcraft3d.net
}
great stuff {
    www.cfxweb.net/~projectfy/
    www.nutty.org
    http://esprit.campus.luth.se/~humus/
}
Logging of the Program {
    MainSettings::Init(main.cfg) {
        OpenGL::Create(640,480,32,0) {
            SDL OpenGL initialized successfully
            OpenGL Version 1.2 is minimum requirement, but not supported
        }
    }
    Texture2D::create() {
        Texture ID: 1
    }
    RenderTexture2D::create() {

And then it crashes: the window borders are already there, the client area is not. Invalid something-or-other in module <unknown>. What's particularly strange is the version line. Here's a quote from my own logs:

> switched to 640x480 fs=0 style=0x14c00000
> desktop resolution is 1024x768 pixels, aspect ratio 1.333
> r_init
> GL driver: Radeon 7200 DDR x86/MMX/3DNow!/SSE
> version: 1.3.3031 Win9x Release
> stack depths (MODEL, PROJ, TEXTURE): 32, 10, 10

And yes, pbuffer and render_texture extensions are supported.

PS: Shhh, don’t tell ATI about my driver version
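For readers who haven't used WGL_ARB_render_texture: quad 3 boils down to a round trip like the following sketch. The tokens and entry-point signatures come from the extension specs; the pbuffer setup and the helper name are assumptions, not davepermen's code.

[code]
#include <windows.h>
#include <GL/gl.h>

/* From WGL_ARB_pbuffer / WGL_ARB_render_texture; values per the specs.
 * Real code pulls these from wglext.h and wglGetProcAddress. */
#ifndef WGL_FRONT_LEFT_ARB
#define WGL_FRONT_LEFT_ARB 0x2083
#endif
DECLARE_HANDLE(HPBUFFERARB);
typedef BOOL (WINAPI *PFNWGLBINDTEXIMAGEARBPROC)(HPBUFFERARB, int);
typedef BOOL (WINAPI *PFNWGLRELEASETEXIMAGEARBPROC)(HPBUFFERARB, int);
/* resolved via wglGetProcAddress at startup (not shown) */
static PFNWGLBINDTEXIMAGEARBPROC    wglBindTexImageARB;
static PFNWGLRELEASETEXIMAGEARBPROC wglReleaseTexImageARB;

/* Hypothetical helper: assumes the pbuffer was created with the
 * WGL_TEXTURE_FORMAT_ARB/WGL_TEXTURE_TARGET_ARB attributes and that
 * the scene has already been rendered into it. */
static void drawWithRenderTexture(HPBUFFERARB pbuffer, GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    wglBindTexImageARB(pbuffer, WGL_FRONT_LEFT_ARB);    /* pbuffer -> texture */
    /* ... draw quad 3 textured with the pbuffer contents ... */
    wglReleaseTexImageARB(pbuffer, WGL_FRONT_LEFT_ARB); /* give it back */
}
[/code]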

the crash will not be in the final engine, for sure…

the problem is simple:
I initialize the GL 1.2 functions, and one or several of them don't exist on the Radeon; I think it's the paletted-texture functions that are not supported…

I'll drop this at some point and load only the extensions I actually need…

and… even if you had GL 1.2, it would crash afterwards, because it also initializes some NVIDIA-only extensions, namely vertex programs, register combiners and texture rectangles…
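In other words, the entry points are fetched but never null-checked: wglGetProcAddress returns NULL for anything the driver doesn't export, and calling through that NULL is exactly the crash above. A minimal sketch of the defensive version (the helper and fallback names are mine, not the engine's):

[code]
#include <windows.h>
#include <GL/gl.h>

/* Signature as in the NVIDIA-style headers for EXT_paletted_texture. */
typedef void (APIENTRY *PFNGLCOLORTABLEEXTPROC)(GLenum, GLenum, GLsizei,
                                                GLenum, GLenum, const void *);
static PFNGLCOLORTABLEEXTPROC pglColorTableEXT = NULL;

/* Hypothetical helper: resolve one entry point and report failure
 * instead of leaving a NULL pointer to be called later. */
static int loadEntryPoint(const char *name, PROC *out)
{
    *out = wglGetProcAddress(name);
    return *out != NULL;
}

/* usage at startup:
 *   if (!loadEntryPoint("glColorTableEXT", (PROC *)&pglColorTableEXT))
 *       disablePalettedTextures();  // hypothetical fallback: skip the feature
 */
[/code]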

Originally posted by davepermen:
[b]the crash will not be in the final engine, for sure…

the problem is simple:
I initialize the GL 1.2 functions, and one or several of them don't exist on the Radeon; I think it's the paletted-texture functions that are not supported…

I'll drop this at some point and load only the extensions I actually need…

and… even if you had GL 1.2, it would crash afterwards, because it also initializes some NVIDIA-only extensions, namely vertex programs, register combiners and texture rectangles…[/b]

OK, I can accept that. But it's still a bit strange that your program complains about missing GL 1.2 support when I really have a GL 1.3 driver, as you can see in my own log file. Don't you think?

ATI left out some functions for GL 1.2, that's the problem…
As I won't use paletted textures anyway, I'll drop them, so you'll get your 1.2 features back.
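For the version complaint specifically: here is a minimal sketch (my suggestion, not the engine's code) of answering "is GL 1.2 there?" from the GL_VERSION string itself, so a 1.3 driver that merely lacks optional entry points still passes the check:

[code]
#include <stdio.h>
#include <GL/gl.h>

/* Parse "major.minor..." out of glGetString(GL_VERSION); the Radeon
 * above would report 1.3 here even with glColorTable & co. missing. */
static int hasGLVersion(int wantMajor, int wantMinor)
{
    int major = 0, minor = 0;
    const char *ver = (const char *)glGetString(GL_VERSION);

    if (ver == NULL || sscanf(ver, "%d.%d", &major, &minor) != 2)
        return 0;
    return (major > wantMajor) ||
           (major == wantMajor && minor >= wantMinor);
}
[/code]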

OK… well…

Who does this one work for?
http://tyrannen.starcraft3d.net/Test.zip

it should look like this:
http://tyrannen.starcraft3d.net/realscreen.jpg

but so far my PC is the only one it has worked on

On the ATI cards I got a crash report; on another GeForce (BigB's GeForce3) I got a “green rectangle with black borders” report…
What do you see?

I just get a blue-to-green bordered rectangle.

Originally posted by davepermen:
ATI left out some functions for GL 1.2, that's the problem…
As I won't use paletted textures anyway, I'll drop them, so you'll get your 1.2 features back.

If you're talking about glColorTable et al., that's not strictly core functionality. It belongs to the imaging subset, which is optional for implementations. NVIDIA (all caps!) has it, ATI doesn't, nor do they support EXT_paletted_texture.

Maybe you should take another look at how you do things; I wouldn't recommend using imaging-subset functionality.
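A related pitfall, since extension checks come up here: searching GL_EXTENSIONS needs whole-token matching, or a name can falsely match a longer one. A minimal sketch (the helper name is mine):

[code]
#include <string.h>
#include <GL/gl.h>

/* Look for a complete, space-delimited token in GL_EXTENSIONS. */
static int hasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);

    while (ext != NULL) {
        const char *hit = strstr(ext, name);
        if (hit == NULL)
            return 0;
        if ((hit == ext || hit[-1] == ' ') &&
            (hit[len] == ' ' || hit[len] == '\0'))
            return 1;           /* exact token, not a prefix */
        ext = hit + len;        /* partial match, keep looking */
    }
    return 0;                   /* no GL context / no string */
}

/* usage: only resolve glColorTableEXT if the driver advertises it
 *   if (hasExtension("GL_EXT_paletted_texture")) { ... }
 */
[/code]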

I don't use them. I just used the NVIDIA headers for initializing OpenGL 1.2, and that's where they come from… I'll remove everything I don't need, for sure…

Grüezi Dave

Works on my PIII 600, GeForce1 DDR, driver 28.32.
Rotating cube on the left and a triangle on the right side. What kind of blurring are you computing for the triangle?

kon

Grüäziwohl, Kon(rad?)

I'm using the same blurring technique as for the cube on the left… is that enough info?

Nope, I'm using the technique described by ATI in their paper about how to get glow working…

(link can be found in the big thread about glow…)

Dave, do you render each texture at different sizes, or are you using automatic mipmap generation with EXT_texture_lod_bias to get differently blurred versions of the texture?

kon

P.S.: How many people do you know in Switzerland named Konrad? I'll give you a second chance!

Well, I just made some tests with EXT_texture_lod_bias to get differently blurred versions of a texture. Cool and fast.

kon
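For reference, such a test presumably amounts to something like this sketch (my reconstruction, not kon's code): let the driver generate the mipmap chain with SGIS_generate_mipmap, then shift sampling toward the blurrier levels with EXT_texture_lod_bias.

[code]
#include <GL/gl.h>

/* Tokens from EXT_texture_lod_bias and SGIS_generate_mipmap, in case
 * the system headers predate the extensions; values per the specs. */
#ifndef GL_TEXTURE_FILTER_CONTROL_EXT
#define GL_TEXTURE_FILTER_CONTROL_EXT 0x8500
#define GL_TEXTURE_LOD_BIAS_EXT       0x8501
#endif
#ifndef GL_GENERATE_MIPMAP_SGIS
#define GL_GENERATE_MIPMAP_SGIS       0x8191
#endif

/* Hypothetical helper: draw with a given blur amount. Assumes
 * GL_GENERATE_MIPMAP_SGIS was set to GL_TRUE *before* the base level
 * was uploaded, so the driver has already built the mipmap chain. */
static void setBlur(GLuint tex, GLfloat bias)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    /* bias 0 samples the sharp base level; larger values pull in the
     * smaller, blurrier mip levels even on a full-size quad */
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, bias);
    /* ... draw the triangle/quad here ... */
}
[/code]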