128x128 texture causes CRASH!

I'm using an NVIDIA Riva 128 card.
When I call glTexImage2D(GL_TEXTURE_2D, 0, 4, 128, 128, 0, GL_RGBA, GL_UNSIGNED_BYTE, PPBits^);
it crashes in NV3OGL.DLL with an access violation (read). It crashes in software rendering too.
But glGetIntegerv(GL_MAX_TEXTURE_SIZE, ...) reports 1024.
When I call it with a 128x64 texture everything is OK (and even fast).
Can't I determine the real maximum texture size without a crash?


I can’t believe such a bug could exist in a driver for such a card without ever being reported. IMO you are doing something wrong. Check the pointer to the texture pixels you’re passing in the arguments.


Well, this is the RIVA 128 we’re talking about now.

But no, we should easily be able to handle a 128x128 image, I would think…

  • Matt

Maybe you haven’t allocated enough memory for your image, and the driver tries to read outside the allocated space, causing an access violation.

Even though this shouldn’t cause a crash, I’d suggest that you replace that “4” with an explicit internal format of your choice (for example GL_RGB8).

In my experience, crashes like this are most often the result of accessing memory that you shouldn’t be accessing, e.g. going past the bounds of a dynamically allocated array. You should definitely take Bob’s advice and check that memory is actually allocated for the pointer you’re passing in.

Oops… meant to say crashes are the “result” not the “cause”

[This message has been edited by Deiussum (edited 01-16-2001).]

Fixed. I was passing an invalid pointer; it only worked when the texture was resized.
BTW, the Riva 128 can render a 1024x1024 texture, at 5.83 fps (when changing the texture each time).


P.S. Indirect3D is alive! I’ve written up its concepts at www.geocities.com/udodenko/ind3d.html

P.P.S. Is anybody interested in writing their own 3D file manager visualization? We’ve implemented a 3D file manager core, so if somebody needs it…

[This message has been edited by RandyU (edited 01-16-2001).]