Texture borders on GL_RGBA32F textures (ATI bug most likely)

Hello,

I got a few textures created with the following setup:

glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, width, height, 1, GL_RGBA, GL_UNSIGNED_BYTE, NULL );

(width and height are 64 for my current tests)

This crashes when it tries to dereference a null pointer somewhere in the drivers. Removing the texture border (i.e. changing the border argument from 1 to 0) fixes the crash. (But then obviously no borders :smiley: )
I'm rendering to these using FBOs, but it is not rendering to them that causes the crash; it is binding them to a texture unit and reading from them.
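A minimal sketch of the border-free workaround, assuming the usual texture setup (the variable names and the 64×64 size are illustrative, not from the original post): allocate the texture with border = 0 and use GL_CLAMP_TO_EDGE, which approximates border behaviour for many filtering cases.

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* border = 0 avoids the driver crash, at the cost of real borders */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, 64, 64, 0,
             GL_RGBA, GL_FLOAT, NULL);

/* Clamp to edge so sampling at the rim does not wrap around */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```

This only emulates borders at the sampling stage; if the shader genuinely needs the extra texel ring, the data has to be packed into a slightly larger borderless texture instead.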

Has anyone had similar problems, and knows how to work around it?

Charles

Ah, you have an fp32 texture type but you pass GL_UNSIGNED_BYTE for the type? That should be GL_FLOAT.

Originally posted by Mars_9999:
Ah, you have an fp32 texture type but you pass GL_UNSIGNED_BYTE for the type? That should be GL_FLOAT.
Uh, you are aware that that parameter only describes the supplied pixel data, right? So you can use whatever format you like (and considering he is passing NULL for the pixel data, it does not matter at all, as there is no data to convert).
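To illustrate the point above (a sketch, with illustrative sizes): the format/type pair in glTexImage2D only describes the client-side data being uploaded, so with a NULL data pointer these two calls allocate the same GL_RGBA32F_ARB storage. The pairing only starts to matter once real data is supplied.

```c
/* Both calls allocate identical fp32 storage; no data is converted. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, 64, 64, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);  /* fine: data is NULL */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, 64, 64, 0,
             GL_RGBA, GL_FLOAT, NULL);          /* equivalent allocation */

/* Once actual pixels are uploaded, type must describe them: */
static float pixels[64 * 64 * 4];               /* client data is floats */
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 64, 64,
                GL_RGBA, GL_FLOAT, pixels);
```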

Originally posted by sqrt[-1]:
[quote]Originally posted by Mars_9999:
Ah, you have an fp32 texture type but you pass GL_UNSIGNED_BYTE for the type? That should be GL_FLOAT.
Uh, you are aware that that parameter only describes the supplied pixel data, right? So you can use whatever format you like (and considering he is passing NULL for the pixel data, it does not matter at all, as there is no data to convert).
[/QUOTE]Ok, note to self. I still pair the types up, just in case down the road I change the code to upload data instead…