Texture coord mapping problem

I don't know how to get the correct 2D texture coordinate on the GPU for a texture created from a 2-D array on the CPU. For example:

// Note: declared GLubyte (not GLuint) so the data matches the
// GL_UNSIGNED_BYTE upload below - 3x3 texels, 4 components each.
GLubyte edge[3][3][4] =
{
    { {1,0,0,1}, {0,1,0,1}, {0,0,1,1} },
    { {0,0,0,1}, {0,0,1,1}, {0,0,1,1} },
    { {0,0,1,1}, {0,0,1,1}, {0,0,1,1} }
};

glGenTextures(1,&tex1);
glBindTexture(GL_TEXTURE_2D,tex1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); // GL_CLAMP is deprecated
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
//Give it the data
glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,3,3,0,GL_RGBA,GL_UNSIGNED_BYTE,edge);

Then I want to access edge[1][2].rgba. What is the texture coordinate of edge[1][2] in the texture?

How do I call tex2D(sampler, float2(?, ?)) correctly?

glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,3,3,0,GL_RGBA,GL_UNSIGNED_BYTE,edge);

Try using an array texture instead: glTexImage3D with the GL_TEXTURE_2D_ARRAY target.
Supply the usual 2D texture coordinates to your shader as you would for any regular 2D texture. The third texture coordinate you pass to the shader's texture lookup function selects the 'slice' or 'layer' of the array texture.
Unlike a 3D texture, however, an array texture takes an unnormalized value for the slice, e.g. slice = 0 is the first layer, slice = 1 is the second layer, etc.

So in your shader, with a sampler2DArray uniform, something like:

#extension GL_EXT_texture_array : enable
myData = texture2DArray(mySampler, vec3(texco.st, slice));

The actual GLSL syntax depends on the version, of course (in modern GLSL it is just texture() with a sampler2DArray).
