Rectangle Textures (Once again)

Yo! I’m currently adding rectangle textures to my engine (I had forgotten to do it earlier). It’s written in C#.

I’m using:

    private void SelectRectRenderTarget()
    {
      if(rrt == -1)
      {
        // Prefer the NV extension; fall back to the EXT variant.
        if(GLHelper.SupportsExtension("GL_NV_texture_rectangle")) rrt = Gl.GL_TEXTURE_RECTANGLE_NV;
        else if(GLHelper.SupportsExtension("GL_EXT_texture_rectangle")) rrt = Gl.GL_TEXTURE_RECTANGLE_EXT;
      }
    }

    public int GetRenderTarget()
    {
      if(rrt == -1)
      {
        SelectRectRenderTarget();
        // Power-of-two dimensions can always use the plain 2D target.
        rrt = isPOTD(width, height) ? Gl.GL_TEXTURE_2D : rrt;
        if(rrt != Gl.GL_TEXTURE_2D) this.targetTextureType = TextureTypes.tRect2D;
      }
      return rrt;
    }

    // True when both dimensions are powers of two.
    private bool isPOTD(int w, int h)
    {
      return ((w & (w - 1)) == 0) && ((h & (h - 1)) == 0);
    }
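
The selected target is then used when uploading the bitmap, roughly like this (just a sketch assuming Tao-style bindings; the `Upload` helper and `pixels` argument are hypothetical, and the RGBA format is a placeholder):

    // Sketch (hypothetical helper): upload bitmap data to the selected target.
    // Rectangle targets accept non-power-of-two width/height directly.
    private void Upload(byte[] pixels)
    {
      int target = GetRenderTarget();
      Gl.glBindTexture(target, textureID);
      Gl.glTexImage2D(target, 0, Gl.GL_RGBA, width, height, 0,
                      Gl.GL_RGBA, Gl.GL_UNSIGNED_BYTE, pixels);
    }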

This works OK. I also use:

    public void InitializeOnHardware()
    {
      if(bmp != null)
      {
        int tID = 0;
        bmp.InitializeOnHardware(ref tID);
        renderTarget = GetRenderTarget();
        if(renderTarget == Gl.GL_TEXTURE_RECTANGLE_NV || renderTarget == Gl.GL_TEXTURE_RECTANGLE_EXT)
        {
          // Rectangle textures take unnormalized (pixel) coordinates, so scale
          // the texture matrix by the bitmap size to keep normalized coordinates working.
          textureMatrix = Matrix.Identity;
          textureMatrix.Scale = new Vector(this.bmp.Width, this.bmp.Height, 1);
        }
        textureID = bmp.TextureID;
      }
    }

And finally, when binding the texture:

    public virtual void SetTexture()
    {
      UnSetTexture();
      Gl.glBindTexture(renderTarget, textureID);
      Gl.glEnable(renderTarget);
      // Save the current matrix mode, load the texture matrix, then restore it.
      int mMode;
      Gl.glGetIntegerv(Gl.GL_MATRIX_MODE, out mMode);
      Gl.glMatrixMode(Gl.GL_TEXTURE);
      Gl.glLoadMatrixf(bmp.TextureMatrix);
      Gl.glMatrixMode(mMode);
    }

The texture matrix is scaled to the pixel size, so the rectangle texture extension should work without any change to the texture coordinates (and without using a vertex program)… but it just doesn’t.
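
To be clear, this is the behavior I’m expecting (a minimal sketch; the quad and the 640×480 size are made-up examples): with the texture matrix set to a (width, height, 1) scale, every normalized coordinate should reach the rectangle sampler already multiplied into pixel units.

    // Hypothetical example: a 640x480 rectangle texture bound via SetTexture().
    // With the texture matrix scaled by (640, 480, 1), each glTexCoord2f(u, v)
    // should be transformed to (u*640, v*480) before sampling.
    Gl.glBegin(Gl.GL_QUADS);
      Gl.glTexCoord2f(0f, 0f); Gl.glVertex2f(-1f, -1f);  // -> (0,   0)
      Gl.glTexCoord2f(1f, 0f); Gl.glVertex2f( 1f, -1f);  // -> (640, 0)
      Gl.glTexCoord2f(1f, 1f); Gl.glVertex2f( 1f,  1f);  // -> (640, 480)
      Gl.glTexCoord2f(0f, 1f); Gl.glVertex2f(-1f,  1f);  // -> (0,   480)
    Gl.glEnd();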

The matrix doesn’t seem to be applied: I always get the first texel stretched over the whole surface (when using normalized texture coordinates).

In my last C++ engine I used to do the multiplication by hand in every texture coordinate call (roughly the approach sketched below), and I thought the texture matrix would be the solution here, so I wouldn’t have to touch every place in the engine where texture coordinates are used…
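
For reference, the by-hand version looks roughly like this (a hypothetical wrapper; `texWidth`/`texHeight` are assumed fields holding the texture’s pixel size):

    // Hypothetical wrapper: pre-multiply normalized coordinates by the texture
    // size instead of relying on the texture matrix.
    private void RectTexCoord2f(float u, float v)
    {
      Gl.glTexCoord2f(u * texWidth, v * texHeight);
    }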

Has anyone ever tried this? I’ve been Googling and browsing the forum and haven’t found anything.

Should I just pad to power-of-two dimensions and give the texture matrix the inverse scale (i.e., bitmapSize/textureSize)? I’d rather use the rectangle extensions. ARB_texture_rectangle isn’t supported on the card I’m using (an FX5600), and I still want backwards compatibility with at least the GeForce 4, so it’s not a solution right now.
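
That fallback would look roughly like this (a sketch; `NextPOT` and the field names are hypothetical):

    // Hypothetical fallback: pad the bitmap into the next power-of-two texture,
    // then scale the texture matrix so normalized coordinates span only the bitmap.
    private static int NextPOT(int n)
    {
      int p = 1;
      while(p < n) p <<= 1;
      return p;
    }

    // At initialization time:
    // int texWidth  = NextPOT(bmp.Width);
    // int texHeight = NextPOT(bmp.Height);
    // textureMatrix.Scale = new Vector((float)bmp.Width / texWidth,
    //                                  (float)bmp.Height / texHeight, 1);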

Any help? Thanks!


Sorry for being a n00b :rolleyes:

Never mind, it works just fine now… I was setting one texture matrix (this.textureMatrix in InitializeOnHardware) but loading a different one (bmp.TextureMatrix) into OpenGL… grr.
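
In other words (assuming the fix is simply loading the matrix that actually gets set, and that my Matrix type converts for glLoadMatrixf the same way bmp.TextureMatrix does):

      // In SetTexture(), load the matrix that InitializeOnHardware() sets:
      Gl.glLoadMatrixf(this.textureMatrix);  // was: bmp.TextureMatrix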