I’m working on converting my little OpenGL game from 1.x to 3.1, and it has mostly been going fine, except that gluOrtho2D/glOrtho seem to have no effect on the coordinates.
I’m trying to use the line “gluOrtho2D(-10.0, 10.0, -10.0, 10.0);”, and I’ve tried putting it in all kinds of places in the code just to make sure it wasn’t in the wrong spot: before setting the shaders, before and after glBindBuffer, and before glBindVertexArray/glDrawArrays. It makes no difference; the coordinate system is still -1 to 1.
Obviously I am an OpenGL newbie and I’ve only been using 3.1 for about a day, so I’m probably missing something very basic.
I’d post the code, but it’s not very compact, so I don’t know if I should. So my question is: does anyone know where the gluOrtho2D line is meant to go in a 3.1/3.2 application that’s trying to avoid deprecated functions?
How can I make sure the shaders are using the standard matrices, beyond not calling glOrtho until after I’ve loaded them? It sounds like this might be my problem. Ordinarily I would just google this stuff, but I have a hard time finding 3.x tutorials.
It doesn’t work no matter where I place it, though. Should I just scale the vertex positions by the values I want before passing them to the shader, perhaps? That doesn’t sound like a great way of handling this, but I don’t even know if it’s possible to get glOrtho working with shaders.