Nothing rasterized, GL_INVALID_OPERATION in 3.0

This one’s crunching my skull in six different ways. While this is a Windows application, I’m not certain the issue is operating-system dependent.

Some background:
You know those annoying kids (no offence intended) who come and post ‘Where I download OpenGL to make game? I have great idea that is Gears of War crossed with Pokemon’? That was me, four years ago.
Four years and enough coffee to (literally) give me a heart attack later, I found the secret hidden download link :o
In short, I’m making a cross-platform game engine with all kinds of bells and whistles (given the last six failures, I think I’ve got it figured out for this one >_>).
The reason I tell you this is that the fancy dynamic plugin parts and input handling are all ticking away fine, as is (as it happens) the Linux variant of the renderer.
Now I’ve come to writing the Windows variant of the renderer, and I’ve hit a sticking point. As I said before, I’m not certain this is a Windows issue; I just know that it hasn’t happened on Linux.
That said, the Windows renderer is a fair bit simpler.

The Problem:
Nothing shows up, not a single goddamned vertex. The colour buffer is being cleared to the correct colour, and so is the depth buffer, but nothing is being drawn to either.
I’ve checked that the view is actually looking at the model (originally I was using an OpenGL 3.1 interface, but for now a simple ‘glTranslate’ call with OpenGL 3.0 / OpenGL 2.1).
The render function and model data are identical to the Linux OpenGL 3.0 variant.
I assumed it was something simple, so I ran it through gDEBugger, and this is where it gets strange.
When I force it to use the 2.1 interface, no errors are reported and the vertex data is processed, but once more nothing is rasterized.
When I use the 3.0 interface, errors are raised on every OpenGL command past ‘glMatrixMode’, virtually all of them GL_INVALID_OPERATION.
The reason I find this VERY strange is that the OpenGL specs state this error is reported when a command is called inside a ‘glBegin/glEnd pair’.
But my program doesn’t have a single damn unholy begin/end pair, so why am I receiving this error?
Further, in 3.0 mode it appears that GL is discarding my draw commands (no vertex data is processed).
Obviously I realise this is deprecated functionality, but 3.0 still supports said functions, correct?
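
In case anyone wants to reproduce this without gDEBugger: a plain glGetError drain after each suspect call shows the same thing. A minimal sketch (helper name is mine, not from any library):

```cpp
#include <cstdio>
#include <GL/glew.h>

// Drain the GL error queue and report which call it followed.
static void checkGL(const char* where)
{
    for (GLenum err; (err = glGetError()) != GL_NO_ERROR; )
        std::fprintf(stderr, "GL error 0x%04X after %s\n", err, where);
}

// e.g. glMatrixMode(GL_MODELVIEW); checkGL("glMatrixMode");
```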

External Libraries:
glew32 (I’m lazy when prototyping >_>)

Code
I’ll throw my code up tomorrow (I’m dead tired and need to snip bits out), but has anyone got any idea why this could be happening?
The code is virtually identical to the ‘creating a first triangle in OpenGL 3.0’ tutorial from the OpenGL wiki; a rough sketch of the relevant part is below.
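
This is only a sketch of the shape of my render function, with placeholder names (‘vbo’ and ‘hdc’ are created elsewhere during setup), not my actual code: fixed-function matrix setup followed by a vertex-array draw, much like the tutorial.

```cpp
#include <windows.h>
#include <GL/glew.h>

// Sketch only: 'vbo' holds three XYZ floats, 'hdc' is the window's
// device context; both are assumed to exist from setup.
void renderFrame(GLuint vbo, HDC hdc)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_MODELVIEW);        // deprecated in 3.0, but still present
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);   // make sure the model is in view

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, nullptr);  // offset 0 into the bound VBO
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);

    SwapBuffers(hdc);
}
```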

What do you mean by 3.0 mode? Are you using the core profile?
In the core profile, those functions are removed, not merely deprecated.

From the spec:

Functions which have been removed will generate an INVALID_OPERATION error if called in the core profile or in a forward-compatible context.
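
If you are not sure what kind of context you actually got, you can query it at runtime (these queries exist from GL 3.0 onward). Something along these lines, with the context current:

```cpp
#include <cstdio>
#include <GL/glew.h>

// Report the context version and whether the forward-compatible flag is set.
void reportContext()
{
    GLint flags = 0, major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    glGetIntegerv(GL_CONTEXT_FLAGS, &flags);
    std::printf("GL %d.%d, context flags 0x%X\n", major, minor, flags);
    if (flags & GL_CONTEXT_FLAG_FORWARD_COMPATIBLE_BIT)
        std::printf("forward-compatible: removed functions raise GL_INVALID_OPERATION\n");
}
```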

If this is the issue, though, why did Linux render anything at all?

I could bloody kiss you!
Apparently, in my haste to throw together a sloppy prototype, I had forgotten about a certain addition to my context attributes: ‘WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB’.
It had completely slipped my mind that GL_INVALID_OPERATION would be spat out in that situation.
Anyhow, with that little change it’s working (albeit at about 3 FPS). I remain unsure why the 2.1 context didn’t work, but I’m sure I’ll find some clues along the way.
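
For reference, the corrected attribute list looks something like this (sketch only: ‘hdc’ is the window’s DC, and a dummy context is assumed to have been made current beforehand so GLEW could load the WGL extension). Dropping the forward-compatible bit is what brought the fixed-function calls back:

```cpp
#include <windows.h>
#include <GL/glew.h>
#include <GL/wglew.h>   // WGL_CONTEXT_* tokens and wglCreateContextAttribsARB

HGLRC makeContext(HDC hdc)
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        // WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB, // the culprit
        0  // terminator
    };
    HGLRC rc = wglCreateContextAttribsARB(hdc, 0, attribs);
    wglMakeCurrent(hdc, rc);
    return rc;
}
```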
