Confusing problem with drawing first primitive

There is no simple way to explain this, but basically my program once in a while gets a triangle drawn on screen, yet many times I load it up and it doesn't.

Right now I am trying to learn the basics of on-screen coordinates and just basic primitive drawing. I am starting with OpenGL 3.3 just so I don't learn out-of-date things from 2.0-or-lower programs.

My thinking is that the problem lies with the fact that I am not using transforms at the moment, BUT I am trying to just put things on screen with base OpenGL screen coordinates, which I think is possible? AKA make a triangle on screen that won't move, like I see in many tutorials. I don't have a link, but this is basically a slightly modified tutorial. Here is my code.

Apologies ahead of time because it is in Java and LWJGL, but it matches up with C and C++ about 99%. It will look slightly different, but it uses the exact same commands. I'm just wondering what I am doing wrong, and whether what I am trying to do is even possible without transforms, which I am still learning.

    testvertex[0] = 0.0f;
    testvertex[1] = 0.0f;
    testvertex[2] = -1.0f;
    testvertex[3] = -1.0f;
    testvertex[4] = 0.75f;
    testvertex[5] = 0.75f;
    ByteBuffer bytevertex = ByteBuffer.allocateDirect(24); // 6 floats * 4 bytes
    FloatBuffer vertexbuff = bytevertex.asFloatBuffer();
    vertexbuff.put(testvertex).flip(); // copy the vertex data in and rewind

    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    glDepthRange(0, 1);

    programid = glCreateProgram();
    vertshader = glCreateShader(GL_VERTEX_SHADER);
    fragshader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(vertshader,
        "#version 330 core\n"
      + "in vec2 inPosition;\n"
      + "void main() { gl_Position = vec4(inPosition, 0.0, 1.0); }");
    glCompileShader(vertshader);
    glShaderSource(fragshader,
        "#version 330 core\n"
      + "out vec4 fragcolor;\n"
      + "void main() { fragcolor = vec4(0.0, 1.0, 0.0, 1.0); }");
    glCompileShader(fragshader);


    glAttachShader(programid, vertshader);
    glAttachShader(programid, fragshader);
    glBindAttribLocation(programid, 0, "inPosition");
    glLinkProgram(programid); // must come after glBindAttribLocation
    glUseProgram(programid);

    idVAO[0] = glGenVertexArrays();
    idVBO[0] = glGenBuffers();
    glBindVertexArray(idVAO[0]); // attribute state below is recorded into this VAO
    glBindBuffer(GL_ARRAY_BUFFER, idVBO[0]);
    glBufferData(GL_ARRAY_BUFFER, vertexbuff, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 2, GL_FLOAT, false, 0, 0);
    glEnableVertexAttribArray(0); // attributes are disabled by default
    glBindBuffer(GL_ARRAY_BUFFER, 0);

    int frames = 0;
    // Draw code here in loop; the frames == 20000 check is a simple,
    // quick way of not drawing super fast for no reason.
    while (!Display.isCloseRequested()) {
        if (frames == 20000) {
            glDrawArrays(GL_TRIANGLES, 0, 3);
            frames = frames - 20000;
        }
    }
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDetachShader(programid, vertshader);
    glDetachShader(programid, fragshader);

For the shaders, I had issues loading them as files because my loader picks up random junk, so I have them as raw Strings. They compile fine, no errors.
They just take two in-coordinates, x and y, and auto-set z and w.
The frag shader just makes it a solid green triangle because I don't need fancy shaders for a long time =D

Basically, I start the display and set up the basic stuff. I make, compile, etc. the shaders and add them to the program. They all compile fine, and it DOES draw sometimes.

Another funny thing: it always draws fine with GL_INT even when I feed it floats, but with GL_FLOAT it only sometimes displays. Does that prove that my viewport changes randomly between closing and opening?

I am just trying to output the primitive to actual screen coordinates right now, because I am just learning the basics of drawing a primitive. Later I will get into transformations, etc.

Mini edit – the while loop is just to slow down drawing, and I don't need help making it run at X fps; I know how to do that. I already have a working platformer in Java2D, but I am trying to learn OpenGL to eventually make my own 2D engine for fun, and also to learn the basics and maybe move on to 3D once I learn a lot more. I just need help with the OpenGL side.

Thanks for any help in advance and sorry for a wall of text.

Just to add on, because it is a wall of text, here is the simple question.

Can you output primitives directly to screen coordinates? AKA the corners of the screen should lie at top left (-1, 1), top right (1, 1), bottom left (-1, -1), bottom right (1, -1).

Did those coordinates change in 3.x+? I have read many tutorials, but many are for OpenGL 1.1–2.0, so I could have gotten confused. I am starting with 3.3, and most of the old tutorials are deprecated and thus of little use to me.

The basic way I draw is these quick steps:

  1. Make window and context (not shown in my code).
  2. Make program, add shader code, and attach the shaders to the program. This works fine, no errors compiling.
  3. Make VAO to bind VBO with the array data for the vertexes. Add it to the in vec2 attribute in the vertex shader.
  4. Draw in a loop to keep having it drawn.

I think that should work. I was unsure whether I need to unbind the buffer or vertex array object after putting them together, or whether you should not unbind in such a simple program.

Thanks in advance.

I think that in order for the triangle to render, it has to have a surface area that would fill some pixels on the screen.

The vertex data in your code describes a triangle with 3 points that rest on the same line -> 0 surface area -> 0 pixels.

By default, the vertices that you provide will be used directly in the viewport, unless you explicitly transform them by one matrix or another (I think this is true regardless of GL version, but I’m not really a guru, so I can’t say for sure).
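If it helps, the mapping between window pixels and those normalized coordinates is just a linear rescale. Here's a tiny pure-Java sketch of the convention (the helper and its names are made up, and it assumes no transforms in the vertex shader, so whatever you write to gl_Position with w = 1.0 lands directly in that -1..1 space):

```java
// Hypothetical helper: map window pixel coordinates to normalized
// device coordinates, assuming the viewport covers the whole window.
public final class NdcMapper {
    // x runs 0..width-1 left to right  ->  -1..1
    public static float pixelToNdcX(int x, int width) {
        return 2.0f * x / (width - 1) - 1.0f;
    }
    // y runs 0..height-1 top to bottom ->  1..-1 (NDC y points up)
    public static float pixelToNdcY(int y, int height) {
        return 1.0f - 2.0f * y / (height - 1);
    }
}
```

So in an 800x600 window, pixel (0, 0) at the top left maps to (-1, 1), and the bottom-right pixel maps to (1, -1), exactly the corners you listed.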

Anyway, if you’re looking for a modern OpenGL tutorial, try this one:

It’s using C++, but that shouldn’t really matter if you’re interested in the principles (which is, refreshingly, what this tutorial is focused on).

I’m sure you’ll be able to get some insights there even in your java environment (I personally use C).

Thanks for the input. From what I understand, my code should just use raw screen coordinates, and unless I am wrong my vertices should not all be on one line, unless you mean in Z. I have tried it with all Z ranges from -1 to 1. I have read many tutorials, even the new ones for 3.0+, and the basic starter ones use -1 to 1 for screen coordinates before applying transforms. Plus, this is basically the 3.1 tutorial without the second triangle using element drawing, because I don't quite understand the full difference.

Another weird thing: if I use floats but pass GL_INT (which was no accident), it always draws, but really small, most likely because it is the wrong data type.

And it SOMETIMES will draw, but it's random, plus it will change. It seems like my viewport is ever-changing, but I don't see why it would be. I even set glViewport(0, 0, 800, 600), which is my current window size. My library already sets it to the window dimensions, with the screen at 0, 0.

If you port my code into C, does it work fine? Maybe it's a driver issue? (Mine is up to date, but that doesn't always mean anything.)

I would add a transform, but I am trying to make sure I understand raw screen coordinates so I have a baseline to work from. I've read tutorials on viewports in general, and I get the difference between ortho and perspective displays. I definitely don't understand the math enough right now to make my own system.
It would help greatly to get some basic primitives going, then work my way up :slight_smile:

Thanks in advance

-edit- Oh, just for more info: I have set it up to use 3 or even all 4 coords and have my vertex shader do nothing but set the value to what I used: vec3 to vec4 if I used xyz, and vec4 to vec4 for xyzw. Nothing has changed it to work consistently. I don't want this to turn into a "here is my code, fix it"; I just need some direction on the general thing that is likely wrong. I'll play with random coordinates some more.

This is your triangle in 2D space (according to the code snippet you posted above):
[(0.0, 0.0), (-1.0, -1.0), (0.75, 0.75)]

Here’s what the viewport coordinate system looks like:

             (0, 1)
(-1, 0) -------+------- (1, 0)

Your triangle vertices in that coordinate system:

A = (0.0,   0.0)
B = (-1.0, -1.0)
C = (0.75, 0.75)

             (0, 1)
               |   C
(-1, 0) -------A------- (1, 0)
        B      |

That’s a line -> no surface area for your triangle -> no pixels.

I think this is what’s confusing you: You’re expecting to see something rendered for a situation that would naturally not render anything, for the previously mentioned reason (all three triangle vertices are on the same line, and therefore have no surface area to actually render, so nothing shows up).

-quick update- I even just tried the coordinates from the tutorial on the technical wiki here, and still no luck. I went through again and again checking my code against theirs, and to me it looks fine. I don't use element drawing, but that shouldn't change anything; it just means I will have 1 triangle vs. their two. Running out of ideas what it could be =\ (The f suffix just makes the compiler treat the value as a float; otherwise it turns it into a double, if that is confusing.)

Also, just in case drawing too much was the problem, I set it up to only run at 30 fps instead of the insane rate it was at before. Still no luck. I took the timing code straight from my Java2D platformer, so I know that code works.

    testvertex[0] = -0.3f;
    testvertex[1] = 0.5f;
    testvertex[2] = -1.0f;
    testvertex[3] = -0.8f;
    testvertex[4] = -0.5f;
    testvertex[5] = -1.0f;
    testvertex[6] = 0.2f;
    testvertex[7] = -0.5f;
    testvertex[8] = -1.0f;

Oops, silly mistake, but it still won't draw even with it not being in a straight line. I am actually following the "OpenGL 3.1 The First Triangle (C++/Win)" tutorial on the site. Even with their coordinates, nothing; just now I used the vertices below.
-edit- Also, I have followed their vertex points to the letter with no success. I am wondering if it's an issue with the FloatBuffer, but I don't know. LWJGL doesn't like normal arrays, and I have had issues with the VBO and VAO links using an IntBuffer before, so I used the overloaded glGenVertexArrays and the respective buffer command to get proper ids to link with.

Right now I am trying to find out whether it's an issue with my GL commands or with my understanding and use of the Java part. There is no array-based glBindBuffer command that I can find, so I MUST use a Buffer.

    testvertex[0] = 1.0f;
    testvertex[1] = 0.5f;
    testvertex[2] = 1.0f;
    testvertex[3] = 0.2f;
    testvertex[4] = 0.8f;
    testvertex[5] = 0.0f;
    testvertex[6] = 0.2f;
    testvertex[7] = -0.7f;
    testvertex[8] = -0.8f;

That should make it go halfway up and all the way right for the first vertex, then up a little more and left to 0.2 of the screen, then down to 0.7 of the bottom half of the screen while staying in line with the 2nd vertex. Not a perfect triangle, but OpenGL shouldn't mind as long as it's in triangle form, right?

That would not be on a straight line. I redid my vertex shader to take in a vec3 again so I could modify the z, to make sure it wasn't a straight line and had some depth. I can set the clear color to anything, and that works fine. My FloatBuffer works fine from my perspective, in that it only stores the values I give it; they are all in the proper order and all the exact values I gave.

#version 330 core
in vec3 inPosition;
void main() {
    gl_Position = vec4(inPosition, 1.0); // takes in xyz, sets w to 1.0
}

#version 330 core
out vec4 fragcolor;
void main() {
    fragcolor = vec4(0.0, 1.0, 0.0, 0.0); // outputs solid green
}

just to make the shader code easier to read.

    idVAO[0] = glGenVertexArrays();
    idVBO[0] = glGenBuffers();
    glBindVertexArray(idVAO[0]); // attribute state below is recorded into this VAO
    glBindBuffer(GL_ARRAY_BUFFER, idVBO[0]);
    glBufferData(GL_ARRAY_BUFFER, vertexbuff, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
    glEnableVertexAttribArray(0); // attributes are disabled by default

I've looked it over many times and it still looks right. Confused about what the problem could be. I have my loop, but I took out the frames code so it just draws, then updates instantly. Thanks for the help so far. It follows the tutorial almost word for word, other than my own variable names and slight differences between Java and C.

Sorry if it’s something simple and I am just blind =\

Sorry for all the edits as well, but here is another. I just want to give as much information on what I have tried, to reduce confusion and give a more specific idea of what is happening.
Just for more information, I have these two commands to make sure it gets drawn, period, no matter whether it's front- or back-facing:

    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL); // fill, and do both front and back
    glDisable(GL_CULL_FACE); // turn culling off so it is drawn regardless of facing

Well, wait. You said that your program “once in a while gets a triangle drawn on screen” (in your initial post) - so, what was the code that actually worked?

I would start from there.

Also, I don’t think there is any need to complicate things with an additional coordinate; 2D is fine for now - you can use these vertices to make sure that you’re drawing a visible triangle:

[(0.0, 0.0), (-1.0, -1.0), (1.0, -1.0)]

I’m allergic to java, so I can’t really help you there. But, needless to say, if there is something wrong with the mechanism used to swap buffers, then you definitely wouldn’t be able to see anything, even if all the OpenGL stuff was correct (in java terms).

You might want to think about a switch to C++ in that case - I assume that the C++ tutorial you’re referencing compiles and works properly.

I’m on Linux, so I can’t test it myself, but you should be able to.

Well, thanks a ton for the help so far. I will try to search more on their forum for Byte/Float/Double/Int etc. buffer troubles and see if anything turns up.

Once in a while it will draw something, but nothing like the tutorial shows, and it works if I tell OpenGL that I am using GL_INT even though I give it floats. The triangle fits on screen but is VERY small, and it will change each time I quit and reload the program.

Once in a while it will draw with GL_FLOAT, feeding it floats, BUT it will not look like the tutorial, and most of the triangle falls off screen. I have a few screenshots, but I didn't remember or write down the vertexes I used; that was back when I did ints and GL_INT. Not sure what this proves. Also, it will change each time it is loaded, so I don't know if my viewport keeps changing or if it was just a fluke.

Since it seems to be fine on the OpenGL side, I'll read up on buffers and see if there is anything wrong I am doing. My context seems to work fine: changing the screen clear color works, and it crashes if I try to use OpenGL 4.0+ because my chipset doesn't support it. I have it set at 3.3, the highest mine supports. I even found out how to test for the supported version + extensions.

I don't know C++ too well, but I know enough to get an idea of what that tutorial was doing, and to me my Java matches up just fine with what they did in C. LWJGL is supposed to match the C version almost 100%, with some minor differences because obviously they operate slightly differently. It runs off native builds for Mac, Windows, and Linux and calls the native functions for you, so it should port fine to any system.

Hopefully I can get back in a day or two with what was causing me trouble.

To give an example of the IntBuffer error when making a VAO and VBO: it would give me an id in the 100,000 to 300,000 range, while the way I do it now I get id 1 for my VAO and 1 for my VBO. I do not know whether VAOs and VBOs have separate ranges of ids or whether they are shared. Maybe that is why I have issues as well. Thanks for the help so far, and thanks in advance to anyone else who posts.

Sorry I couldn’t be more helpful.

Anyway, be sure that your viewport dimensions match your screen size. So, if you have a 640x480 window, you should call glViewport(0, 0, 640, 480) - all my advice assumes that this is in fact what you’re doing.

Also, regarding “buffer” problems - I don’t know how java treats floats internally, but, using the GLfloat (or, I guess, “GL_FLOAT” in java) would seem like the right thing to do.

And if using GL_FLOAT actually draws the triangle, even if it's a little off screen, then I would say that your viewport dimensions are set to something higher than your screen size, which would basically explain why that happens.

Not a problem, I know it's hard to assist with a different library and language. I came here just for the OpenGL side, assuming my Java side was handled properly, but it is looking like I am doing something wrong Java-side.

On the viewport: it should default to the window size by itself, but I have
glViewport(0, 0, 800, 600); added in my code to make sure it gets set right. I use an 800x600 window.

Personally, I think my few lucky draws are a fluke, because it only sometimes works. Like, I'll start the program up and maybe 50% of the time or less it will draw something distorted.

On the float issue I can't say for sure. Java only uses signed values, no unsigned, and its floats are 4 bytes. I'm off to bed for now, but I'll try to see which Java-side issues are likely my problem. Thanks again for the assistance.

I think I have an idea of what is up. I was thinking at work, and when I got back I pulled out my 3.3 specification PDF, which lists the floating-point data values that OpenGL likes.

OpenGL uses 16-bit floating-point values. If I'm right, that is 2 bytes (8 bits = 1 byte).

Java uses 4-byte floating-point numbers, aka 32-bit (4 * 8 bits). That would explain why I can get integers to semi-work but not floats. And even integers in Java are 4 bytes, which may explain my trouble.

I'll have to try using ByteBuffers raw, without Int or Float buffers, and see what I can get working. I'll get back, hopefully tonight, with a working primitive :slight_smile: Thanks for the advice so far.

I think all my OpenGL code is solid, considering that if I tell it GL_INT it draws a triangle, just not right. That tells me it is working, but OpenGL doesn't like the numbers I feed it, or the way I feed them to it. Wish me the best.

-edit- Another reason I feel this is true: when I use commands to get values back out of OpenGL, I sometimes get funny numbers, especially for things larger than 0.0 or 1.0, like GL_MAX_DIMENTIONS to get the max allowable height and width. I get my viewport's 0, 0 fine, but the last two numbers, no matter whether I use ints, floats, doubles, etc., come out to really strange things like 3.050243509 to the negative 34th power.
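For what it's worth, the Java-side sizes can be checked directly; this little sketch (helper names made up) just surfaces what Java guarantees: floats and ints are both 4 bytes, IEEE 754 binary32 for floats, which is the same layout as GLfloat:

```java
// Hypothetical helper: confirm Java's primitive sizes and float layout.
public final class FloatSizes {
    // Java's float is IEEE 754 binary32: 4 bytes, same as GLfloat.
    public static int floatBytes() { return Float.BYTES; }
    // Java's int is also 4 bytes, same as GLint.
    public static int intBytes() { return Integer.BYTES; }
    // Raw bit pattern of 1.0f: sign 0, biased exponent 127, mantissa 0.
    public static int bitsOfOne() { return Float.floatToIntBits(1.0f); }
}
```

So if the driver were reading those 4 bytes in the wrong order, the exponent bits would land in strange places, which would fit the wildly tiny values like 3.05e-34 you're seeing.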


I don't know how to explain it, but this single piece of code magically makes it work.

    ByteBuffer bytevertex = ByteBuffer.allocateDirect(36);
    // allocate enough bytes for 9 floats (3 vertices * 3 components), 4 bytes per float
    bytevertex.order(ByteOrder.nativeOrder()); // the change
    FloatBuffer vertexbuff = bytevertex.asFloatBuffer();

For whatever reason, making the bytes go in native order makes OpenGL happy :slight_smile: I found this answer on the forum, where others had trouble getting their old ByteBuffers working with the newer LWJGL. Thanks for the help in finding out that it was my Java code, and not OpenGL, that was causing the trouble.

I can make the triangle any shape and size now just fine, as long as I keep in mind not to try drawing a line :slight_smile: I keep making that silly mistake.

Oh, and I am using all floats. From reading, I was wrong earlier, since OpenGL converts 32-bit floats to whatever it needs internally. So normal Java floats work fine.
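In case anyone else hits this, here is the whole buffer-building step as one self-contained sketch (the helper name is made up). The key point: the JVM gives every new ByteBuffer big-endian order by default, while the GPU driver reads memory in the platform's native order, so you have to set the order before creating the FloatBuffer view:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Hypothetical helper: build a direct FloatBuffer the way LWJGL expects.
public final class NativeOrderDemo {
    // New ByteBuffers default to big-endian, per the java.nio spec.
    public static boolean defaultIsBigEndian() {
        return ByteBuffer.allocateDirect(4).order() == ByteOrder.BIG_ENDIAN;
    }

    public static FloatBuffer directFloats(float... values) {
        ByteBuffer bytes = ByteBuffer.allocateDirect(values.length * Float.BYTES);
        bytes.order(ByteOrder.nativeOrder()); // the crucial line
        FloatBuffer floats = bytes.asFloatBuffer();
        floats.put(values).flip(); // fill and rewind, ready for glBufferData
        return floats;
    }
}
```

Without the order() call, the float views are byte-swapped relative to what the driver reads on little-endian machines, which is why the values looked like garbage to OpenGL while looking fine from the Java side.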

Heh, see, this is exactly why I’m allergic to java.

Anyway, I’m glad you managed to solve this (persistence is half the battle).