Can't render a triangle

The output window is just black, with no triangle. When I tried rendering a single point instead, it showed up as a tiny dot. Why doesn't the triangle appear?

This is the code:

#include <stdio.h>
#include <GL/glew.h>
#include <GL/freeglut.h>
#include "math_3d.h"

GLuint VBO;

static void CreateVertexBuffer()
{
    Vector2f Vertices[3];
    Vertices[0] = Vector2f(-1.0f, -1.0f);
    Vertices[1] = Vector2f(1.0f, -1.0f);
    Vertices[2] = Vector2f(0.0f, 1.0f);

    glGenBuffers(1, &VBO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertices), Vertices, GL_STATIC_DRAW);
}

static void RenderSceneCB()
{
    glClear(GL_COLOR_BUFFER_BIT);

    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

    glDrawArrays(GL_TRIANGLES, 0, 3);

    glDisableVertexAttribArray(0);
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(1024, 768);
    glutCreateWindow("Tutorial 3");

    glutDisplayFunc(RenderSceneCB);

    GLenum res = glewInit();
    if (res != GLEW_OK) {
        fprintf(stderr, "Error: '%s'\n", glewGetErrorString(res)); // check for errors in GLEW
        return 1;
    }

    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

    CreateVertexBuffer();

    glutMainLoop();

    return 0;
}

Until this is cleared up I don't feel I can move on. Is something wrong with my code, or am I missing something on my Linux system?

Where did you get it?
What have you done to try and debug the problem?
Also, if you're going to post a short stand-alone test program, please remove dependencies on any headers you're not posting, so folks can actually compile it.

That said…

I modified your program so it would compile and run here on my Linux box, and it has several problems. The first: you are providing an array of 3 vec2s in the position VBO, but you're telling the GPU to fetch 3 vec3s out of the buffer object (the `3` size argument in your glVertexAttribPointer call). In other words, you're telling the GPU to read 3 floats past the end of the buffer object.

Change that 3 to a 2, and you’ll get your white triangle…

…on NVidia's GL drivers.
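For reference, the corrected call (the only change is the second argument, the number of components per vertex):

```cpp
// Each vertex is a Vector2f: 2 floats, not 3.
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, 0);
```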

That brings us to the second problem. This code uses generic vertex attributes, but relies on the fixed-function pipeline for shading (the fixed-function pipeline uses its own, implicit shaders instead of ones you provide). In other words, the code presumes vertex attribute aliasing between the legacy vertex attributes (old fixed-function pipeline) and the generic vertex attributes. NVidia's drivers do this (see this link), but the GL spec does not require it, so not all GPU vendors' GL drivers do.

If after making the prior change you still don’t get your triangle, then you have a choice:

  1. Switch from using generic vertex attributes (e.g. glVertexAttribPointer/glEnableVertexAttribArray) to legacy fixed-function vertex attributes (e.g. glVertexPointer/glEnableClientState), OR
  2. Write your own shaders that utilize the generic vertex attributes.
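Option 1 in RenderSceneCB would look roughly like this (a sketch; it requires a compatibility-profile context):

```cpp
// Legacy fixed-function vertex arrays instead of generic attributes.
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, 0);   // 2 floats per vertex, tightly packed
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);
```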

There is an option 3: You could do #1 and write your own shaders that use the legacy fixed-function vertex attributes, but that’s more work for you and has questionable benefit.
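A minimal shader pair for option 2, reading generic attribute 0, could be sketched like this (my sketch; the shader compile/link boilerplate and error checking are omitted):

```cpp
// GLSL 3.30 shaders consuming generic vertex attribute 0 as a vec2.
static const char* vsSource =
    "#version 330\n"
    "layout(location = 0) in vec2 Position;\n"
    "void main() { gl_Position = vec4(Position, 0.0, 1.0); }\n";

static const char* fsSource =
    "#version 330\n"
    "out vec4 FragColor;\n"
    "void main() { FragColor = vec4(1.0); }\n"; // solid white

// Compile each with glCreateShader/glShaderSource/glCompileShader,
// link with glCreateProgram/glAttachShader/glLinkProgram, then call
// glUseProgram(program) before glDrawArrays.
```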