# What draw type do I use for Delaunay triangulation?

I recently decided to try to implement a Fibonacci sphere in my code. I first wrote a function to generate the points of a Fibonacci sphere, then implemented the Bowyer-Watson algorithm to build a Delaunay triangulation of those points. However, when I try to draw the result using any of the main glDrawArrays primitive types, it doesn’t render as expected: there are usually either too many or too few triangles, and triangles appear where they shouldn’t. So I’m lost as to which draw mode to use to display this data, or whether I have to do something else entirely to get it to display.

Sounds like your first task should be to determine whether the error is due to:

• Incorrect triangulation
• Incorrect drawing of this triangulation with OpenGL.

Show some code including how you’re drawing the triangle mesh, creating the buffer objects, and uploading vertex and index data to the buffers. You’re using indexed triangles, right?

Also, simplify your code to something very basic. Draw a cube with triangles. That may very well point out what you’re doing wrong.

I doubt it’s caused by incorrect triangulation, as I print out the coordinates of the triangulated locations and they look right, but with OpenGL that probably is the case. For starters, I plot the points of the sphere using the Fibonacci sphere algorithm:

``````
void create_fibonacci_sphere(int radius, int num_of_points, float points[][3], size_t arr_size)
{
    // step 1: using the golden ratio and angle increments, plot out the points of the sphere
    double golden_ratio = (1 + sqrt(5)) / 2;
    double angle_increment = 2 * M_PI * golden_ratio;

    for (int i = 0; i < num_of_points; ++i)
    {
        double t = (double) i / num_of_points;
        double angle1 = acos(1 - 2 * t);     // polar angle
        double angle2 = angle_increment * i; // azimuth

        points[i][0] = radius * sin(angle1) * cos(angle2);
        points[i][1] = radius * sin(angle1) * sin(angle2);
        points[i][2] = radius * cos(angle1); // z coordinate, so the points lie on the sphere
    }
}
``````

I then use my triangulation code to call some functions that take the point locations and produce triangles; that part has already been debugged. Then I upload the vertices normally:

``````
int radius = 50;
int num_of_points = 100;
float vertices[num_of_points][3];
create_fibonacci_sphere(radius, num_of_points, vertices, sizeof(vertices));

unsigned int vbo, vao;
glGenVertexArrays(1, &vao);
glGenBuffers(1, &vbo);

glBindVertexArray(vao);

// upload vertex data to gpu
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW); // sizeof(vertices) is already the byte count

// position attribute
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);

// normal attribute -- note: the buffer holds only positions, so with a
// 3-float stride this attribute aliases the next point's position
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)(3 * sizeof(float)));
glEnableVertexAttribArray(1);
``````

As well as my draw call

``````
glDrawArrays(GL_TRIANGLES, 0, num_of_points);
``````

Which leads to this (screenshot taken in wireframe mode):

Pretty sure OpenGL is taking it upon itself to triangulate the points for me, which is not what I want, because I already did that with the Delaunay triangulation.

OpenGL doesn’t do anything by itself; you tell it what to do. That is, you must tell OpenGL how to build triangles out of these points. Shoving a random assortment of points into a buffer and saying “draw that” isn’t going to cut it.
