Double Buffering

We are running CentOS Linux with an NVIDIA video card and OpenGL. My application creates a 2D strip chart. I use Qt for the user input and control and created a widget of type QGLWidget for the strip chart itself.
We do not have GLUT or BOOST. The command glxinfo | grep version returns, in part: OpenGL version string: 2.1 Mesa 17.2.3. I have the red book, the OpenGL Programming Guide. It states that double buffering is good but that OpenGL does not directly provide it. I am a government contractor and pretty much all sites with a topic of gaming are blocked.

My personal limitations are effectively no knowledge of graphics or matrix math. But this is 2D only and will never need transformations or anything of the sort. Given all that:

Can anyone tell me how to initiate and execute double buffering? A link to a tutorial will be great.

Qt uses double-buffering automatically. Contexts are created with double-buffering enabled by default, and the swapBuffers() method is called after paintGL() returns.

More details: whether a context is double-buffered is determined by the QGLFormat passed to the QGLContext constructor. The default format, used by the no-argument QGLContext constructor (which in turn is used by the QGLWidget constructor which doesn’t take a format or context argument) has double-buffering enabled. The state can be queried via QGLFormat::doubleBuffer() and changed (prior to context creation) by QGLFormat::setDoubleBuffer(). Some platforms (e.g. EGL) only support double-buffered contexts.

I cannot copy paste from my work computer to here but am trying to show all the relevant details. Please try to accommodate typos.
My paintGL() looks fundamentally like this:

void C_GL_Strip_Chart::paintGL()
{  // showing the essence only
    save_current_time();  // gets and saves the system current time in nanoseconds
    if( ! paused && new_point_added )
    {
        glMatrixMode( GL_MODELVIEW );
        glClear( GL_COLOR_BUFFER_BIT );  // white background
        glColor3f( 0.0, 0.0, 0.0 );
        glBegin( GL_LINE_STRIP );
        for( i = oldest; i <= newest; i++ )
            glVertex2dv( gl_vector[i].p );  // circular buffer, but all that code is omitted
        glEnd();
        glMatrixMode( GL_PROJECTION );
    }
}

That does not include the performance monitoring that shows it is called at 60 Hz. paintGL() is not explicitly called anywhere in my app. The display looks as though it is being painted with two alternating screens: the expected points and a blank screen. A very fast flicker.

gl_vector is declared in this manner:

struct gl_point {  GLdouble p[2];   };  // p[0] = x which is time, p[1] = the Y value plotted.
std::vector< gl_point > gl_vector;

The strip chart does show all the expected points and does scroll as desired. But it flickers badly.

In my experience, flicker is usually caused by Qt painting the window background prior to OpenGL rendering.

You may want to experiment with the WA_NativeWindow, WA_NoSystemBackground and WA_OpaquePaintEvent attributes (QWidget::setAttribute) and the QWidget::setAutoFillBackground() method. Apparently, style sheets can also be a factor.

If the QGLWidget is a child of some other widget, the issue may be caused by painting the background of the parent widget(s).

I will look up those references and see what effect they might have.

Yes, the QGLWidget is a child of the main Qt widget. It is displayed with labels and line edits around it to provide the user with control. I don’t explicitly call any functions to repaint the containing Qt widget, but some of the Qt fields are updated constantly.

As a self-check, I understand your reply as saying the problem might be, or might include, for lack of the right term, excessive repainting. Unless you correct me, I will presume that and see what I can discover.
Thanks for your reply.

[QUOTE=GClements;398178]In my experience, flicker is usually caused by Qt painting the window background prior to OpenGL rendering.[/QUOTE]

Hello GClements, the more I look at this, the more your answer looks exactly right. It looks like alternating displays between a blank screen and the plotted data.
I did look up those WA attributes. They were introduced in Qt 4 and we are stuck with Qt 3.

It may be interesting to note that when I start the app, nothing happens. When I start a timer that adds points at the timer rate and calls update(), the chart starts updating. That timer will not go any faster than 20 Hz. But when it runs and tickles the system with regular update() calls, paintGL() gets called at 60 Hz. I don't understand that relationship.

Edit: I just thought of another way to state this. It looks a bit like it might be double buffering but my code is only writing to one of the buffers rather than swapping them. The uncertainty in my statement is intentional. If so, I don’t know how to access the other buffer.

All that said: this Qt 3 situation is a major problem. I have worn out my welcome mat commenting on that need. A large part of the problem is that most of the Qt people have moved on, and the update is expected to take a labor-year. I am new at both Qt and OpenGL.

If you have more to write that will be appreciated. If not, then you have been a help already and I thank you.

With Qt4, the swapBuffers() method is called automatically when paintGL() returns. I don’t have any experience with Qt3, so I can’t say if the same is true there. If the context is double-buffered and your rendering appears on the screen (at all), then swapBuffers() is getting called somewhere.

Interesting discovery. I tried this to explore a bit, adding the following to the end of the QGLWidget constructor:

bool temp = doubleBuffer();
std::cout << "\ndouble buffer is " << temp << std::flush;  // it is true; OK, change it to see what happens

setAutoBufferSwap( false );

temp = doubleBuffer();
std::cout << "\ndouble set to " << temp << std::flush;  // still true, not expected

Do knowledgeable users see this as an expected result?

Yes. Whether a context is double-buffered is fixed when the context is created and cannot be changed afterwards. setAutoBufferSwap() controls whether swapBuffers() is called automatically after paintGL(). It is enabled by default, and can be queried with autoBufferSwap().

I have apparently solved the flicker problem. update() was removed from everywhere except paintGL(). That one does need to be tickled on a regular basis. I suspect that a call to update() within paintGL() causes it to be scheduled to run again as soon as possible. When that update() is triggered by a Qt interval timer, paintGL() runs at 60 Hz. When that timer is stopped, the display stops. Perfect for this strip chart application. I do not completely understand how this works. Maybe as I continue working this, the light will brighten.

I created an update_glortho() which makes a system call to get the current clock time and calls glOrtho(), the OpenGL function that sets the clipping volume. That controls the vertical scaling and moves the chart according to time. Using that means the app does not do any explicit scaling. It appears that Qt, or maybe OpenGL, is handling the double buffering behind the scenes.

Now I know a bit more about graphics and quite a bit more about how versatile Qt and OpenGL are.
Thank you for your time and patience.