DBE and GLX, same machine, different processes, affecting each other

I've recently been developing an application on Linux, specifically an image viewer, and since all it has to do is blit a 2D image to the screen, I wrote it entirely in Xlib using the DBE and MIT-SHM extensions. It works very well.

The reason for this app is a rendering cluster: each machine (including the display machine itself) volume-renders a piece of the total image and sends it on to the machine doing the display. The rendering code uses OpenGL and GLX.

If I run the viewer on a machine with no rendering code running, it works very well. The problem appears when an instance of the rendering code is up on the same machine: the viewer is no longer double buffered, and each frame appears to be copied piecewise into the frame buffer instead of swapped. I've tried several ideas to correct this; my first thought was that the rendering code was using too much card memory and evicting the viewer's buffers from the card. I didn't write the rendering code, but its developer insists he has been reducing its memory footprint, and the problem persists.

I'd very much like to hear any suggestions or insight into this problem. Thanks in advance.
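For reference, here is a stripped-down sketch of the kind of DBE path the viewer uses (error handling trimmed, and a plain XFillRectangle standing in for the real XShmPutImage blit):

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xdbe.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int major, minor;
    if (!XdbeQueryExtension(dpy, &major, &minor)) {
        fprintf(stderr, "DBE not available\n");
        return 1;
    }

    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                     0, 0, 640, 480, 0,
                                     BlackPixel(dpy, scr),
                                     BlackPixel(dpy, scr));

    /* Allocate a DBE back buffer for the window; all drawing targets it. */
    XdbeBackBuffer back = XdbeAllocateBackBufferName(dpy, win, XdbeBackground);
    GC gc = XCreateGC(dpy, back, 0, NULL);

    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == KeyPress) break;
        if (ev.type == Expose) {
            /* In the real viewer this is an XShmPutImage of the
             * received frame into the back buffer. */
            XFillRectangle(dpy, back, gc, 0, 0, 640, 480);

            /* Swap back to front in one step; this is the part that
             * stops behaving like a swap when the GLX renderer runs. */
            XdbeSwapInfo swap = { win, XdbeBackground };
            XdbeSwapBuffers(dpy, &swap, 1);
        }
    }

    XFreeGC(dpy, gc);
    XdbeDeallocateBackBufferName(dpy, back);
    XCloseDisplay(dpy);
    return 0;
}
```

(Compile with `-lX11 -lXext`.) The real app does the same allocate/draw/XdbeSwapBuffers cycle, just with the shared-memory image in place of the fill.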