LINUX - not so fast, not so reliable

Has anyone done anything with OpenGL under Linux? All the examples use Qt library forms. That sux!!! And SDL sux too! I have my reasons to say so, believe me. There are GL.H, GLU.H and GLEXT.H in the /include directories. How can I get a single C++ file working, without any widgets, that looks like this?

#include <gl.h>
#include <glu.h>

void Render(void)
{
    glFlush();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glFinish();
}

int main(void)
{
    //…some initialisation…

    while (!exiting)
    {
        Render();
    }

    return 0;
}

The simplest way to write OpenGL programs under Linux is to use GLUT.

Many of the examples use it because of this, although it’s less useful for real applications.

Note that the OpenGL header files are held in a subdirectory, so it should be:

#include <GL/gl.h>

to include the gl.h header file.
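
For example, a complete single-file GLUT program (just a sketch, assuming GLUT or freeglut is installed so that GL/glut.h is available and you link with -lglut -lGLU -lGL) can look like this:

#include <GL/glut.h>

void Render(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* draw a coloured triangle */
    glBegin(GL_TRIANGLES);
     glColor3f(1.f, 0.f, 0.f); glVertex2f(-0.5f, -0.5f);
     glColor3f(0.f, 1.f, 0.f); glVertex2f( 0.5f, -0.5f);
     glColor3f(0.f, 0.f, 1.f); glVertex2f( 0.0f,  0.5f);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(400, 400);
    glutCreateWindow("GLUT triangle");

    glutDisplayFunc(Render);
    glutMainLoop();   /* GLUT runs the event loop and calls Render() for us */

    return 0;
}

Compile with something like gcc triangle.c -o triangle -lglut -lGLU -lGL (the exact library names can vary between distributions).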

dave j

I wouldn’t recommend using Qt for OpenGL rendering. It is only useful when you have a GUI with embedded 3D rendering. I don’t know what problems you have with SDL, but you don’t have to use it if you don’t like it.

What do you use on Windows? If you are used to writing applications directly against the Windows system calls and wgl, you won’t have much trouble learning X11 and glX; it is not that different. It is clearly the best choice if you don’t care about platform independence, because you are not limited by the capabilities of an abstraction library like SDL or GLUT.

GLUT is very simple to use, and it works the same on Windows and Linux, but you will reach its limits very soon.

About speed and reliability: if you have a speed problem, it is probably because you don’t have the correct graphics driver installed. And I have never had any problems with reliability on Linux, but if you have a real problem aside from “it sux”, feel free to ask.
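
(As a quick check, running glxinfo, which usually ships with the Mesa utilities or your X distribution, and looking for the line “direct rendering: Yes” tells you whether hardware acceleration is actually in use; a “No” there usually means a software or indirect path, which is slow.)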

While I also recommend GLUT, I must say that programming on Linux is a real pain… I write this in the hope that someone will understand that, unless you have been introduced to them, many things on Linux are not really “productive”.
For simple apps, Linux is quite enough, but when it comes to large-scale projects things get really slow (in development-time terms) unless someone on the team already knows the stuff AND you have the people and effort to spend.
As a side note, after a lot of wasted effort I’m seriously considering dropping Linux (the current Linux codebase does not work and is a year behind the Win32 version in functionality). As a result, I’m also considering dropping GL and switching to another API…
I am writing this in the hope that one day some open-source developer will realize: “hey, instead of putting iptables or crypto in the kernel, maybe someone would be happy to see us add useful multimedia functionality”.

Just because you fail to develop under some OSes does not make those OSes bad de facto. I think you are simply attacking the people involved in those OSes, whether they are kernel programmers, Mesa developers or simply user-programmers.
Personally, my development speed increased under this OS, not under the more popular one. The facts are there: under some OSes you need to understand what programming really is, whereas under other OSes that is not an obligation.

Productivity is not really the first aim of free software, even though there are many, many programs. But productivity is NOT a problem here; just look at recent games (UT, Doom). Did those companies have to delay their games because they wanted compatibility with another OS?

And your explanation is not an explanation. Do you mean that because we use free software we know nothing, and only a small minority are guru enough to know it without knowing how?
Of course you have to know what you are doing. No OS is good enough to let you build things you don’t understand.

Now, saying iptables or encryption is useless makes me wonder who you are. Do you know computers? Surely not. If you don’t want those features, just don’t build them into your kernel. Then again, maybe you have never got to the stage of compiling a kernel…

Saying the kernel should be a multimedia core but not a security core is pure nonsense. Or maybe the kernel should implement the whole X/GLX layer too?

Again, this is an OpenGL forum, not a place for airing your laments, nor for fighting over one OS or another.
If you want to talk about programming, it’s not even the best place, but I’m willing to discuss that with you, if you dare.

You’re lucky this is a beginner forum.

To everyone else: excuse me for this rant. I have never done that here and I don’t like it, but it was frustrating to read that.

Hope this helps some people.

> Linux is not user-friendly.
It is user-friendly. It is not ignorant-friendly and idiot-friendly.

Sorry, couldn’t resist :stuck_out_tongue: :smiley:

It’s the truth though.

People tend to blame everything and everyone for their own shortcomings; that’s life, I guess.

Back to the question… try this:

#include <X11/Xlib.h>
#include <GL/glx.h>
#include <GL/glu.h>

Display			*dpy 	= XOpenDisplay(NULL);
Window			root 	= DefaultRootWindow(dpy);
/* boolean attributes such as GLX_RGBA and GLX_DOUBLEBUFFER take no value in glXChooseVisual */
GLint			att[] 	= {GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None};
XVisualInfo		*vi   	= glXChooseVisual(dpy, 0, att);
GLXContext		glc   	= glXCreateContext(dpy, vi, NULL, True);	/* True = direct rendering */
Visual			*vis  	= vi->visual;					/* use the visual GLX chose, not DefaultVisual */
Colormap		cmap  	= XCreateColormap(dpy, root, vis, AllocNone);
unsigned int		w 	= XDisplayWidth(dpy, 0) / 2;
unsigned int		h 	= XDisplayHeight(dpy, 0) / 2;
int			dep   	= vi->depth;					/* depth must match the chosen visual */
int			cmask 	= CWColormap | CWBorderPixel | CWEventMask;
int			emask 	= ExposureMask;
XEvent			xev;
XSetWindowAttributes	swa;
XWindowAttributes	gwa;
Window			win;

int main(int argc, char *argv[]){

 /* create the window with the colormap, visual and depth chosen above */
 swa.colormap     	= cmap;
 swa.border_pixel 	= 0;
 swa.event_mask   	= emask;
 win              	= XCreateWindow(dpy, root, 0, 0, 400, 400, 0, dep, InputOutput, vis, cmask, &swa);
 XStoreName(dpy, win, "SIMPLE QUAD");
 XMapWindow(dpy, win);

 /* bind the GL context to the window */
 glXMakeCurrent(dpy, win, glc);
 glClearColor(0.00, 0.00, 0.60, 1.00);

 /* orthographic projection, camera at z = 10 looking at the origin */
 glMatrixMode(GL_PROJECTION);
 glLoadIdentity();
 glOrtho(-1., 1., -1., 1., 1., 100.);

 glMatrixMode(GL_MODELVIEW);
 glLoadIdentity();
 gluLookAt(0., 0., 10., 0., 0., 0., 0., 1., 0.);

 /* event loop: redraw whenever the window gets an Expose event */
 while(1) {
	XNextEvent(dpy, &xev);

	if(xev.type == Expose) {
		XGetWindowAttributes(dpy, win, &gwa);
		w = gwa.width;
		h = gwa.height;

		glViewport(0, 0, w, h);
		glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

		glBegin(GL_QUADS);
		 glColor3f(1., 0., 0.); glVertex3f(-.75, -.75, 0.);
		 glColor3f(0., 1., 0.); glVertex3f( .75, -.75, 0.);
		 glColor3f(0., 0., 1.); glVertex3f( .75,  .75, 0.);
		 glColor3f(1., 1., 0.); glVertex3f(-.75,  .75, 0.);
		glEnd();

		glXSwapBuffers(dpy, win);
	}
 }
}

compile with

g++ -I/usr/X11R6/include/ -L/usr/X11R6/lib -o Quad Quad.cc -lX11 -lGL -lGLU

(g++ rather than gcc, since the file is C++ and initialises its globals at run time)
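
If you want a loop that renders continuously, like the while (!exiting) loop in the original post, rather than only redrawing on Expose events, one common variation (a sketch reusing the globals above) is to drain pending X events with XPending and then draw every iteration:

 /* render continuously: handle any pending X events, then draw a frame */
 while(1) {
	while(XPending(dpy) > 0) {
		XNextEvent(dpy, &xev);
		if(xev.type == Expose) {
			XGetWindowAttributes(dpy, win, &gwa);
			glViewport(0, 0, gwa.width, gwa.height);
		}
	}

	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	/* …draw here… */
	glXSwapBuffers(dpy, win);
 }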