Help for OpenGL 4

My club is currently developing an open-source flight simulator in OpenGL 2 and C++.

Given the graphical limitations of this version, the development team has decided to progressively upgrade the project so that it can run on the OpenGL 4 API.

Recently, the developers told me that, given the lack of manpower, the upgrade to OpenGL 4 is likely to take a while.
My question is the following: is anyone on this forum willing to help us, in any form, to convert this project to OpenGL 4?
Thank you.

Why can’t you complete the project using the compatibility profile?
That way you don’t have to recode existing functions and you can still use GL 4 functionality. Also, why GL 4 specifically? Is tessellation a factor for you? It does rather limit your audience.
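For what it’s worth, here is a minimal sketch of what requesting a compatibility-profile context looks like on Windows, assuming WGL_ARB_create_context_profile is available. The constants normally come from `<GL/wglext.h>`; they are restated here just for illustration, and the `hDC`/`hShareContext` handles in the usage comment are hypothetical:

```cpp
// Values from the WGL_ARB_create_context(_profile) extension specs;
// normally these come from <GL/wglext.h>.
#define WGL_CONTEXT_MAJOR_VERSION_ARB             0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB             0x2092
#define WGL_CONTEXT_PROFILE_MASK_ARB              0x9126
#define WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB 0x0002

// Attribute list asking for a 4.0 compatibility-profile context: existing
// fixed-function code keeps working, and GL4 features are available too.
static const int kContextAttribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
    WGL_CONTEXT_MINOR_VERSION_ARB, 0,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
    0  // attribute list is zero-terminated
};

// Usage (hypothetical hDC / hShareContext):
//   HGLRC rc = wglCreateContextAttribsARB(hDC, hShareContext, kContextAttribs);
```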

OpenGL 4.0 seems like a fairly insane choice. You limit yourself to top end cards only. Even my NVIDIA 9800 card only supports OpenGL 3.0 …

You limit yourself to top end cards only.

You limit yourself to recent cards only. There are plenty of cheap graphics cards that support 4.0. You can get motherboard-embedded GPUs nowadays that support GL 4.0. AMD is already shipping AMD Fusion CPUs that have 4.0-capable GPUs.

That being said, there’s not a huge difference between 3.x and 4.x, for most normal uses. The biggest obvious feature is tessellation, which might be of value in a flight simulator. The next biggest feature is shader_image_load_store, which has a variety of uses, but is really not for someone coming from 2.1 land.

Also, BionicBytes is correct in that you’re limiting quite a bit of your audience. Using the Steam Hardware Survey as an example, the number of users with 4.0 capable hardware is much smaller than those with 3.3 capable hardware.

Intel has 50% of the market share, and they don’t have any cards that support OpenGL 3 yet? We get a lot of customers with Intel cards, and it’s a headache for me because their drivers are so goddamn buggy.

Now that Intel has released their Sandy Bridge Xeons with the new* P3000 professional graphics, perhaps they will put more effort into their OpenGL implementation.

Have you attempted to contact Intel about issues with their driver? We’ve had some success getting a few things fixed by raising issues with them.

    • not new hardware, just a different driver.

How does one complain about Intel drivers? The problems I had with Intel were so bad that I had to code some pretty obscene workarounds just to get usable output at all from their drivers.

Final output looked something like this on Intel

Whereas it should look something like

Yeah, that’s not too far off some of the issues I’ve seen when trying to do something moderately complex on Intel graphics. Complete freak out :) I usually have to kindly deny the Intel OpenGL driver entry to those codepaths and disable the associated feature(s).

Generally, reporting a driver issue involves simplifying the offending code until the problem area can be isolated. I usually strip everything out but the absolute basics and then gradually add them back in. It’s pretty tedious.

Intel does have a forum for graphics software development:

You could try taking it up with them there. Sadly, drivers don’t seem to get better on their own.

We are not interested in the 4.0 version specifically, as such. What we want is to use shaders rather than the fixed-function pipeline, because we will then be able to provide better graphics, like per-pixel lighting for instance.
So maybe we can use shaders from an older version, but we think that OpenGL 4.0 will become rather common in one or two years from now.

In that case, look into OpenGL 3.3. It’s supported on all DirectX 10 capable hardware, which has been around for about 6 years now (ATI HD 2000 series and up, NVIDIA GeForce 8000 series and up). Once OpenGL 4 (DX11) hardware becomes more commonplace, you’ll be able to add GL4 features to your GL3.3 engine without much trouble.

Besides tessellation, OpenGL 4.x gives you more flexibility when writing shaders (subroutines, sampler & uniform block array access). However, it’s something that is easily worked around; I wouldn’t restrict your audience just for those.

Some people have reported that they couldn’t update their laptop drivers, so they are stuck at GL 3.2; you may want to scale back from 3.3 to 3.2.

Personally, I don’t have any problems installing the NVIDIA driver on my Dell laptop.

Requiring a specific core version is no longer important. New features have been made available to older GL versions in form of strict subset ARB extensions:

  • Vertex buffer object: Core in 1.5, extension GL_ARB_vertex_buffer_object
  • Shaders: Core in 2.0, extension GL_ARB_shading_language_100
  • Framebuffer object: Core in 3.0, strict subset extension GL_ARB_framebuffer_object
  • Vertex array object: Core in 3.0, strict subset extension GL_ARB_vertex_array_object, in 3.1 and after forward compatible and core profile contexts require use of vertex array object
  • Draw elements base vertex: Core in 3.2, strict subset extension GL_ARB_draw_elements_base_vertex
  • Core geometry shaders: Core in 3.2
  • Extension geometry shaders: extension GL_ARB_geometry_shader4 (different from core geometry shaders)
  • Program binaries: Core in 4.1, strict subset? extension GL_ARB_get_program_binary

If you look at the list above (I hope I got it mostly correct!) - then you may see that GL 2 + a selection of strict subset ARB extensions gives you a fairly nice feature set. The code using these will work equally well on GL 2 with extensions and GL 4.
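To pick up those ARB extensions at runtime on a GL 2 context, you need an exact-token search of the extension string. A plain `strstr()` can misfire when one extension name is a prefix of another, so something like this helper (a sketch; the function name is mine) is the usual approach:

```cpp
#include <cstring>

// Exact-token search in a space-separated extension string, as returned by
// glGetString(GL_EXTENSIONS) on GL 2.x. Matches whole tokens only, so
// "GL_ARB_framebuffer_object" does not match "GL_ARB_framebuffer_object_foo".
bool hasExtension(const char* extensions, const char* name) {
    if (!extensions || !name || !*name) return false;
    const std::size_t len = std::strlen(name);
    for (const char* p = extensions; (p = std::strstr(p, name)) != 0; p += len) {
        const bool startOk = (p == extensions) || (p[-1] == ' ');
        const bool endOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startOk && endOk) return true;
    }
    return false;
}

// Usage (after context creation):
//   const char* ext = (const char*)glGetString(GL_EXTENSIONS);
//   bool haveVAO = hasExtension(ext, "GL_ARB_vertex_array_object");
```

On 3.x+ contexts you would query `GL_NUM_EXTENSIONS` and `glGetStringi` instead, since core profiles no longer return the single concatenated string.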

Requiring a specific core version is no longer important.

This much is true, but not for the reasons you state. OpenGL context creation is allowed to give you any context that is 100% compatible with the one you request. If you ask for a 2.1 context (which you implicitly do if you use the old context creation rules), you are allowed to get 4.1 compatibility, because it is 100% backwards compatible with 2.1. Similarly, if you ask for 3.3 core, you can get 4.1 core since it is 100% backwards compatible with 3.3 core.
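To illustrate on the GLX side (a sketch; constants normally come from `<GL/glxext.h>`, and `dpy`/`config` in the usage comment are hypothetical): you ask for 3.3 core, and the driver may legally hand back anything 100% backwards compatible with that, such as 4.1 core.

```cpp
// Values from the GLX_ARB_create_context(_profile) extension specs;
// normally supplied by <GL/glxext.h>.
#define GLX_CONTEXT_MAJOR_VERSION_ARB    0x2091
#define GLX_CONTEXT_MINOR_VERSION_ARB    0x2092
#define GLX_CONTEXT_PROFILE_MASK_ARB     0x9126
#define GLX_CONTEXT_CORE_PROFILE_BIT_ARB 0x0001

// Request a 3.3 core context. The version returned may be higher; query
// glGetIntegerv(GL_MAJOR_VERSION / GL_MINOR_VERSION, ...) afterwards if
// the exact version matters to you.
static const int kCoreAttribs[] = {
    GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
    GLX_CONTEXT_MINOR_VERSION_ARB, 3,
    GLX_CONTEXT_PROFILE_MASK_ARB,  GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
    0  // attribute list is zero-terminated
};

// Usage (hypothetical dpy / config):
//   GLXContext ctx =
//       glXCreateContextAttribsARB(dpy, config, 0, True, kCoreAttribs);
```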

The code using these will work equally well on GL 2 with extensions and GL 4.

Well, except for geometry shaders, which are completely different. And you won’t have access to any of the integral texture formats. Or integral vertex formats (glVertexAttribIPointer). Or many other features.