Display lists in 3.1

IMHO drawing text has become easier and more efficient. You just ask GDI (or the Linux equivalent) to render a bitmap of selected glyphs or, preferably, whole lines of text, put that in texture atlases and draw away. There’s enough RAM and VRAM to waste nowadays.
3D extruded text is a rarely used feature, but existing libs make it easy enough, AFAIK.
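A minimal sketch of the GL side of that approach, assuming the glyph bitmap (`pixels`, `w`, `h`) comes from the platform rasterizer (GDI, FreeType) and the atlas position (`x`, `y`) from whatever packer you use:

```c
/* Create an empty 1024x1024 alpha-only atlas texture. */
GLuint atlas;
glGenTextures(1, &atlas);
glBindTexture(GL_TEXTURE_2D, atlas);
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, 1024, 1024, 0,
             GL_ALPHA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* Per glyph (or per pre-rendered line): copy the rasterized bitmap
   into its slot in the atlas. */
glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, w, h,
                GL_ALPHA, GL_UNSIGNED_BYTE, pixels);
```

After that, drawing a string is just a textured quad per glyph (or one per line) sampling the right atlas region.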

FWIW I agree with you.

<u>DISPLAY LISTS</u>
We still use these extensively, mainly to simulate the proposed OGL3 features that were supposed to replace them but which still have not turned up (such as lpDrawArrays and program objects).
I really liked the “Enhanced display lists” described at SIGGRAPH Asia 2008 and hope NVIDIA continues developing these ideas.
We could especially use an automatic ‘background’ display list that resumes when SwapBuffers is executed and is suspended when the vsync occurs.
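For reference, a minimal sketch of the classic recording pattern this is built on; `tex` and `vertex_count` stand in for state and data set up elsewhere:

```c
/* Record a batch of state changes and draw calls once... */
GLuint list = glGenLists(1);
glNewList(list, GL_COMPILE);
glBindTexture(GL_TEXTURE_2D, tex);
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
glEndList();

/* ...then replay it each frame with a single call, letting the
   driver pre-validate and optimise the whole sequence. */
glCallList(list);
```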

<u>QUADS</u>
We will be using tessellation of low-resolution meshes (made of quad patches) via AMD_vertex_shader_tessellator, with an OpenCL or geometry-shader tessellator on NVIDIA (until they release a real tessellator).
Quads are necessary for subdivision surfaces such as Catmull-Clark.
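As an aside, the usual fallback when quads go away is splitting each quad into two triangles in the index buffer; a minimal sketch is below. Note it throws away the quad connectivity that Catmull-Clark subdivision needs, which is exactly why real quad patches matter here.

```c
#include <stddef.h>

/* Expand quad indices (a,b,c,d) into two triangles (a,b,c) and
   (a,c,d). 'tris' must have room for 6 indices per quad. */
void quads_to_tris(const unsigned *quads, size_t quad_count, unsigned *tris)
{
    for (size_t i = 0; i < quad_count; ++i) {
        const unsigned *q = quads + 4 * i;
        unsigned *t = tris + 6 * i;
        t[0] = q[0]; t[1] = q[1]; t[2] = q[2];
        t[3] = q[0]; t[4] = q[2]; t[5] = q[3];
    }
}
```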

<u>DEPRECATION</u>
There is an awful lot on the deprecation list, though, that nobody should be using in any modern program.
Yes, all the old features need to stay in the driver so that programs written for 3.0 and earlier will still run, and I can understand that companies with limited resources may want to add a new feature to a very old engine, which also has to be supported.
The only things I would object to are if this compatibility support:

  1. Uses a sizable slab of extra memory (on a 32-bit machine).
  2. Slows my program down, e.g. by forcing unnecessary hash-table lookups of buffer names because I can’t tell it that I am always going to use GenBuffers (see the sketch after this list).
  3. Makes a new extension more complex because it has to work around a conflict with one of the old functions.
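A minimal sketch of the difference behind point 2; the hex name is just an arbitrary application-chosen value, and the legacy case assumes a compatibility context:

```c
/* Well behaved: the driver hands out the names, so it can keep them
   dense and find objects with a flat array lookup. */
GLuint buf;
glGenBuffers(1, &buf);
glBindBuffer(GL_ARRAY_BUFFER, buf);

/* Application-generated names (legacy): binding an unused name
   creates the object, so names can be arbitrarily sparse and the
   driver must be ready with a hash table on every lookup. */
glBindBuffer(GL_ARRAY_BUFFER, 0xDEADBEEF);
```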

If it is indeed true that GL_ARB_compatibility has <u>no</u> performance impact on our program then I don’t care that it’s there.

However, when I specifically ask for a 3.1 context I am specifically asking for a context optimised for modern features, but a driver with GL_ARB_compatibility simply ignores what I have asked for.
A 3.0 context could have extensions for every new feature in 3.1, so if 3.1 has GL_ARB_compatibility then the 3.0 and 3.1 contexts would be identical.
Future OpenGL additions can simply be added to 3.0 (and 2.1) as extensions, leaving 3.1 for those that DON’T want compatibility.

NVIDIA adds special optimisations to its drivers for specific game engines by detecting the name of the executable, but this does not apply to smaller developers.
All I really want is a way to tell the driver that my program is well behaved (always uses GenBuffers, uses properly structured display lists, etc.) and will not use any obsolete functions.
Hence GL_ARB_compatibility should only be supplied if the program <u>asks for it</u>.
OpenGL 3 already contains mechanisms suited to exactly this purpose: separate profiles for ‘Compatibility’ and ‘Performance’ would allow a driver to optimise itself for each case (without <u>requiring</u> it to do so).
OR add a WGL_CONTEXT_BACKWARD_COMPATIBLE_BIT_ARB to the WGL_CONTEXT_FLAGS.
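A minimal sketch of where such a flag would live, assuming WGL_ARB_create_context is available; the BACKWARD_COMPATIBLE bit is the hypothetical flag proposed above, not a real WGL token:

```c
/* Standard 3.1 context creation via wglCreateContextAttribsARB. */
const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 1,
    /* Hypothetical, as proposed above; not a real token:
       WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_BACKWARD_COMPATIBLE_BIT_ARB, */
    0
};
HGLRC ctx = wglCreateContextAttribsARB(hdc, NULL, attribs);
```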

Deprecated features that are still needed, such as quads and an enhanced version of display lists, could be added back to the performance profile with specific extensions.
But the really bad stuff, like application-generated names or display lists that are allowed to contain a glBegin with no glEnd, just has to go.

Ha Ha. It’s certainly better than his earlier one anyway! :wink:
Not that I agree with all he says in that either though.

I read these posts just today.
My opinion:
OpenGL has failed in some respects while Direct3D has done things right. Direct3D aimed at games. D3D used COM, which helped to clearly demarcate version differences. D3D is a HAL. Games + Direct3D + Microsoft are ruling the 3D scene.
Tools have gravitated to D3D (3D Studio, Maya, and who knows what) due to games.

The fact that NVIDIA has already written good code should not affect GL’s design and future.
It’s better to have a lightweight new API and layer GL on top of it. What the heck was wrong with Longs Peak???

Like I’ve asked before:
What’s stopping NVIDIA/ATI etc. releasing OpenGL ES drivers for the various OSes?
I’ve got no interest in OpenGL 3.0 but would perhaps switch over to OpenGL ES.

Interesting, and I also kind of agree with V-man’s comment too. But…
It seems he is suggesting basically GL3.x with no deprecation support, with anything a specific vendor wants to add layered over the top, right?

The problem I see with that is the problem we have now: certain vendors offer certain things, and there is the same crunch when you need to support more than one subset of HW.

Also, out of interest which version of ES would you want?

I have really enjoyed the relative confinement of working with GLES1.1 these last few months (and was finally forced to grasp the finer points of the fixed-function glTexEnv combiners, multi-texturing, etc.), but have started to miss shaders a lot…
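For the curious, a minimal sketch of the kind of fixed-function combiner setup meant here, using standard GLES 1.1 texture-environment state; `base_tex` and `light_tex` are placeholders for textures created elsewhere:

```c
/* Unit 0: modulate the base texture by the incoming vertex color. */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, base_tex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

/* Unit 1: add a second texture (e.g. a lightmap) on top of the
   result of unit 0, via the COMBINE environment. */
glActiveTexture(GL_TEXTURE1);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, light_tex);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_ADD);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_PREVIOUS);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_TEXTURE);
```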
Looking forward I am really excited about working with GLES2.0.

So in that sense GLES2 as a base API is an interesting proposition.

Just a random thought: EGL will have to emerge somehow. Microsoft will probably not make that DLL, however easy it is. Vendors generally don’t go providing a naked DLL with their drivers, though cg.dll is a step in that direction. But so far they’ve been providing the direct .so files.
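For context, a minimal sketch of the bring-up such a DLL would have to expose; these are standard EGL 1.3 entry points, and `nativeWindow` is platform-specific:

```c
/* Connect to the display and initialise EGL. */
EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
eglInitialize(dpy, NULL, NULL);

/* Pick a config that can back an ES 2.0 context. */
const EGLint cfg_attribs[] = {
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_NONE
};
EGLConfig cfg;
EGLint n;
eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n);

/* Create an ES 2.0 context and a window surface, and bind them. */
const EGLint ctx_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);
EGLSurface surf = eglCreateWindowSurface(dpy, cfg, nativeWindow, NULL);
eglMakeCurrent(dpy, surf, surf, ctx);
```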
Another thought: extra licensing fees; Khronos viewing GL and GLES as separate.
</naive thoughts>

OK, you don’t want to follow the latest fads, whatever, but

I believe OpenGL ES devices are about to take off in a huge way (there are already ~40 million iPhone/iPod OGL ES devices out there, plus Android etc., so it’s not exactly unused).
Apple has just announced the new OGL ES 2.0 model (which thus supports shaders, unlike the previous iPhones).
And the thing is, the cheapest model will be priced at $99!! (Hell, even me, who’s no fan of Apple, will get one at that price.) Apple has also ordered 100 million 8 GB chips, so they believe they’re gonna sell a few.

Vendors generally don’t go providing a naked DLL with their drivers,
Explain “naked”? All hardware I have seems to come with various DLLs from the manufacturers.

I simply haven’t seen something like wsock32.dll being overwritten/supplied by hw vendors.

That’s $99 when you subscribe to a $100/mo plan for 2 years. And around $600 to get it from eBay after the price hiking ends (in about 2 years; that’s how late it happened with the first iPhone).

“Pandora” has ES2.0 support.

GLES on iPhone is fun; on Symbian it might be, once you overcome the OS fluff and have your game playable on a tonne of screen sizes.
GLES on Android and other Java phones is a PITA (on PC we bitch when a call takes 2k+ CPU cycles; imagine what it is when it takes 650k cycles). Java unfortunately stays strong on mobile devices, even though CPUs have had MMUs for a decade already.

GLES2.0 is only on the new iPhone anyway AFAIK.
ducks swat from NDA gods

And as has been said, they are really starting from $99+++++++++++.

However, if you want to learn GLES and/or GLES2.0 you can download the DevKit from Apple for free as long as you register. The only restriction is putting SW onto your device. The Simulator has always supported GLES and I don’t see any reason why it won’t have a full GLES2.0 implementation.

Great place to start though.

But then again… http://Beagleboard.org/

But why? You can learn GLES2.0 with the AMD OpenGL ES 2.0 Emulator:
http://developer.amd.com/gpu/opengl/Pages/default.aspx

  • Support for core OpenGL ES 2.0 functionality
  • Support for many important OpenGL ES 2.0 extensions
  • Support for EGL 1.3

The OpenGL ES Emulator Control Panel enables control of many emulator options including:

  • Modifying the screen size
  • Modifying the available GPU memory
  • Performance throttling
  • Sending debugging output to files
  • Viewing debugging output on the screen

Just try it!

Great. Another option.

I wasn’t saying people had to use the Apple Simulator. Just that it’s an option.

At the end of the day you really want to see what happens on real hardware.
So getting a device of some sort that matches your target audience is going to be the best option.

A configurable simulator sounds as dangerous as the one major pitfall of Apple’s Simulator: the fact that performance in your emulation environment and on your target platform is never the same.

Identical or even comparable performance between simulator and device has never been available anywhere IMHO, unless you emulate a device with 1000+ times less powerful hardware, i.e. a NES on a Core 2 Duo. And it has not been among the major hurdles. Just having one screen size, the same input methods, and newer models never being slower are extremely huge, unseen-before bonuses in mobile gamedev, in my experience.

Even I can see the value of fancy graphics on a phone, if say due to circumstances beyond your control you’re forced to burn a few hours in a broom closet, with only the items on your person.

Assuming you were in a broom closet “due to circumstances beyond your control” and had “to burn a few hours”, couldn’t we assume that “the items on your person” might not include a phone? Or anything at all?

Then again, I don’t know what YOU usually do in a broom closet…

I’d rather not say what I usually do in a broom closet, but I can assure you it’s perfectly legal.

And so, as the soft breeze gently nudges the sun over the horizon, another potentially interesting thread dissolves into bollocks.

Just a bit. :slight_smile:

I totally agree with Mark Kilgard regarding the OpenGL compatibility issue. There is really no reason to deprecate old and working OpenGL features for application programmers. I can understand that OpenGL driver engineers would like this because it would make their jobs easier. I can also understand why some game programmers would like this because it would be less confusing.

But there is so much CAD, scientific, and engineering software on the market today that uses the non-core features. This software actually makes real differences in improving human lives, as opposed to just having fun in computer games. It is simply a ridiculous proposition to require civil engineers, mechanical engineers, aerospace engineers, etc. to stop using modern graphics cards in order to keep using their existing software. It is equally ridiculous to require them to purchase new graphics hardware in order to use new software written with the new OpenGL core features only.