scrap opengl32.dll

The glRenderbufferStorage call you make whenever the user calls glWUpdateWindowbufferSize reallocates the buffer that effectively works as the window’s back-buffer. It will have to be called after every window resize.

This one? This is hardly “always”. glWUpdateWindowbufferSize was intended to be used only on window resize, which is a rare event. What do you think happens right now when an OpenGL window gets resized? Exactly the same: the driver reallocates its buffer, because there is nothing better that can be done.

But anyway, bear in mind that what i do in this code is wrap the current driver interface. A driver that implements it directly can do it otherwise.
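A minimal sketch of what such a layered wrapper’s resize entry point might do (my own illustration, not the actual code from glw.zip; the renderbuffer handles are assumed globals):

```c
/* Hypothetical layered implementation: the window's back-buffer is backed
 * by ordinary renderbuffers, which get reallocated on resize.
 * g_window_color_rb / g_window_depth_rb are assumed globals, not real API. */
void glWUpdateWindowbufferSize(int width, int height)
{
    glBindRenderbuffer(GL_RENDERBUFFER, g_window_color_rb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, width, height);

    glBindRenderbuffer(GL_RENDERBUFFER, g_window_depth_rb);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
}
```

A driver implementing the interface natively could of course skip the renderbuffer detour entirely.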

Also bear in mind that this interface is only an example of what i think would be good. Of course, if the ARB would like to standardize a new interface, i very much doubt it will be exactly this one. So don’t focus too much on the little details but try to see the overall idea.

Anyway, i don’t need to prove anything to YOU exactly. I don’t for a moment doubt that you will dislike anything i propose. Your behavior as a forum troll is well known to me; i don’t expect anything constructive from you, and i don’t care about your opinion at all.

here is a link to the sample i uploaded as it got buried under pointless posts
http://www.opengl.org/discussion_boards/…ilename=glw.zip

glWUpdateWindowbufferSize was intended to be used only on window resize, which is a rare event.

That depends on what application you’re writing, yes? I’m sure Blender3D windows are resized more frequently than, for example, full-screen videogames. And non-full-screen videogames often, as a matter of courtesy, allow themselves to be resized.

Not all applications work the same way.

What do you think happens right now when an OpenGL window gets resized? Exactly the same: the driver reallocates its buffer, because there is nothing better that can be done.

That’s up to the driver. It could allocate a bigger buffer internally than the current resolution. Indeed, it could allocate a buffer the size of the desktop, just in case. It could reserve the desktop resolution’s worth of space for the framebuffer; if the application starts to need to use that empty space (allocating lots of textures), then it can dip into it, but only as a last resort. It can play games with these things.

Yes, in the worst case, every size change reallocates the buffer and fragments memory. But drivers have a lot of leeway in allocating things.

To be fair, if this window-buffer stuff were implemented by the driver, the driver could still do all of these things. The user, however, can’t. Under this proposal, the user is the one providing the back-buffer, and that is where the problem comes from: having to manage the back buffer directly makes these optimizations impossible.

I don’t for a moment doubt that you will dislike anything i propose.

Let me tell you something about me. I do not generally notice who people are. I don’t really read names; I answer posts. I talk about what is said, not who said it. The only reasons I can even tell you apart from other posters on the forum are:

1: You consistently refuse to capitalize the word “I”.

2: You consistently try to make things personal when I’m talking about the merits of your idea. You’ve gotten it into your head that I’m out to get you.

I’m out to get ideas I think are bad or non-productive. If I seem to be “trolling” you, it is only because, from my perspective, you are consistently posting ideas that I find to be bad or non-productive. I’m not out to get you; I’m out to get bad ideas, and if you post a lot of them, we will talk frequently.

If you posted an idea I found to be good and there was actually a chance that the ARB would implement it, I wouldn’t argue against it. For example, this thread. The idea has actual merit, fills a need, and is something that the ARB might actually implement (unlike replacing OpenGL32.DLL, which the ARB cannot and will not do). You will notice that my contribution to that thread consisted primarily of asking a question about the frequency of version updates.

So no: I don’t dislike anything you propose. Just the bad or non-implementable stuff.

… Just the bad or non-implementable stuff.

ROFL. I sometimes wonder what criteria he uses to determine if something is not implementable or is bad… since it does not look like he implements GL for a living… [disclosure: I do not implement GL for a living either, I just use it for a living]

At any rate, just an FYI for people: the window-resize bits are really small fries. Indeed, some libraries that do the cross-platform work for you of getting a window and making a context do not trigger the actual buffer resize until after the user is done resizing… other apps may not do the resize at each resize event, only after several events or after several frames… an operation done once in a blue moon is not exactly anything to really freak out about.

Bits that I really do not like about opengl32.dll are things like the craziness of making a GL context just to get a function pointer so you can make a GL context a different way (glX is guilty here too!). Other things I do not like: you can’t make windowless and/or framebufferless contexts.
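To make the “context to get a context” complaint concrete, here is roughly the dance wgl forces on you (a sketch assuming the WGL_ARB_create_context extension; window and pixel-format setup and all error handling omitted):

```c
/* You need a throwaway legacy context just to query the entry point that
 * creates a modern context with attributes. */
HGLRC dummy = wglCreateContext(hdc);   /* hdc: an already prepared DC */
wglMakeCurrent(hdc, dummy);

PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
    (PFNWGLCREATECONTEXTATTRIBSARBPROC)
        wglGetProcAddress("wglCreateContextAttribsARB");

const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    0
};
HGLRC real = wglCreateContextAttribsARB(hdc, NULL, attribs);

wglMakeCurrent(NULL, NULL);            /* throw the dummy away */
wglDeleteContext(dummy);
wglMakeCurrent(hdc, real);
```

glX has the same shape with glXGetProcAddress and glXCreateContextAttribsARB.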

There is this great quote from an NVIDIA extension written long ago:

That beautiful bit of wisdom is from as long as a decade ago (the first version of the spec was 2001).

So yeah, that just amazingly sucks: there is no spec for wgl, it is a Microsoft API, etc., etc.

For me all I really want is cross-process support for GL data and something like a pixmap and a native API to say “present this pixmap” for window contents.

oh well… EGL did not turn out well in my eyes at all…

You talk as if you are someone whose counsel is highly prized and sought after by the ARB. Don’t be silly, dude. The fact that you flood this forum (half of the total posts are from you) and keep pestering people does not mean anyone actually gives a damn about your opinion - unlike me, most people just ignore you. In fact, from your posts, your lack of intuition about these matters is quite apparent to me. It surely is so for anyone who would bother to read your posts carefully enough. That makes it impossible that the ARB would have anything to do with you.
I am sorry for getting personal, but still i’m a human being, not a cold machine, and so i’m susceptible to anger and get annoyed sometimes. And you are doing your best to annoy people.

I wish the moderators could somehow limit your trolling activity in this forum. Unfortunately, you are doing it too skillfully. You are very careful not to break the formal rules, yet you manage to do the job brilliantly.

Now that i think of it, it really looks like you are doing some paid job, given your unmatched activity in the forum for years and years. Whenever anyone comes up with some good idea, you are always very fast to trash it at all costs, using absurd arguments when you can’t find any good ones. Like someone is paying you to spoil the opengl community.

I wish the moderators could somehow limit your trolling activity in this forum. Unfortunately, you are doing it too skillfully. You are very careful not to break the formal rules, yet you manage to do the job brilliantly.

…or just a “filter Alfhonse” button.

Whenever anyone comes up with some good idea, you are always very fast to trash it at all costs, using absurd arguments when you can’t find any good ones. Like someone is paying you to spoil the opengl community.

OUCH. Once in a blue moon, his insistence on tearing something down finds an issue… which is, like, 99.99999% of the time correctable. There have been times when he pissed me off too. But oh well, such is life. Though in hindsight, it can be funny to read… like this one: kRogue loves GL_NV_shader_buffer_load.

What does concern me is that it is possible that good ideas and comments get ignored because of his excessive postings…

This is exactly my concern too, nothing else.
And that is what i came to suspect to be his ultimate goal.

Very often his “arguments” are so absurd that i can’t believe he really thinks he is right. Then i wonder: what might his motivation be?

I agree on the need to trash opengl32.dll along with the antique wgl and glX. Time to have one and only one API for window and context management. For now, let the GL SDK provide only the unified context/windowing API, layered on the IHVs’ implementations.

OpenGL rocks and is a five-star API; working with it is amazing compared to other headache APIs that bring projects to a dead end. Let’s give it a gift for Christmas and make it more wonderful!

There was mention of a desktop EGL library. However I haven’t seen it mentioned since.

Regards
elFarto

I prefer something like EGL because it exists on all the embedded systems. Why do they even call it egl? Just convert those functions to gl functions and put them in the same specification.

I’m not sure why glW is being proposed in this thread. Please make OpenGL cross platform and get rid of wgl/glX/agl and all that craziness.

That would be best, but first we will have to get rid of all the different OSes plus their different window systems and leave only one :slight_smile:
For the OS i would choose unix, but as for the windowing system i don’t know. I don’t like X Windows very much, especially its network transparency, which has no meaningful use and only adds overhead.

I’m not sure why glW is being proposed in this thread. Please make OpenGL cross platform and get rid of wgl/glX/agl and all that craziness.

I am not at all sure that a “cross-platform” thingamajig for glX/wgl/egl that makes users happy is really possible. Some will jump up and down and say that EGL is just that, except that I am not happy with it :whistle: Issues such as “window handle”, “display handle”, and “pixmap” crop up (in EGL these are left as typedefs in EGL.h, selected via #ifdef on the platform), and EGL is strongly married to essentially double- vs. single-buffered surfaces, which are not always the right thing to use. Additional ugly issues pop up all the freaking time in integration with compositing window managers, etc. Lastly, a generic API is likely to suffer from these bits:

  • Shoots for the lowest common denominator
  • Convoluted bits needed to be a cross-platform API, where certain concepts either do not make sense on some platforms or are very painful to expose in a cross-platform API
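The typedef issue mentioned above looks roughly like this in EGL’s platform header (an abridged paraphrase from memory, not the exact contents of eglplatform.h):

```c
/* Sketch of how eglplatform.h selects native handle types per platform */
#if defined(_WIN32)
typedef HWND      EGLNativeWindowType;
typedef HBITMAP   EGLNativePixmapType;
typedef HDC       EGLNativeDisplayType;
#elif defined(__unix__)                 /* X11 */
typedef Window    EGLNativeWindowType;
typedef Pixmap    EGLNativePixmapType;
typedef Display  *EGLNativeDisplayType;
#endif
```

So any “portable” EGL code still drags the native window-system types along with it.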

On a quasi-related note, oh yes, X11 just sucks.

The concept is usually simple: give me a surface to which to render, give me a context with which to render. The latter concept is pretty cross-platform, but the former is ugly, as it can be tightly coupled to the concepts of the system and hardware (for example, the pain of mapping the bit-depths of X11 visuals to the bits of an EGLSurface), fullscreen vs. windowed, and other bits tied to the hardware of a platform. Sometimes it can be shoe-horned, but often not… the whole thing can get wicked hairy. The hairiness is cruel in that the contexts possible for a surface may depend on the surface…

by the way, my fictional “glW” interface partially solves the cross-platform issue in that at least the context creation is not tied to the concrete windowing system. And if you only need the gl context for offscreen rendering, you can keep total platform independence.
It’s only when you need to show something in a “window” that you will have to know about the OS/window system you are on.

In that sense, the functions CTX glWCreateContext(int device, CTX share, int *attribs), glWMakeCurrent(CTX ctx), … could just be named glCreateContext, glMakeCurrent, … and opengl’s platform independence would increase considerably.
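For instance, purely offscreen code written against the fictional interface would contain no window-system types at all. A sketch (CTX, the device index, and the attribute-list format are the thread’s hypothetical API; 0 is assumed to mean “no share context” and an empty, zero-terminated attribute list):

```c
/* Hypothetical, platform-independent offscreen setup using the fictional glW */
int attribs[] = { 0 };                      /* assumed empty attribute list */
CTX ctx = glWCreateContext(0 /* device */, (CTX)0 /* no share */, attribs);
glWMakeCurrent(ctx);

/* From here on, render into an ordinary FBO; no window is ever created. */
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
/* ... attach storage, draw, glReadPixels ... */
```

Only the final “show this on screen” step would need a platform-specific call.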