OpenGL 2.0 vs GL_ARB_vertex_program

I’m confused about the real differences, if any, between the shader APIs integrated into the GL standard as of GL 2.0, and the shader API in the GL_ARB_vertex_program extension (and friends).

I take it that there are machines out there that might, for example, support glLinkProgramARB, but which might not yet support glLinkProgram. So what’s the benefit of using the non-ARB versions? Or is there none?

-Patrick

Wait, the GL_ARB_vertex_program extension is not GLSL, and doesn’t use glLinkProgram*.

The “ARB” suffix is removed when a function is accepted into a core version. And in perhaps all cases there aren’t any differences between the ARB and core version.
Usually there are differences only when an EXT is promoted to ARB.

Ah, oops, let’s say GL_ARB_shader_objects then, which does have glLinkProgramARB.

But that answers it. Interesting; I wonder why they even bother moving things into core, then. That’s confusing, because I can’t imagine why anybody would ever want to lose install base by using the core version of some functionality instead of the extension-based version.

Because when you see that a driver supports version X, you know exactly what minimum subset of extensions/capabilities you can use.
Not everyone is Blizzard or id Software with heaps of time and the will to support all hardware since ancient times, so they have to draw the line somewhere: at a specific core version.
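
For instance, checking the baseline is just a parse of the version string. A quick sketch, assuming a current GL context (the GL_VERSION string starts with “major.minor” but can carry vendor info after the numbers, so parse defensively):

#include <stdio.h>   /* sscanf */
#include <GL/gl.h>   /* glGetString */

int major = 0, minor = 0;
/* GL_VERSION starts with "major.minor", e.g. "2.1.2 NVIDIA 195.62". */
sscanf((const char *) glGetString(GL_VERSION), "%d.%d", &major, &minor);
if (major >= 2) {
	/* GL 2.0+: the core GLSL entry points (glCreateProgram,
	   glLinkProgram, ...) are guaranteed to be exposed. */
}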

And in perhaps all cases there aren’t any differences between the ARB and core version.

Except that there are differences between them. They don’t even use the same object type; the ARB extension uses GLhandleARB, while the core version uses the traditional GLuint.
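
Side by side it looks like this (error checking omitted; both paths are real entry points, from ARB_shader_objects and GL 2.0 core respectively):

/* ARB_shader_objects: opaque handles, one generic delete call. */
GLhandleARB progARB = glCreateProgramObjectARB();
glLinkProgramARB(progARB);
glDeleteObjectARB(progARB);

/* GL 2.0 core: plain GLuint names, typed delete calls. */
GLuint prog = glCreateProgram();
glLinkProgram(prog);
glDeleteProgram(prog);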

And that doesn’t even begin to count the changes from ARB_geometry_shader4 to GL 3.2.

Interesting, so they’re not the same. That makes a lot more sense then.

I suppose I’ll adopt the policy of using core versions of things then. But just out of curiosity, could someone name at least one difference between ARB_geometry_shader4 and GL 3.2 off the top of their head (other than the names of the types)? It’d be nice to have something concrete to think about.

I quickly compiled a list of functions whose arguments differ, using the newest glext.h and a recent version of wglext.h:


glGetProgramiv {
	void APIENTRY glGetProgramivARB (GLenum target, GLenum pname, GLint *params);
	void APIENTRY glGetProgramiv (GLuint program, GLenum pname, GLint *params);
}
glBufferData {
	void APIENTRY glBufferDataARB (GLenum target, GLsizeiptrARB size, const GLvoid *data, GLenum usage);
	void APIENTRY glBufferData (GLenum target, GLsizeiptr size, const GLvoid *data, GLenum usage);
}
glBufferSubData {
	void APIENTRY glBufferSubDataARB (GLenum target, GLintptrARB offset, GLsizeiptrARB size, const GLvoid *data);
	void APIENTRY glBufferSubData (GLenum target, GLintptr offset, GLsizeiptr size, const GLvoid *data);
}
glGetBufferSubData {
	void APIENTRY glGetBufferSubDataARB (GLenum target, GLintptrARB offset, GLsizeiptrARB size, GLvoid *data);
	void APIENTRY glGetBufferSubData (GLenum target, GLintptr offset, GLsizeiptr size, GLvoid *data);
}
glShaderSource {
	void APIENTRY glShaderSourceARB (GLhandleARB shaderObj, GLsizei count, const GLcharARB* *string, const GLint *length);
	void APIENTRY glShaderSource (GLuint shader, GLsizei count, const GLchar* *string, const GLint *length);
}
glCompileShader {
	void APIENTRY glCompileShaderARB (GLhandleARB shaderObj);
	void APIENTRY glCompileShader (GLuint shader);
}
glLinkProgram {
	void APIENTRY glLinkProgramARB (GLhandleARB programObj);
	void APIENTRY glLinkProgram (GLuint program);
}
glValidateProgram {
	void APIENTRY glValidateProgramARB (GLhandleARB programObj);
	void APIENTRY glValidateProgram (GLuint program);
}
glGetUniformLocation {
	GLint APIENTRY glGetUniformLocationARB (GLhandleARB programObj, const GLcharARB *name);
	GLint APIENTRY glGetUniformLocation (GLuint program, const GLchar *name);
}
glGetActiveUniform {
	void APIENTRY glGetActiveUniformARB (GLhandleARB programObj, GLuint index, GLsizei maxLength, GLsizei *length, GLint *size, GLenum *type, GLcharARB *name);
	void APIENTRY glGetActiveUniform (GLuint program, GLuint index, GLsizei bufSize, GLsizei *length, GLint *size, GLenum *type, GLchar *name);
}
glGetUniformfv {
	void APIENTRY glGetUniformfvARB (GLhandleARB programObj, GLint location, GLfloat *params);
	void APIENTRY glGetUniformfv (GLuint program, GLint location, GLfloat *params);
}
glGetUniformiv {
	void APIENTRY glGetUniformivARB (GLhandleARB programObj, GLint location, GLint *params);
	void APIENTRY glGetUniformiv (GLuint program, GLint location, GLint *params);
}
glGetShaderSource {
	void APIENTRY glGetShaderSourceARB (GLhandleARB obj, GLsizei maxLength, GLsizei *length, GLcharARB *source);
	void APIENTRY glGetShaderSource (GLuint shader, GLsizei bufSize, GLsizei *length, GLchar *source);
}
glBindAttribLocation {
	void APIENTRY glBindAttribLocationARB (GLhandleARB programObj, GLuint index, const GLcharARB *name);
	void APIENTRY glBindAttribLocation (GLuint program, GLuint index, const GLchar *name);
}
glGetActiveAttrib {
	void APIENTRY glGetActiveAttribARB (GLhandleARB programObj, GLuint index, GLsizei maxLength, GLsizei *length, GLint *size, GLenum *type, GLcharARB *name);
	void APIENTRY glGetActiveAttrib (GLuint program, GLuint index, GLsizei bufSize, GLsizei *length, GLint *size, GLenum *type, GLchar *name);
}
glGetAttribLocation {
	GLint APIENTRY glGetAttribLocationARB (GLhandleARB programObj, const GLcharARB *name);
	GLint APIENTRY glGetAttribLocation (GLuint program, const GLchar *name);
}
wglChoosePixelFormat {
	BOOL WINAPI wglChoosePixelFormatARB (HDC hdc, const int *piAttribIList, const FLOAT *pfAttribFList, UINT nMaxFormats, int *piFormats, UINT *nNumFormats);
	int APIENTRY wglChoosePixelFormat (void); // ignore this def, custom-modified
}

So anyway, if you’re targeting a core version, use the core function; otherwise the ARB, otherwise the EXT. :)
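
On Windows, that policy amounts to something like this when you grab the function pointers (a sketch; it assumes a current context, and it works for glBufferData because the core and ARB signatures only differ in the GLsizeiptr/GLsizeiptrARB typedefs, which are compatible, so one function-pointer type covers both):

#include <windows.h>
#include <GL/gl.h>
#include "glext.h"   /* PFNGLBUFFERDATAPROC */

/* Prefer the core entry point; fall back to the ARB one. */
PFNGLBUFFERDATAPROC pglBufferData =
	(PFNGLBUFFERDATAPROC) wglGetProcAddress("glBufferData");
if (pglBufferData == NULL)
	pglBufferData = (PFNGLBUFFERDATAPROC) wglGetProcAddress("glBufferDataARB");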

But just out of curiosity, could someone name at least one difference between ARB_geometry_shader4 and GL 3.2 off the top of their head (other than the names of the types)?

The ARB extension requires that you use glProgramParameter…() to set the input/output types and number of vertices, while the core version additionally allows you to set them in the shader, via

layout(triangles) in; 
layout(triangle_strip, max_vertices=3) out;

That may not seem like much of a difference, but in practice it’s much easier to write, debug and maintain.
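
For comparison, the ARB path sets the same state from the API side, on the program object before linking (glProgramParameteriARB and these enums come straight from the ARB_geometry_shader4 spec; “program” here is assumed to be your program object):

/* ARB_geometry_shader4: configure the geometry stage before glLinkProgram. */
glProgramParameteriARB(program, GL_GEOMETRY_INPUT_TYPE_ARB, GL_TRIANGLES);
glProgramParameteriARB(program, GL_GEOMETRY_OUTPUT_TYPE_ARB, GL_TRIANGLE_STRIP);
glProgramParameteriARB(program, GL_GEOMETRY_VERTICES_OUT_ARB, 3);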

I don’t think you lose an install base, since a driver that supports only the first version of GLSL (GL_ARB_shader_objects, GL_ARB_vertex_shader, GL_ARB_fragment_shader, GL_ARB_shading_language_100) is probably very old. You will find them on GeForce FX 5200/5500/5800/5900 and Radeon 9700/9800 cards.

The user should probably update their driver, and then they will end up with GL 2.1 anyway.

Of course, if you want, you can ignore the GL version entirely and work only with extensions. That is another “school of thought”.
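
Pre-3.0, that comes down to substring checks on the extension string, something like this (a crude sketch; note that a plain strstr can false-positive when one extension name is a prefix of another):

#include <string.h>

const char *ext = (const char *) glGetString(GL_EXTENSIONS);
int hasGLSL = strstr(ext, "GL_ARB_shader_objects") != NULL
	&& strstr(ext, "GL_ARB_vertex_shader") != NULL
	&& strstr(ext, "GL_ARB_fragment_shader") != NULL;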
