Personally, I don’t think text is GL’s responsibility. There are system-specific calls for generating display lists of characters in a particular font, e.g. wglUseFontBitmaps on Windows, and these are adequate for most purposes.
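For reference, typical use of wglUseFontBitmaps looks roughly like this (a Windows-only sketch; it assumes a current GL rendering context and omits error checking, and the choice of SYSTEM_FONT is just an example):

```c
/* Windows-only sketch: build display lists for the printable ASCII
   range from the font currently selected into the device context. */
HDC hdc = wglGetCurrentDC();           /* DC of the current GL context */
SelectObject(hdc, GetStockObject(SYSTEM_FONT));
GLuint base = glGenLists(96);          /* one list per glyph, codes 32..127 */
wglUseFontBitmaps(hdc, 32, 96, base);  /* glyph bitmaps -> display lists */

/* Later, to draw a string at the current raster position: */
const char *text = "Hello, GL";
glListBase(base - 32);                 /* so each char code indexes its list */
glCallLists((GLsizei)strlen(text), GL_UNSIGNED_BYTE, text);
```

Note that this works precisely because it sidesteps the client/server question below: the font lives on the client’s windowing system, and only the resulting display lists cross into GL.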
Given that GL is client/server, how would the client communicate the necessary font information to the server? It can’t just be a name, since the named font on the client might not be the same named font on the server. And standardizing fonts to numbers might be a nightmare.
If UTF-8 and C-style strings were adequate, the OpenGL extensions would have used them. They chose not to for very good reasons, as explained in the respective extension specifications.
BEGIN DEVIL’S ADVOCATE
void glGenFonts(GLsizei n, GLuint* fonts)
This function generates n font names (as with glGenTextures and texture names).
void glDeleteFonts(GLsizei n, const GLuint* fonts)
This function deletes n font names, ignoring zeroes and invalid names.
GLboolean glIsFont(GLuint font)
Determines if font is a valid font name.
void glBindFont(GLenum target, GLuint font)
This function enables the creation of a named font bound to the font target.
target: GL_FONT_2D, GL_FONT_3D
void glFontImprint(GLenum target, GLint internalformat, GLdouble width, GLdouble height, GLenum format, const GLvoid* chars)
target: GL_FONT_2D, GL_FONT_3D
internalformat: GL_RASTER, GL_GEOMETRY, GL_POSTSCRIPT, GL_TRUETYPE
width and height give the bounding box for the individual characters.
format: GL_RASTER, GL_GEOMETRY, GL_POSTSCRIPT, GL_TRUETYPE
chars is a pointer to the full font’s description of every character. The data layout is determined by the format.
void glText(GLsizei count, const GLubyte* string)
Renders the string at the current raster position. If the font’s internalformat is GL_RASTER, behave like glDrawPixels for each character; if it is GL_GEOMETRY, emit the character’s outline as glVertex calls at the current raster position; for the other formats, behave appropriately for that type.
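Client code under this hypothetical API might look something like the following. To be clear, this is pure pseudocode: none of these entry points exist in GL, and the layout of the glyph data for GL_RASTER is assumed, not specified anywhere.

```c
/* Hypothetical usage sketch; none of these entry points exist. */
GLuint font;
glGenFonts(1, &font);
glBindFont(GL_FONT_2D, font);

/* Upload the whole font: raster glyphs in 16x16 cells. The layout
   of 'glyphData' is whatever GL_RASTER would mandate. */
glFontImprint(GL_FONT_2D, GL_RASTER, 16.0, 16.0, GL_RASTER, glyphData);

/* Draw a string at the current raster position. */
glRasterPos2f(10.0f, 10.0f);
glText(9, (const GLubyte *)"Hello, GL");

glDeleteFonts(1, &font);
```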
END DEVIL’S ADVOCATE
Again, there are still problems even with the above. How are Unicode characters rendered (if at all)? Does glFontImprint need a length for chars? (Where’s glFontSubImprint?) Is this too slow (probably)? Why are PS and TT included as font formats, and not __ and __? How do I get my system fonts (and communicate them to the GL server)?