Multiple fonts problem

Hi everyone!
I am using a std::vector to make a dynamic array of my fonts so I can search them by name. For example, I have a function, int Fonts(LPCTSTR name), which searches the vector and returns the matching index. After a lot of tries, this part works fine. But my problem is more complicated. Let's say I create a font:

and I render it using RenderFont("font1", Vec2(10,10), "Hello from font1!!");

In my render routine, I have wglUseFontBitmaps(hdc, 32, 96, font[h].base) (h is the integer returned by Fonts(name))
and SelectObject(hdc, font[h].hf);

Then I render the font based on NeHe's example on bitmap fonts. The problem comes when I have two or more fonts: the fonts seem to swap attributes, for example the first font becomes the second font, and so on. It is hard to explain, but I hope you can help me here.

Hi !

Are you sure that you set the font base so that there are no collisions between the two fonts? That's the only thing I can think of that might give you trouble.


Hey Mikael,
Yes I do set the font base.

in my CreateNewFont method, i use:

font[gfx.nFonts-1].base = glGenLists(96);

This is correct, right?

Any help please?

Originally posted by kaysoft:
font[gfx.nFonts-1].base = glGenLists(96);

This code is suspicious. Are you using vector correctly? Did you set the size of the vector somewhere? Why don’t you use push_back?

You need to call glListBase(font[h].base) each time you switch fonts. Read the documentation, and you will figure out why.
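Here is a minimal render-time sketch of what I mean (RenderFont and font[h].base are the names from the posts above; the -32 offset assumes the lists were built with wglUseFontBitmaps(hdc, 32, 96, ...)):

```cpp
// Sketch: drawing a string with NeHe-style bitmap fonts.
// Assumes font[h].base was filled in once at creation time with
// glGenLists(96) followed by wglUseFontBitmaps(hdc, 32, 96, font[h].base).
void RenderFont(int h, float x, float y, const char* text)
{
    glRasterPos2f(x, y);                 // where the string starts
    glPushAttrib(GL_LIST_BIT);           // save the current list base
    glListBase(font[h].base - 32);       // ASCII code 32 maps to this font's first list
    glCallLists((GLsizei)strlen(text), GL_UNSIGNED_BYTE, text);
    glPopAttrib();                       // restore the list base for other fonts
}
```

Without the glListBase call, glCallLists keeps using whichever font's list base was set last, which is exactly the "fonts swapping attributes" symptom.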

…and I render it using RenderFont("font1", Vec2(10,10), "Hello from font1!!");

In my render routine, I have wglUseFontBitmaps(hdc, 32,96,font[h].base)…

Why are you calling wglUseFontBitmaps in the render routine? This function generates a bitmap for each character within the specified range. You don’t need to do this every time you render a string. Call this in the CreateNewFont routine.

Yup, I'm using push_back in my CreateNewFont routine:

rtFont f;
font.push_back(f);

Ok, I will try everything you said!

Bah… it didn't work. I think I should post my entire code here:

//—Header file-------//
struct rtFont
{
    TCHAR  fName[30];
    TCHAR  FaceName[30];
    int    size;
    bool   bold;
    bool   italic;
    bool   underline;
    GLuint base;
    HFONT  hf;    // GDI font handle (used by RenderText below)
};

class CRT2D
{
    ...
    int nFonts;
    vector<rtFont> font;
    ...
};

//----Source file----//
void CRT2D::CreateBitmapFont(LPCTSTR Name, LPCTSTR FontName, int size, bool bold, bool italic, bool underline)
{
    rtFont f;
    font.push_back(f);
    nFonts++;

    strcpy(font[nFonts-1].fName, Name);
    strcpy(font[nFonts-1].FaceName, FontName);

    font[nFonts-1].base = glGenLists(96);

    ...
}

void CRT2D::RenderText(LPCTSTR Name, LPCTSTR Text, Vec2 pos, RGBA color)
{
    if (Text == NULL) return;

    int h = Fonts(Name);
    if (h < 0) return;

    char t[256];
    // font to use
    SelectObject(hdc, font[h].hf);

    ...
}

// the search routine
int CRT2D::Fonts(LPCTSTR name)
{
    int nIndex = 0;
    vector<rtFont>::const_iterator ib = font.begin();
    vector<rtFont>::const_iterator ie = font.end();
    vector<rtFont>::const_iterator i;

    for (i = ib; i != ie; i++, nIndex++)
        if (strcmp((*i).fName, name) == 0)
            return nIndex;
    return -1;
}
That's it. This rendering code shows nothing on the screen. If I place the wglUseFontBitmaps(…) inside the render loop, it works, but it is all messed up.

[This message has been edited by kaysoft (edited 10-10-2002).]

void CRT2D::RenderText(LPCTSTR Name, LPCTSTR Text, Vec2 pos, RGBA color)
                    //font to use
                    SelectObject(hdc, font[h].hf);

SelectObject is what selects the font, right? This should be called right before wglUseFontBitmaps. It does no good to select the GDI font after the display lists are already created. I don't see any other problems with the code.
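For reference, here is a sketch of the creation-time order I mean (the CreateFont style flags here are illustrative, not taken from your code):

```cpp
// Sketch: build the display lists from the right GDI font.
// Create the font, select it into the DC, *then* call wglUseFontBitmaps.
// This runs once per font at creation time, never in the render loop.
void CreateBitmapFont(HDC hdc, rtFont& f, LPCTSTR faceName, int size)
{
    f.base = glGenLists(96);                        // 96 lists for chars 32..127

    f.hf = CreateFont(-size, 0, 0, 0, FW_NORMAL,    // illustrative style flags
                      FALSE, FALSE, FALSE,
                      ANSI_CHARSET, OUT_TT_PRECIS,
                      CLIP_DEFAULT_PRECIS, ANTIALIASED_QUALITY,
                      FF_DONTCARE | DEFAULT_PITCH, faceName);

    HGDIOBJ old = SelectObject(hdc, f.hf);          // select BEFORE building the lists
    wglUseFontBitmaps(hdc, 32, 96, f.base);         // bake the glyph bitmaps into them
    SelectObject(hdc, old);                         // restore the previous font
}
```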

Thank you so much, that was it!!

No problem. Unless you’re going to use the fonts in GDI calls, you should probably also call DeleteObject on the fonts after calling wglUseFontBitmaps.

One thing, though: it seems that I have to call SelectObject and wglUseFontBitmaps in the render loop. Otherwise, if I call them in my window-creation routine, it won't render anything. Is this the way it's supposed to be?

wglUseFontBitmaps creates OpenGL display lists, so you have to call it after you create and select the GL rendering context. Is this the problem?

No, that is not the problem. It just seems that I need to call wglUseFontBitmaps in each frame.

Also, this text rendering method is very slow, does anyone know where I can find a faster one? Or point to some tutorials?

Let me explain what wglUseFontBitmaps does. It generates a bitmap (i.e., one bit per pixel) for each character in the specified range using the currently selected font. Then it calls glBitmap in a display list once for each character. Calling these display lists will then draw the character on the screen at the current raster position. Just like any other display lists, these stay valid until you delete them or destroy the GL rendering context. Of course it will be slow if you go through this whole process every time you draw a string. Check the return value of wglUseFontBitmaps to see if it succeeded.

However, drawing bitmaps in OpenGL can still be a bit slow on some implementations. The fastest way I know of to draw text is to load all of the characters for a font into a single texture. Then figure out how to calculate the texture coords for a character. If you understand how to load a texture and use texture mapping, this should be straightforward.

Ok, thanks Aaron for the information!

I think I will go for texture fonts.

Texture fonts are by far the fastest implementation. Have you tried putting a significant amount of text on the screen with the bitmap font you’re using? It will slow to a crawl.

I use glfont, a free, easy-to-use font-generating tool by Brad Fish. glfont can be downloaded here: