Problems rendering fonts with FreeType and OpenGL

I’m trying to render some text and I figured I could do it pretty easily with freetype2. My plan was basically to render a character with FreeType and cache it in a display list. I put together the following code based on the FreeType tutorial and an example in the red book. Unfortunately I haven’t had any luck getting it to work.

What I would like to know is:
1.) Is this a valid plan? In other words, will it work?

2.) Have I missed some steps in FreeType, or is it something I messed up in OpenGL?

Really any help would be appreciated as I’m feeling like banging my head against the desk here.


Some notes about the code:

  • I am error checking; I just didn’t include it here.
  • I’m using SDL, so that’s where the Uint16 data type comes from. If I didn’t misunderstand what I read about Unicode, ASCII characters should map straight to Unicode values.
  • I realize that I’m not taking into account a glyph’s origin. Right now I don’t care. I just wanted to see some text on the screen.
  • I wasn’t sure if I needed a call to glPixelStorei(). I suspect I do.

void InitFont() {
    FT_UInt index;
    Uint16 i;

    FT_New_Face(library, "fonts/times.ttf", 0, &face);

    FT_Set_Char_Size(face, 0, 22 * 64, 0, 0);

    fontOffset = glGenLists(128);

    for (i = 'A'; i < 'A' + 4; i++) {
        index = FT_Get_Char_Index(face, i);

        FT_Load_Glyph(face, index, FT_LOAD_DEFAULT);
        FT_Render_Glyph(face->glyph, ft_render_mode_normal);

        glNewList(fontOffset + i, GL_COMPILE);
        glBitmap(face->glyph->bitmap.width, face->glyph->bitmap.rows,
                 0.0, 0.0,
                 (float) (face->glyph->bitmap.width + 2), 0.0,
                 face->glyph->bitmap.buffer);
        glEndList();
    }
}

void printGL(char *string) {
    glListBase(fontOffset);
    glCallLists(strlen(string), GL_UNSIGNED_BYTE,
                (GLubyte *) string);
}

void DrawScene() {
    /* ... Other opengl stuff ... */

    glRasterPos2i(20, 100);
    printGL("ABCD");

    /* ... Swap the video buffer ... */
}

In your InitFont() you might want to add an int (FT_Error) that holds the return value of the FT_New_Face call; a nonzero value tells you the face failed to load and why. Good luck!