wglUseFontOutlines bug!

Trying to help someone port an application from win32 -> OGL

But we have a big problem with the font rendering. When the nEscapement member of CreateFont (http://msdn.microsoft.com/en-us/library/dd183499(VS.85).aspx) is set, the characters become tilted the right way, but the line of text goes off at the wrong angle. I compared normal win32 rendering with the outline font and it’s definitely a bug.

wglUseFontBitmaps is also broken when the nEscapement member is set, although in a different way.

I don’t think it’s wrong. GDI and OpenGL can have different coordinate systems: in GDI, x+ goes right and y+ goes down, while OpenGL uses whatever your viewing matrix is set to, typically x+ right and y+ up. You probably just have to negate y somewhere.
And if something is then turning the wrong way, negate the angle too.

I tried negating the angle, but then the problem just happens in reverse: the angle the text travels and the angle of the individual characters are opposite! This is totally unlike the win32 version. I’ve tried flipping up/down in the projection matrix to convert from win32 to OGL coordinates, but then the text just renders upside down. This is definitely a bug.

The MSDN docs say you need to set nEscapement and nOrientation to the same value (that restriction applies in the default GM_COMPATIBLE graphics mode; GM_ADVANCED lets them differ).

Setting nOrientation has no effect at all.

wglUseFontOutlines does ignore some CreateFont parameters.
Perhaps OpenGL can only render the font characters separately? It creates each character as its own display list.
If so, you would probably have to do your own rotation transformations, using glRotatef().

When you set nEscapement it is meant to set both nEscapement and nOrientation. It correctly rotates the characters, but the text line seems to get the inverse rotation, which is bizarre. How can we get Microsoft to fix this?

Sure, but I think it’s applying the rotation separately to each character, not to the entire string.
Remember, these are pre-compiled display lists. The list is created once, and every character is treated the same way,
each glyph built independently, in sequence. How would OpenGL know which character is transformed first in the display list?
Or second? It can’t. The transformation is left up to you.