Program deletes my umlauts!? (GLApp)

Hey everybody,

I have a problem with my 3D project where I’m rendering 2D text in front of the scene.

I am using the GLApp engine/library (I don’t know if you’ve heard of it) and a method that renders the text for me.
An example of use:

yourFont.print(coordX, coordY, text);   // signature: print(int x, int y, String msg)

So, here’s the code of the method print() of the GLApp library:

	/**
	 * Render a text string in 2D over the scene, using the character set created
	 * by this GLFont object.
	 * @see makeFont()
	 */
	public void print(int x, int y, String msg)
	{
		if (msg != null) {
			GLApp.pushAttrib();                 // preserve current GL settings
			GL11.glDisable(GL11.GL_LIGHTING);   // turn off lighting
			GL11.glEnable(GL11.GL_BLEND);       // enable alpha blending, so character background is transparent
			GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
			GL11.glEnable(GL11.GL_TEXTURE_2D);  // enable the charset texture
			GL11.glBindTexture(GL11.GL_TEXTURE_2D, fontTextureHandle);
			// prepare to render in 2D
			// draw the text
			GL11.glTranslatef(x, y, 0);         // position the text (in pixel coords)
			for (int i = 0; i < msg.length(); i++) {
				GL11.glCallList(fontListBase + (msg.charAt(i) - 32));
			}
			// restore the original positions and views
			GLApp.popAttrib();  // restore previous settings
		}
	}

If I want to output “Grün”, I get “Grn” on the screen instead, which is really weird; I don’t even get a question mark or anything!

Maybe someone who knows GLApp, or the way it renders text, could help?

Changing the font didn’t work. :frowning:

Best regards,

  • phpART

OpenGL has never had any concept of renderable strings, or of fonts to pull the glyphs representing a string from. There are third-party libs, but most of them do the same thing: render, by some means, a texture containing the glyphs of a certain character set, then render some alpha-blended/tested primitive(s) to which portions of this glyph texture are mapped. Character selection can be done trivially by offsetting your texture coordinates.
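To illustrate the texture-coordinate offsetting: assuming the font texture is a 16×16 grid of glyph cells starting at character 32 (which matches the `charAt(i) - 32` offset in the `print()` loop; the class and method names below are made up for the example), the UV rectangle for a character can be computed like this:

```java
public class GlyphUV {
    static final int GRID = 16;        // glyph cells per row/column in the texture (assumption)
    static final int FIRST_CHAR = 32;  // first character baked into the texture

    // Return {u0, v0, u1, v1} for the given character (hypothetical helper).
    public static float[] glyphTexCoords(char c) {
        int index = c - FIRST_CHAR;    // same offset the print() loop uses
        int col = index % GRID;
        int row = index / GRID;
        float cell = 1.0f / GRID;
        return new float[] { col * cell, row * cell, (col + 1) * cell, (row + 1) * cell };
    }

    public static void main(String[] args) {
        // 'A' = 65, index 33 -> column 1, row 2 of the grid
        float[] uv = glyphTexCoords('A');
        System.out.printf("%.4f %.4f %.4f %.4f%n", uv[0], uv[1], uv[2], uv[3]);
        // prints: 0.0625 0.1250 0.1250 0.1875
    }
}
```

GLApp precompiles one display list per glyph instead of computing these coordinates per frame, but the indexing idea is the same.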

Since you iterate through the characters of the string msg and use the numeric character value as an offset, it’s most likely that the String object cannot store umlauts with its current encoding. For instance, if the string were encoded as ASCII, there are no umlauts in that character set. You need to use an encoding that supports all the characters you need; for umlauts you could, for instance, use Latin-1/ISO-8859-1, Windows-1252, UTF-8, you name it.

Make sure that msg really contains all the characters you need and that the class used for representing that string is capable of doing so. If that’s already the case, check your display lists and make sure all glyphs are correctly mapped to unique indices.
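If the string is read from a file or the network, decode it with an explicit charset rather than the platform default. A minimal sketch (the byte values are an assumption about how the file was saved):

```java
import java.nio.charset.StandardCharsets;

public class DecodeCheck {
    public static void main(String[] args) {
        // Bytes as they would appear in a Latin-1 encoded file containing "Grün"
        byte[] latin1Bytes = { 'G', 'r', (byte) 0xFC, 'n' };

        // Decoding with the wrong charset loses the umlaut:
        // 0xFC is not valid US-ASCII, so it becomes the replacement char \uFFFD
        String wrong = new String(latin1Bytes, StandardCharsets.US_ASCII);

        // Decoding with the right charset preserves it
        String right = new String(latin1Bytes, StandardCharsets.ISO_8859_1);

        System.out.println(right);                   // prints: Grün
        System.out.println(right.charAt(2) == 'ü');  // prints: true
        System.out.println(wrong.charAt(2) == 'ü');  // prints: false
    }
}
```

ISO-8859-1 is convenient here because every character stays below 256, which fits display-list indexing schemes like GLApp’s as long as the font texture actually contains glyphs for that range.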