What I’m trying to code is a GL font renderer that supports the standard Unicode charset (65,536 characters). I made a program that saves all the glyphs of a font in a file in GL_BITMAP format: it uses GDI to draw each glyph into a bitmap of a specific size and then constructs a GL_BITMAP for that char. With that file I have to code the font renderer class.
First I tried rendering the chars with glBitmap, but that didn’t work because I can’t resize the glyphs this way (glBitmap is not affected by glPixelZoom). Second, I tried constructing textures for the glyphs, but that didn’t work either: even with a caching technique (glyphs are created only the first time they are needed), it consumes a lot of video memory and requires a lot of texture changes. After that I tried glDrawPixels, which seemed to solve the problem: I can use the data in the file as is (1 bit per pixel), it’s easy to cache in display lists, and it can be zoomed with glPixelZoom. But it seems not to work with the current color! Specifying GL_BITMAP as the data type for glDrawPixels doesn’t behave the way it does with glBitmap, where a 1 in the bitmap generates fragments with the current color.
Does anyone have an idea of a GL feature appropriate for such a rendering class?
If glDrawPixels does the job for you and your only problem is the current color stuff, you can simply set your current color to opaque white and restore it after drawing text.
glPushAttrib(GL_CURRENT_BIT); // pushes the current color, among other things
// ... draw text here (many glyphs, not one) ...
glPopAttrib(); // restore the previously current color
I’d prefer to not have this stuff in every display list.
Take a look at wglUseFontBitmaps and wglUseFontOutlines (if you haven’t already).
Have a look at…
This lib has bitmap, pixmap, texture, polygon, outline, and extruded fonts. It’s very efficient, creating data only for the chars that are needed, and it’s reasonably fast. In a recent project for a Chinese company, all the required Chinese chars fit in less than 2 MB of texture memory.
If for some reason this doesn’t do the trick get in touch and I’ll help you out. What I don’t know about font rendering isn’t worth knowing.
PS zeckensac. This doesn’t work because glDrawPixels just blits pixels from a buffer. The current colour state (and a whole lot more) is ignored, so if you need a different pixel colour you have to change it in your buffer. I tried all sorts of methods to get around this but failed.
As henryj already said, glDrawPixels ignores the current color state… and more: if you set GL_RGBA as the format for glDrawPixels then the GL_BITMAP type cannot be used (the GL specs say glDrawPixels will generate an error if GL_BITMAP is specified as the type param and the format isn’t GL_COLOR_INDEX or GL_STENCIL_INDEX). And if I’m not wrong, wglUseFontBitmaps supports only the 256 ANSI chars…
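One escape hatch that spec restriction leaves open (a sketch, not something from the thread): GL_COLOR_INDEX *is* legal with the GL_BITMAP type, and in RGBA mode each index is converted through the GL_PIXEL_MAP_I_TO_R/G/B/A tables, so a two-entry pixel map can colorize a 1-bpp glyph without touching the buffer. The helper below only computes the padded row size of the 1-bpp data; the GL calls are shown as comments because they need a live context, and the variable names are illustrative.

```c
/* Bytes per row of a 1-bpp (GL_BITMAP) image for a given
 * GL_UNPACK_ALIGNMENT value: 8 pixels per byte, each row
 * padded up to the alignment boundary. */
int bitmap_row_bytes(int width_px, int alignment)
{
    int bytes = (width_px + 7) / 8;
    return ((bytes + alignment - 1) / alignment) * alignment;
}

/*
 * With a current GL context, a 1-bpp glyph could then be drawn as:
 *
 *   GLfloat to_r[2] = { 0.0f, red   };  // index 0 -> background,
 *   GLfloat to_g[2] = { 0.0f, green };  // index 1 -> text color
 *   GLfloat to_b[2] = { 0.0f, blue  };
 *   GLfloat to_a[2] = { 0.0f, 1.0f  };
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 2, to_r);
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_G, 2, to_g);
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_B, 2, to_b);
 *   glPixelMapfv(GL_PIXEL_MAP_I_TO_A, 2, to_a);
 *   glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
 *   glDrawPixels(w, h, GL_COLOR_INDEX, GL_BITMAP, bits);
 */
```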
I’ve checked the font class at http://homepages.paradise.net.nz/henryj/code/index.html . It seems very useful and efficient, but there’s a problem: I was trying to write a size-independent font, so that a size param can be passed to each drawtext routine. The referred class allows font resizing, but it seems to resize the data buffer itself (instead of letting the hw do it by specifying e.g. a pixel zoom), so for every different font size I’ll need a different char data buffer… If I don’t find a different way I’ll use this method, but do you know if it’s possible to load a single data buffer (which defines a char with a specific width and height) and specify size changes only for the rasterizer (like glPixelZoom: the data isn’t changed; only the raster renders it at different sizes)?
FTGL (the above-mentioned font lib) tries its best to behave just like any other OpenGL primitive, and while I haven’t tried it myself, I don’t see any reason why you couldn’t create a pixmap font and then use glPixelZoom before each render call. I know glScale works fine with the texture and poly font types.
I draw alpha (antialiased) characters, and before rendering them I use:
glPixelTransferf (GL_RED_BIAS, BIT8DIVIDE(red));
glPixelTransferf (GL_GREEN_BIAS, BIT8DIVIDE(green));
glPixelTransferf (GL_BLUE_BIAS, BIT8DIVIDE(blue));
so that they’re drawn with the color I want.
I’ve also found that it’s faster to compose a sentence bitmap on the fly and then draw it with a single glDrawPixels call than to draw each character separately.
(glDrawPixels is faster when called once with a big buffer than several times with small buffers => in software mode, that is, Mesa. With hardware, texture drawing is much faster and easier.)
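The sentence-composition idea can be sketched in plain C under some assumed conventions (this is illustrative, not the poster’s code): each glyph is an 8-bit coverage buffer with rows stored bottom-up, as glDrawPixels expects, and baseline alignment — which the thread later notes is the hard part — is ignored here by blitting everything to a common bottom edge.

```c
#include <string.h>

/* One glyph: w*h bytes of 8-bit coverage, bottom-up rows. */
typedef struct {
    const unsigned char *pixels;
    int w, h;
} Glyph;

/* Blit the glyphs side by side into one sentence buffer of
 * total_w x max_h bytes (zeroed by the caller), so the whole
 * string can be drawn with a single glDrawPixels call.
 * Returns the pen position after the last glyph. */
int compose_sentence(const Glyph *glyphs, int n,
                     unsigned char *out, int total_w, int max_h)
{
    int pen_x = 0;
    (void)max_h; /* glyphs are assumed no taller than max_h */
    for (int i = 0; i < n; i++) {
        const Glyph *g = &glyphs[i];
        for (int row = 0; row < g->h; row++)
            memcpy(out + row * total_w + pen_x,
                   g->pixels + row * g->w, g->w);
        pen_x += g->w;
    }
    return pen_x;
}

/* Then one call draws the whole string, e.g.:
 *   glDrawPixels(total_w, max_h, GL_ALPHA,
 *                GL_UNSIGNED_BYTE, out);
 * (or GL_LUMINANCE, depending on how you colorize). */
```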
If you’re on Windows and want to draw text, the best way to do it is to draw the text you want into a DIBSection, and then upload those bits as a texture. You could do this every time the string you want to draw changes. The Windows font rasterizer is very fast, and as long as you don’t change an entire screen of text every frame, the texture upload is negligible. And drawing a single quad sends WAY less geometry than drawing a quad per glyph…
Note that to get Windows to anti-alias a font, you have to first select the font into a physical device context before selecting it into your DIBSection DC the first time. There’s lots of notes on MSDN about this.
The pixel transfer stuff is one thing i never tried. I might give it a go. Thanks.
I’ve been thinking of providing a way to build strings. Initially as a method for mapping text onto arbitrary geometry. It will also provide performance benefits. I cache everything because in all the tests I’ve done the texture upload IS costly, even using glTexSubImage. Maybe I’m doing something wrong??
I also didn’t think about using pixel transfer or logical ops… I’ll give them a try, and as soon as I get a solid result I’ll let you know.
All pixel transfers are costly (glDrawPixels or texture upload), but the size of the data transferred matters a lot: the pipeline’s per-transfer setup and teardown can take more time than the transfer itself, adding a huge overhead. That’s why I suggested building strings instead of transferring characters one by one: it maximizes transfer speed by reducing OpenGL pipeline overhead…
But for now, my string-building algorithm is also costly, as it composes the bitmap on the fly and then transfers it…
Building strings is complex because of baseline alignment, calculating the buffer size of the final bitmap… so for now I’ve only managed to find an algorithm that is 2-pass…
I hope to find a better one…
But for now, I get an average 20% performance improvement over cached glyph-by-glyph drawing.
The major advantage is when I display the same text again and again: since the bitmap is already built, the speed improvement reaches 40%.
The major problem I found in caching all of a font’s glyphs is memory consumption… I also draw Unicode characters, and I found that I can only fit three fonts in my video card’s memory (glAreTexturesResident tells you that).
But my application (an OpenGL web browser) typically uses up to 10 fonts!!!
So those cached fonts take 100 MB of RAM!!!
And the time it takes to build those cached fonts is also long.
That’s all my experience with OpenGL fonts.
henryj, I must thank you for your work, as it showed me the way to do it; FTGL is a great lib that everyone using C++ and OpenGL fonts should use!!!
As soon as I’ve finished my optimisation, my C code will also be available to everyone.
[This message has been edited by Kuranes (edited 03-05-2002).]
I get round the memory and load-time problem by creating the glyphs only when they are required. This saves heaps of RAM (especially with Unicode fonts) because it’s rare that someone will need the entire character set, but as you know, rendering individual glyphs isn’t the fastest way to go. As with most things in life, you can’t have it both ways… yet.
I coded the font class with glPixelTransfer operations to set the font color. What I did was: in the pixel array, set to 1,1,1,1 every pixel that composes the char; otherwise set the pixel color to 0,0,0,0. So, by default, all characters are rendered white. When I want to change the char color I do:
glPixelTransferf( GL_RED_SCALE, red_value );
glPixelTransferf( GL_GREEN_SCALE, green_value );
glPixelTransferf( GL_BLUE_SCALE, blue_value );
glPixelTransferf( GL_ALPHA_SCALE, alpha_value );
and set an alpha test to exclude alpha values equal to 0.0f. Result: perfect! The Unicode chars are rendered at low cost with any color/alpha blending options. Thanks to everybody who posted an idea!!
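The recipe above can be sketched as follows, assuming the glyph file stores rows of 1-bpp GL_BITMAP data unpacked MSB-first (as OpenGL does) — a sketch, not the poster’s actual code. The expansion to a white/transparent RGBA buffer is plain C; the GL state needed at draw time is shown in the trailing comment.

```c
/* Expand one w x h 1-bpp glyph (row_bytes bytes per row, MSB-first)
 * into an RGBA float buffer of w*h*4 floats: set bits become
 * (1,1,1,1), clear bits (0,0,0,0). The GL_*_SCALE pixel transfer
 * then multiplies these into the desired color at draw time. */
void expand_glyph_rgba(const unsigned char *bits, int w, int h,
                       int row_bytes, float *rgba)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int set = (bits[y * row_bytes + x / 8] >> (7 - x % 8)) & 1;
            float *px = rgba + (y * w + x) * 4;
            px[0] = px[1] = px[2] = px[3] = set ? 1.0f : 0.0f;
        }
    }
}

/* At draw time (current GL context assumed):
 *   glPixelTransferf(GL_RED_SCALE,   red_value);
 *   glPixelTransferf(GL_GREEN_SCALE, green_value);
 *   glPixelTransferf(GL_BLUE_SCALE,  blue_value);
 *   glPixelTransferf(GL_ALPHA_SCALE, alpha_value);
 *   glAlphaFunc(GL_GREATER, 0.0f);   // drop the 0,0,0,0 pixels
 *   glEnable(GL_ALPHA_TEST);
 *   glDrawPixels(w, h, GL_RGBA, GL_FLOAT, rgba);
 */
```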
BTW does anyone know a way to antialias the results of glDrawPixels (without multisampling, please) ?
This may be one area where rendering the text yourself (i.e. in software) to a texture is beneficial. You can get filtering with glDrawPixels, but AFAIK that requires the imaging subset.
A reasonable approach might be to subdivide the screen into 64x64 tiles and to create an alpha texture for each tile. Then you check whether the text in that subregion has changed since the last frame and update it accordingly, reupload the texture and render it with blending and/or (depending on your background) alpha test enabled. You can also enable bilinear filtering if you want ‘blurry’ glyph edges.
Note: 64x64 is not necessarily the optimal size for this; 1024x16 or so might be better, since one tile could cover a line of text or a whole string. OTOH, 16x1024 might be better for top-down languages (I believe Japanese is one of them). But you may end up running on an OpenGL implementation that allows only the bare minimum required by the spec, that is, texture sizes between 64x64 and 256x256.
The texture memory required for this would be roughly 1 byte per screen pixel. And your app must somehow cache the textual contents of each tile to prevent redundant generation and uploading of textures.
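The “roughly 1 byte per screen pixel” estimate follows from GL_ALPHA tiles being 1 byte per texel; a tiny illustrative helper (names made up) makes the arithmetic concrete:

```c
static int ceil_div(int a, int b) { return (a + b - 1) / b; }

/* Total bytes of GL_ALPHA (1 byte/texel) tile textures needed to
 * cover a screen: number of tiles in each direction times the
 * bytes per tile. */
long tile_alpha_bytes(int screen_w, int screen_h,
                      int tile_w, int tile_h)
{
    long tiles_x = ceil_div(screen_w, tile_w);
    long tiles_y = ceil_div(screen_h, tile_h);
    return tiles_x * tiles_y * (long)tile_w * tile_h;
}
```

For a 1024x768 screen this comes out to 768 KB whether the tiles are 64x64 or 1024x16, i.e. exactly one byte per screen pixel whenever the tile sizes divide the screen size evenly.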
To get antialiasing with glDrawPixels, I use FreeType 2 (freetype.org) to get an antialiased glyph (as an alpha bitmap) for each character I draw.
I draw it as an alpha bitmap and colorize it with the glPixelTransfer mechanism for RGB, but NOT changing the alpha transfer. Then it is drawn with antialiasing. After that you can play with the alpha test and alpha func to visually tune the smoothness of the font (particularly if you load glyphs with FreeType hinting).
Alpha bitmap => glDrawPixels with GL_ALPHA format and GL_UNSIGNED_BYTE type;
one byte per pixel that just gives the alpha value of that pixel.
henryj’s antialiased pixmap object in his FTGL library uses that mechanism (without FreeType hinting, I guess).
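One wrinkle in the FreeType route worth showing (a sketch under assumed conventions, not code from the thread): FreeType normally returns the 8-bit coverage bitmap top-down (positive `pitch`), while glDrawPixels expects the bottom row first, so the rows need flipping before drawing. Names here are illustrative.

```c
#include <string.h>

/* Copy a top-down 8-bit coverage bitmap (rows of 'width' useful
 * bytes, 'pitch' bytes apart) into a tight bottom-up buffer of
 * width*rows bytes, as glDrawPixels expects. */
void flip_coverage_rows(const unsigned char *src, int width, int rows,
                        int pitch, unsigned char *dst)
{
    for (int y = 0; y < rows; y++)
        memcpy(dst + y * width,
               src + (rows - 1 - y) * pitch, width);
}

/* Draw (GL context assumed), colorizing via the RGB bias but
 * leaving the alpha transfer alone, as described above:
 *   glPixelTransferf(GL_RED_BIAS,   r);
 *   glPixelTransferf(GL_GREEN_BIAS, g);
 *   glPixelTransferf(GL_BLUE_BIAS,  b);
 *   glEnable(GL_BLEND);
 *   glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
 *   glDrawPixels(width, rows, GL_ALPHA, GL_UNSIGNED_BYTE, dst);
 */
```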
[This message has been edited by Kuranes (edited 03-07-2002).]
If I use an alpha map for the glyph, it won’t be antialiased at high glPixelZoom values (say 10.0f, 10.0f), because the alpha map gets zoomed too… What I need is rasterizer antialiasing (like what’s done with polygons via glHint…), because that way the antialiasing is zoom-invariant. Do you know a way to do it?
If you want to do this with glDrawPixels, you’ll have to use your own antialiasing method… If you’re looking for one, check the FreeType 2 or libart source code, which contain some.
Otherwise the easiest solution is to use textures, as you can zoom without altering the alpha values.
But I find it strange to zoom on a glyph… Can’t you dynamically load the glyph at a higher font size? I think you’ll get better results that way…
(and it can be just as costly as pixel-zooming and doing antialiasing in software…)