How to show float numbers?

float t;
char buffer[256];
sprintf( buffer, "%3.2f", &t );

printf("%f", ...) expects a double, not a float.

sprintf( buffer, "%3.2f", (double)t );

Ding, maybe you too overlooked it in your code?

[This message has been edited by Carmacksutra (edited 05-08-2002).]


For me the main bug is the use of '&'!

You pass a pointer when using scanf, not printf!

The "printf" takes the value itself, and it is declared with the ellipsis "..." to indicate an unknown number of parameters.

All floating-point values are converted to doubles when using the ellipsis.

Anyway, doing:

float t=2;
printf( "%f", t );

or

double t=2;
printf( "%f", t );

will produce the same result (and in the first case, t will be converted to a double when passed to printf!).

Now, when you use scanf, that's another matter!

float f;
double d;

scanf( "%f", &f );
scanf( "%lf", &d );

And the "l" indicates that the pointer points to a "long value" (i.e. long int or double).

Anyway, to go back to Gorg's piece of code, it should be:

sprintf( buffer, "%3.2f", t );


For me the main bug is the use of '&'!

I thought that "bug" was so obvious that it had to be a typo.

All floating-point values are converted to doubles when using the ellipsis.


Thanks for the enlightenment…
(I was really consistent with using "(double)t" in my code.)

To be honest, I found that out only a few weeks ago while I was looking for a way to create a function with an unknown number of arguments, and it stayed somewhere in my memory…

It's actually interesting to read the docs about the ellipsis in MSDN (OK, it has nothing to do with OpenGL, but anyway!).



For a lot of compilers, printf's format string is type-checked (e.g. they will bark if you try):

char *mychar = "blah";

printf("%f", mychar);

Warning: Officious compiler thinks you are not very smart and is warning you that there is a type mismatch in line 3 (printf).

This is usually ok, if you want to do this on purpose you can use a cast and get what you want:

float var;

printf( "The dang address of the float is: %08x\n", (int) &var );

To position a string correctly, you need to call GetTextExtentPoint32() to find out the size of the string.

But there is still a problem: different cards have different implementations of text drawing. I developed my OpenGL program on an ATI Radeon 8500. The text position is always wrong when running on any VisionTek nVidia cards.

Eh? What’s text formatting got to do with what card you’re using?