16 Bit Monochrome Texture

AFAIK no hardware supports any higher precision than RGB10_A2 as a displayable format. Or maybe there are some pro cards that can do more?

lgrosshenning, I just pointed out that you were wrong about the precision of texture shaders on the GeForce 3/4 (not MX). Might be useful knowledge, since it makes the texture shaders (especially in conjunction with high precision texture formats) much more useful. You are obviously correct in saying that there is no way to display a 16-bit monochrome texture directly on most hardware, but I honestly don't see what you're so upset about.

Personally, I did not fully understand what boxsmiley did.

I think nobody really does. Maybe he can tell us?

I doubt though that he was talking about on-screen displaying, because he hopefully knows that his monitor cannot display more than 8 bit grayscale (unless he has one of those new HDR displays).

BTW: Since he was also talking about the GF4, here is how you could output 16 bit grayscale on a GF4 without any special RAMDAC:

  1. Fetch a filtered 16 bit sample from a LUMINANCE16 texture in the texture shaders.
  2. Look up into a 256x256 dependent texture to split the value into its high and low bytes (a CPU-side sketch of this table follows the list).
  3. Output the low byte in e.g. the red channel and the high byte in e.g. the green channel.
  4. Find some display system that reassembles the two bytes from the DVI interface and brings them to the screen with 16 bit precision.
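
For step 2, the split table itself can be built on the CPU and uploaded as an ordinary RGBA8 texture. The sketch below is only an illustration: it assumes the filtered 16 bit value ends up addressing the table with its low byte along s and its high byte along t, create_split_lut is a made-up helper name, and the NV_texture_shader setup itself is not shown.

    /* Sketch only: the 256x256 "byte split" table from step 2.  Texel (lo, hi)
     * routes the low byte to red and the high byte to green (step 3).
     * Assumes the filtered 16 bit value addresses this texture with the low
     * byte as s and the high byte as t; the texture shader setup is omitted. */
    #include <GL/gl.h>
    #include <stdlib.h>

    GLuint create_split_lut(void)
    {
        GLubyte *lut = malloc(256 * 256 * 4);
        GLuint tex;
        int lo, hi;

        for (hi = 0; hi < 256; ++hi) {
            for (lo = 0; lo < 256; ++lo) {
                GLubyte *p = lut + (hi * 256 + lo) * 4;
                p[0] = (GLubyte)lo;   /* red   = low byte  */
                p[1] = (GLubyte)hi;   /* green = high byte */
                p[2] = 0;             /* blue unused       */
                p[3] = 255;
            }
        }

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, lut);
        free(lut);
        return tex;
    }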

Klaus

thanks everyone for the replies. i think it was a hardware limitation as numerous people have guessed. i was trying to DISPLAY 16 full bits of grayscale data (0-65535) as a texture. i have resigned myself to performing a sort of dynamic range adjustment in order to scale it into the 8 bit limit. i was just convinced that if i could have 32 bit color (as someone said, rgba 8,8,8,8) there must be a way of using only half that storage and devoting it all to monochrome. i guess from the posts that this is a monitor problem. anyways, thanks for trying to help. btw, i am kind of limited to vanilla opengl 1.3, so any talk of shaders and such is not useful in this particular application. thanks for the education, people. cheers, boxsmiley.
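
For illustration only, the dynamic range adjustment could be as simple as the CPU-side sketch below, which stays within plain OpenGL 1.3 since all the work happens before upload. It assumes the data arrives as a flat array of unsigned 16 bit samples and uses the simplest possible min/max window; rescale_to_8bit is a made-up name. The result can then be uploaded as a GL_LUMINANCE, GL_UNSIGNED_BYTE texture as usual.

    /* Sketch of a simple dynamic range adjustment: map the [min, max] window
     * of the 16 bit data onto 0..255 so it fits an 8 bit GL_LUMINANCE texture.
     * Assumes src and dst are caller-allocated arrays of 'count' samples. */
    #include <stddef.h>
    #include <stdint.h>

    void rescale_to_8bit(const uint16_t *src, uint8_t *dst, size_t count)
    {
        uint16_t lo = 0xFFFF, hi = 0;
        size_t i;

        for (i = 0; i < count; ++i) {
            if (src[i] < lo) lo = src[i];
            if (src[i] > hi) hi = src[i];
        }

        for (i = 0; i < count; ++i) {
            if (hi > lo)
                dst[i] = (uint8_t)(((uint32_t)(src[i] - lo) * 255) / (hi - lo));
            else
                dst[i] = 0;   /* flat image: every sample maps to the same level */
        }
    }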

i guess from the posts that this is a monitor problem
It's not really a monitor problem, but a RAMDAC problem. See if setting up an RGB10_A2 framebuffer helps. You get 2 more bits per color channel.
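
If it helps, asking for such a framebuffer on a GLX 1.3 system could look roughly like the sketch below. pick_rgb10_a2_config is just an illustrative name, there is no guarantee the driver actually exposes a 10-10-10-2 config, and on Windows the equivalent would go through WGL pixel format selection instead.

    /* Sketch: request a 10-10-10-2 framebuffer config via GLX 1.3.
     * Always check the returned list; support is hardware/driver dependent. */
    #include <GL/glx.h>

    GLXFBConfig pick_rgb10_a2_config(Display *dpy, int screen)
    {
        static const int attribs[] = {
            GLX_RENDER_TYPE,   GLX_RGBA_BIT,
            GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
            GLX_DOUBLEBUFFER,  True,
            GLX_RED_SIZE,      10,
            GLX_GREEN_SIZE,    10,
            GLX_BLUE_SIZE,     10,
            GLX_ALPHA_SIZE,    2,
            None
        };
        int count = 0;
        GLXFBConfig *configs = glXChooseFBConfig(dpy, screen, attribs, &count);
        GLXFBConfig chosen = (count > 0) ? configs[0] : NULL;

        if (configs)
            XFree(configs);
        return chosen;
    }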

Otherwise, I can suggest some creative dithering, either by changing the texture itself, or with a fragment program.
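
"Changing the texture itself" could mean something like the sketch below: an ordered (Bayer 4x4) dither of the 16 bit data down to 8 bits before upload, which trades a little noise for the lost precision. It is only an illustration and dither_16_to_8 is a made-up helper; an error diffusion dither would work just as well.

    /* Sketch: ordered (Bayer 4x4) dithering of 16 bit samples down to 8 bits,
     * applied to the texture data itself before upload. */
    #include <stdint.h>

    static const int bayer4[4][4] = {
        {  0,  8,  2, 10 },
        { 12,  4, 14,  6 },
        {  3, 11,  1,  9 },
        { 15,  7, 13,  5 },
    };

    void dither_16_to_8(const uint16_t *src, uint8_t *dst, int width, int height)
    {
        int x, y;

        for (y = 0; y < height; ++y) {
            for (x = 0; x < width; ++x) {
                /* the 16 bit to 8 bit scale factor is 65535 / 255 = 257 */
                uint32_t v    = src[y * width + x];
                uint32_t base = v / 257;
                uint32_t frac = v % 257;   /* remainder in 0..256 */
                /* promote to the next 8 bit level when the remainder exceeds
                 * the position-dependent Bayer threshold */
                uint32_t step = (uint32_t)(bayer4[y & 3][x & 3] + 1) * 257 / 17;
                uint32_t out  = base + (frac > step ? 1 : 0);

                dst[y * width + x] = (uint8_t)(out > 255 ? 255 : out);
            }
        }
    }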

I owe you guys an apology, especially Klaus and Harsman.

I have been in a pretty fooked up mood lately (it's a girlfriend thingy) and I was unreasonable and took it out on you instead of blaming myself as I should have.

This is of course no excuse for my extremely rude behavior, but I do realize it was my mistake.

I am real sorry guys.

Regards,
LG

Let’s hit a few basic points here:

The RAMDACs on high-end modern PC cards have at least 10 bit precision, and they often need it for things like quality gamma correction.

Your monitor (a CRT) is an analog device; it can support better than 8 bits, and you could probably see this. Exactly how much you can see and under what conditions is an open question, but it depends on factors like scene content, viewing conditions, monitor gamma response and any applied gamma correction.

AFAIK there are limits to what any Windows system desktop frontbuffer can display, especially with a render-to-window, copy-on-swap approach. I think you're basically limited to the display depth of the desktop, unless they've got a more sophisticated MUX scheme, which I think is limited to workstations.

This is a changing situation which seems to get updated every time you look at a new graphics card.

So if you want high precision you probably have to read the backbuffer before the swap. If you want more out of the frontbuffer, your best bet is probably to go to fullscreen rendering and keep your fingers crossed, but that may not do it; it's just my best guess.
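
To make the read-back concrete, a minimal sketch (assuming an ordinary double-buffered context) is below. read_back_buffer is a made-up name; asking for GL_UNSIGNED_SHORT (or GL_FLOAT) only preserves extra precision if the backbuffer actually stores more than 8 bits per channel, otherwise the driver simply expands the data.

    /* Sketch: grab the backbuffer before SwapBuffers so any extra framebuffer
     * precision is not lost on the way to the desktop/frontbuffer. */
    #include <GL/gl.h>
    #include <stdlib.h>

    unsigned short *read_back_buffer(int width, int height)
    {
        unsigned short *pixels =
            malloc((size_t)width * height * 3 * sizeof *pixels);

        glReadBuffer(GL_BACK);
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_SHORT, pixels);
        /* ...use or save 'pixels' here, then call SwapBuffers()... */
        return pixels;
    }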

If any of this is out of date or needs correction I’d very much appreciate an update.

I just want to add that you need at least 16 bits of framebuffer precision if you want a framebuffer in linear color space (de-gamma'd), so that alpha blending finally works as expected.
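
As a rough illustration of why: blending 50% white over black directly on gamma-encoded values gives 0.50, while doing the same blend in linear light and re-encoding gives about 0.73, and storing linear values in only 8 bits would band badly in the darks. The toy sketch below assumes a simple 2.2 power curve as a stand-in for the monitor response.

    /* Sketch: the same 50% blend done in gamma space and in linear light.
     * A plain 2.2 power curve stands in for the monitor response. */
    #include <math.h>
    #include <stdio.h>

    static double to_linear(double c) { return pow(c, 2.2); }
    static double to_gamma(double c)  { return pow(c, 1.0 / 2.2); }

    int main(void)
    {
        double src = 1.0, dst = 0.0, alpha = 0.5;   /* 50% white over black */

        double gamma_blend  = alpha * src + (1.0 - alpha) * dst;
        double linear_blend = to_gamma(alpha * to_linear(src) +
                                       (1.0 - alpha) * to_linear(dst));

        printf("blend in gamma space:  %.2f\n", gamma_blend);   /* 0.50  */
        printf("blend in linear space: %.2f\n", linear_blend);  /* ~0.73 */
        return 0;
    }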