Visualization Problem in OpenGL

Okay, I have been beating around the bush with questions that are all fundamentally related to this issue.

When I render a mesh (a collection of triangles) in OpenGL with one color, say blue, the resulting shaded version (Phong specular shading) is limited to a 256-color shaded gradient.

Normals are fine. All settings fine.

This results in ugly banding effects. So far, from the forum info and Google searches, 128-bit color is not an option. Choosing an all-blue color palette for index coloring is also not an option.

Is it possible to render, using only one color, with access to the full 32-bit range, so I can get better results? I tried interpolating colors (a false-color display), but finding the crossover region is trial and error.

Any suggestions? Thanks.

Well, first, try to be more precise about what you’re saying.
Second, I’m not sure this is the right forum.
Third, try not to use index colors, but normal RGB(A) colors, with values between 0 and 1.

If I’m totally wrong, please be more explicit. We’re not all English speakers :slight_smile:

It may not be an option for you, but you could try using SetDeviceGammaRamp to concentrate precision in the range of intensities where you’re getting the worst banding.
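Roughly along these lines, as a minimal sketch assuming a Win32 HDC for your window (or the screen DC); the helper name and the power-curve mapping are only for illustration, and drivers are free to reject or clamp a ramp:

[code]
/* Minimal sketch: reshape the gamma ramp so more output precision lands
   where the banding is worst. The helper name and the power-curve mapping
   are illustrative only; drivers may clamp or refuse unusual ramps. */
#include <windows.h>
#include <math.h>

BOOL apply_gamma_ramp(HDC hdc, double exponent)
{
    WORD ramp[3][256];
    for (int i = 0; i < 256; ++i)
    {
        /* Map the 8-bit index through a power curve into the 16-bit ramp range. */
        double v = pow(i / 255.0, exponent) * 65535.0 + 0.5;
        if (v > 65535.0) v = 65535.0;
        ramp[0][i] = (WORD)v;  /* red   */
        ramp[1][i] = (WORD)v;  /* green */
        ramp[2][i] = (WORD)v;  /* blue  */
    }
    /* Returns FALSE if the driver/hardware refuses the ramp. */
    return SetDeviceGammaRamp(hdc, ramp);
}
[/code]

Remember to save the original ramp with GetDeviceGammaRamp first and restore it on exit, since the change affects the whole display.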

Jide, I believe I was quite precise about my problem. If you set a single color, then set your normals and enable standard OpenGL shading (Phong specular shading), there is a color banding effect, because you only have a total of 256 colors in the blue range. I never said I used color index mode; I simply mentioned it in relation to color palettes.

Yes, Mikef, that was the kind of function I was looking for. I will give it a try; I am not sure if my card supports it. If anyone else has any suggestions, please let me know. Thanks.

there is a color banding effect, because you only have a total of 256 colors in the blue range.
There shouldn’t be any noticeable banding; perhaps your window is set to 16-bit color (query the window, don’t assume you get what you ask for).
Enable dither, and perhaps try green instead of blue, as the eye is more receptive to it (though this will make banding more apparent, wouldn’t it, zed).
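Something along these lines should cover both points — a rough sketch, assuming a valid OpenGL context is current:

[code]
/* Sketch: report the bit depth the context actually gave you, and make sure
   dithering is on (it is on by default, but may have been disabled). */
#include <GL/gl.h>
#include <stdio.h>

void check_framebuffer_depth(void)
{
    GLint r, g, b, a;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    glGetIntegerv(GL_ALPHA_BITS, &a);
    printf("framebuffer bits: R%d G%d B%d A%d\n", (int)r, (int)g, (int)b, (int)a);

    glEnable(GL_DITHER);
}
[/code]

If it reports something like R5 G6 B5, the window ended up in 16-bit color and banding is guaranteed.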

There shouldn’t be any noticeable banding
Having only 8 bits available/channel for output, (almost?) anyone can easily see transitions.

Referring to the recent discussion about LCDs, it would seemingly be even worse using an LCD if they indeed are unable to display more than 6 bits of shades. That would certainly produce very visible banding.

To be able to display a linear gradient (of single-channel values) from 0.0-1.0 without any visible banding, I suspect a 10-bit framebuffer is required.

Having only 8 bits available/channel for output, (almost?) anyone can easily see transitions.
Wanna bet? Go to a paint program, draw a large square of 0,0,255 and right beside it one of 0,0,254. It’s very unlikely you can tell which side is darker; you may see a stripe down the middle where the two colors join, but this is most likely a Mach band.

What about the normals: do you use the same normal for all vertices of a triangle? If you want smooth shading, you have to compute the normal of a vertex as the average of the normals of all adjacent triangles.
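Something like this, as a rough sketch (the Vec3 type and the flat index array are made up for illustration):

[code]
/* Sketch: per-vertex normals computed as the normalized sum of the face
   normals of all triangles sharing the vertex. Vec3 and the index layout
   (three indices per triangle) are hypothetical. */
#include <math.h>

typedef struct { float x, y, z; } Vec3;

static Vec3 v_sub(Vec3 a, Vec3 b)  { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
static Vec3 v_cross(Vec3 a, Vec3 b)
{
    Vec3 r = { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
    return r;
}

void compute_vertex_normals(const Vec3 *pos, int num_verts,
                            const int *tri, int num_tris, Vec3 *normal)
{
    for (int v = 0; v < num_verts; ++v)
        normal[v].x = normal[v].y = normal[v].z = 0.0f;

    /* Accumulate each triangle's face normal on its three vertices. */
    for (int t = 0; t < num_tris; ++t) {
        int i0 = tri[3 * t], i1 = tri[3 * t + 1], i2 = tri[3 * t + 2];
        Vec3 n = v_cross(v_sub(pos[i1], pos[i0]), v_sub(pos[i2], pos[i0]));
        normal[i0].x += n.x; normal[i0].y += n.y; normal[i0].z += n.z;
        normal[i1].x += n.x; normal[i1].y += n.y; normal[i1].z += n.z;
        normal[i2].x += n.x; normal[i2].y += n.y; normal[i2].z += n.z;
    }

    /* Normalize the accumulated sums before handing them to glNormal. */
    for (int v = 0; v < num_verts; ++v) {
        float len = sqrtf(normal[v].x * normal[v].x +
                          normal[v].y * normal[v].y +
                          normal[v].z * normal[v].z);
        if (len > 0.0f) {
            normal[v].x /= len;
            normal[v].y /= len;
            normal[v].z /= len;
        }
    }
}
[/code]

If the model is scaled in the modelview matrix, also enable GL_NORMALIZE so the lighting stays correct.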

Originally posted by zed:
[quote]Having only 8 bits available/channel for output, (almost?) anyone can easily see transitions.
Wanna bet? Go to a paint program, draw a large square of 0,0,255 and right beside it one of 0,0,254. It’s very unlikely you can tell which side is darker; you may see a stripe down the middle where the two colors join, but this is most likely a Mach band.
[/QUOTE]And that would not be a visible artifact, only further backing up my statement that 8 bits is clearly not enough? :slight_smile:

I know a gamedev.net article isn’t scientific proof, but I think it’s an easily digestible piece:
http://www.gamedev.net/reference/articles/article2208.asp

Still, I stand behind my statement that most humans can see transition points when using only 256 intensity levels. You might not see a difference between 254 and 255, or 0 and 1; it all depends on way too many variables to list here. But with normal vision a human will easily spot a number of transition points with only 8 bits/channel (even using just a reasonably good CRT).

Granted, for many (probably the overwhelming majority of) applications this might not be visible, due to e.g. mixing with other channels, output devices that blur, bleed or otherwise have a limited range, and so on; but for some applications I can easily see that 8 bits are not enough. Perhaps it should also be kept in mind that even if the vast majority only use a CRT or LCD, there are other output devices.

I just got some ideas regarding this. What follows is a bit of brainstorming, and as such possibly even more off-topic with regard to OpenGL.

I wonder how such higher-resolution frame and texture storage should look, to provide the depth without wasting excessive amounts of memory.

If we assume 10 bits would be enough, then clearly a 32-bit storage unit for each pixel would not be sufficient, as it would only leave 2 bits for alpha. On the other hand, the next currently reasonable transition point would be 48 bits, or in the worst case 64 bits. I think 18 or 34 bits of alpha would be a bit excessive. :slight_smile:

Could, perhaps even should, such future OpenGL servers use a more unconventional layout? Like 40 bits/pixel? Perhaps even use 12 bits/channel, to make it a nice and even 48 bits? Could perhaps 4*10 be enough, leaving 8 bits for more interesting usage we have yet to consider or even imagine?
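Just to make the 32-bit case above concrete, here is a tiny sketch of how a 10+10+10+2 layout packs into one 32-bit word (purely illustrative; not tied to any particular OpenGL format):

[code]
/* Sketch: three 10-bit channels plus 2 bits of alpha in a single 32-bit word. */
#include <stdint.h>

uint32_t pack_rgb10_a2(uint32_t r, uint32_t g, uint32_t b, uint32_t a)
{
    /* r, g, b in 0..1023; a in 0..3 */
    return (r & 0x3FFu) | ((g & 0x3FFu) << 10) | ((b & 0x3FFu) << 20) | ((a & 0x3u) << 30);
}

void unpack_rgb10_a2(uint32_t p, uint32_t *r, uint32_t *g, uint32_t *b, uint32_t *a)
{
    *r = p & 0x3FFu;
    *g = (p >> 10) & 0x3FFu;
    *b = (p >> 20) & 0x3FFu;
    *a = (p >> 30) & 0x3u;
}
[/code]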

Hm, I was doing such a test (unicolor shading) with a sphere about a year ago and I didn’t notice very much banding. I mean, it will always be there till we get better pixel depth, but it shouldn’t be so visible.
Are you using fragment programs? Which card/driver are you using? It also depends on driver settings…

When the XP user selection dialog comes up (at startup), there is a blue highlight image in the upper-left corner. I remember a lot of banding in this image on my old computer with a CRT monitor and a GeForce FX. However, my new setup (a cheap TFT and a GeForce 6600GT) displays the highlight smoothly.

turn the toon shader off :rolleyes:

I figure 8 bits is as good as it will ever get for strict monitor output, unless we humans develop artificial eyes that can see beyond that. The truth is we can’t even get close to seeing 8 bits, so don’t kid yourself.

Different CRT monitors are even more limited by their guns, so you might not get the actual colour you think you want. LCDs seem more consistent in this respect. But I find it impossible to believe anyone can see beyond 8 bits of colour precision.

The only reason I could see for going up to 16 bits would be so you have a large range to play with effects like gamma adjustment, contrast, colour and so on without cut-offs at the boundaries.

A unified architecture might eventually just go for a 32-bit floating-point representation or something, just for the hell of being able to quickly mix pixel effects with other effects.

But as far as human eyes are concerned, 8 bits is the limit. I have read this in many places as well.

Just for the sake of the original author’s intent, I think a screenshot would help. If your model doesn’t look like typical modern video game output, then there is probably just something quirky going on. There is no such thing as 128-bit colour as far as seeing is concerned.

If you see banding, it might be the LCD monitor’s fault, or you have your OS or context set to 16-bit colour.

Originally posted by michagl:
The truth is we can’t even get close to seeing 8 bits, so don’t kid yourself.

This is not true; the human eye is much more sensitive than that. That you can’t see any banding in a gradient displayed on a particular CRT monitor says more about the monitor than about your eye.

If I create a gradient of pure 8-bit green on my high-quality monitor, I can easily distinguish every single field of it.
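For anyone who wants to reproduce that test, a quick sketch in immediate-mode OpenGL that draws one vertical band per 8-bit green level (assumes a current context and a glOrtho(0, 256, 0, 1, -1, 1) projection):

[code]
/* Sketch: 256 vertical bands of pure green, one per 8-bit intensity level.
   With 8 bits/channel output, any visible steps between bands are banding. */
#include <GL/gl.h>

void draw_green_gradient(void)
{
    glDisable(GL_LIGHTING);
    glBegin(GL_QUADS);
    for (int i = 0; i < 256; ++i) {
        glColor3f(0.0f, i / 255.0f, 0.0f);
        glVertex2f((float)i,       0.0f);
        glVertex2f((float)(i + 1), 0.0f);
        glVertex2f((float)(i + 1), 1.0f);
        glVertex2f((float)i,       1.0f);
    }
    glEnd();
}
[/code]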

Read more about the dynamic range of the eye here:

http://clarkvision.com/imagedetail/eye-resolution.html

/A.B.

Originally posted by brinck:
[quote]This is not true; the human eye is much more sensitive than that. That you can’t see any banding in a gradient displayed on a particular CRT monitor says more about the monitor than about your eye.
[/QUOTE]I will look into it because I’m interested.

But I tried eyeing some gradients. On the LCD screen of an old 300 MHz portable, the distinction was very present; I presume it is using 6 bits or less for its output. Another reason I don’t intend to get an LCD screen anytime soon for a workstation.

On three different CRT monitors, which range from very bright to very dark, I can discern a grey scale within a maximum range of about 3 shades. There is an oscillating banding effect which I imagine might be the cause of this thread.

For green on a CRT, however, I could go like 20 shades without being able to make any distinction, especially in the very green and very light regions. The middle was easier to pick out, but still impossible for up to 10 different shades. The same held for all of my CRTs. This could to some degree very well be an effect of the CRT. I’ve noticed, when using a regularly spaced rainbow spectrum for debugging purposes and whatnot, that some of the colours are much more discernible from the next.

Still, I’m overwhelmed by the smoothness of a CRT monitor. This might have a lot to do with the light of individual pixels bleeding over into one another.

It looks awesome to me, though. And I’m an artist, so I appreciate colours.

The function of the eye, though, is all chemical… so naturally there are blind spots in its spectrum. Maybe in the future people will have cybernetic eyes and whine about monitors with only 256-bit resolution… assuming the feed isn’t pumped directly into the eye.

Thanks for the link, though; I will give it an eyeballing.

As for max, I have a feeling the banding you are seeing has nothing at all to do with OpenGL, and is just your monitor’s doing. For the foreseeable future, 24-bit colour ought to suit you fine… that is, 8 bits per RGB channel, also sometimes called 32-bit colour once the alpha channel is counted.

The 128-bit pixel buffers are really just for doing crazy shader programming stuff, and are brand new as far as I know.

I guess the best solution is for maximian to post a screenshot.

If I create a gradient of pure 8-bit green on my high-quality monitor, I can easily distinguish every single field of it.
The eye is more sensitive to green than blue.

Anyway, can you trust your eyes? The center colors of the following two crosses are the same:
http://www.uq.edu.au/nuq/jack/Colorcross1.jpg