Matrox Parhelia GigaColor

Hi,

I was wondering if anyone out there could help me with some information on Matrox’s GigaColor technology.

I was wondering how the 10:10:10:2 frame buffer is exposed. I have briefly tried the board under Linux, but no 10:10:10:2 visual is exposed there.

How is the GigaColor technology exposed under Windows? I would suspect something like a 10:10:10:2 frame buffer scanned out through a 10-bit RAMDAC. If that is the case, how does it improve image quality when using a DVI link (which is 8 bits per channel)? Is Matrox using temporal modulation to scan out the frame buffer?

– Niels

ATI supports it, too, but I have yet to try and use it. From what I’ve heard, though, it only works in fullscreen modes.

I have also found some information on the ATI support, but I’m really looking for a small example of how to use the 10-bit frame buffer from OpenGL. I’m especially interested in how to do this from Linux.
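
Something along these lines is what I mean by a small example (only a sketch, using standard GLX 1.3 calls, and assuming the driver exposes a matching FBConfig at all, which mine apparently does not):

[code]
/* Minimal GLX 1.3 probe for a 10:10:10:2 framebuffer config.
   Only a sketch; whether any such config exists is entirely up to the driver. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    const int attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_DOUBLEBUFFER,  True,
        GLX_RED_SIZE,      10,
        GLX_GREEN_SIZE,    10,
        GLX_BLUE_SIZE,     10,
        GLX_ALPHA_SIZE,    2,
        None
    };

    int count = 0;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);
    if (!configs || count == 0) {
        printf("No 10:10:10:2 FBConfig exposed by this driver.\n");
    } else {
        int r, g, b, a;
        glXGetFBConfigAttrib(dpy, configs[0], GLX_RED_SIZE,   &r);
        glXGetFBConfigAttrib(dpy, configs[0], GLX_GREEN_SIZE, &g);
        glXGetFBConfigAttrib(dpy, configs[0], GLX_BLUE_SIZE,  &b);
        glXGetFBConfigAttrib(dpy, configs[0], GLX_ALPHA_SIZE, &a);
        printf("Found config with R%d G%d B%d A%d\n", r, g, b, a);
        XFree(configs);
    }
    XCloseDisplay(dpy);
    return 0;
}
[/code]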

The other issue I have with the GigaColor technology is how the frame buffer is transmitted over DVI. As far as I can tell, you cannot scan out a 10-bit frame buffer over a single-link DVI connection. The DVI spec does, however, mention support for higher color depths when using dual-link DVI. Higher color depth could also be implemented on a single DVI link using temporal frame modulation. Does anyone have any information on how this is done?

– Niels


Just popping the question back on top:

There must be someone out there with a Parhelia board.

Dorbie, do you have any information on how the 10-bit frame buffer is scanned out across the DVI link?

– Niels

bump

Can anybody tell me how this can actually be used for fullscreen OpenGL on Win32?

ChangeDisplaySettings’ DEVMODE structure only has a single unified count for color bits after all. How can I distinguish 8-8-8-8 modes from 10-10-10-2 modes when they both sum up to 32?
Or is this just incompatible with plain CDS mode switching?
Just for the record, I can see these modes listed for fullscreen in various DirectX Graphics based software (3DMark et al).
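
The best I have come up with is to sidestep CDS for the format selection and ask WGL_ARB_pixel_format for the channel depths explicitly. A sketch only, and it assumes the driver actually advertises such formats to OpenGL (which I have not been able to verify):

[code]
/* Sketch: request a 10-10-10-2 pixel format via WGL_ARB_pixel_format.
   Assumes a dummy GL context is already current so that wglGetProcAddress()
   returns valid entry points. Whether any matching format is reported is,
   of course, up to the driver. */
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_*_ARB tokens and the function pointer typedef */

int find_10bit_format(HDC hdc)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return 0;                      /* extension not available */

    const int iattribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, TRUE,
        WGL_SUPPORT_OPENGL_ARB, TRUE,
        WGL_DOUBLE_BUFFER_ARB,  TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,       10,
        WGL_GREEN_BITS_ARB,     10,
        WGL_BLUE_BITS_ARB,      10,
        WGL_ALPHA_BITS_ARB,     2,
        0
    };
    const FLOAT fattribs[] = { 0 };

    int  format = 0;
    UINT count  = 0;
    if (wglChoosePixelFormatARB(hdc, iattribs, fattribs, 1, &format, &count) && count > 0)
        return format;                 /* pass this to SetPixelFormat() */
    return 0;                          /* no 10-10-10-2 format exposed */
}
[/code]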

S3 Graphics is supporting this, too. I’d really like to try it out.

i’m confused? so there are monitors now that do 10bit colour modes? what is the last number by the way… overlay alpha blending?

btw, just curious whether overlays are supported well on consumer-class machines these days? that is, so you could just swap a GUI overlay’s buffer without touching the primary display buffer, set up OpenGL contexts for different pixel formats, etc.?

Originally posted by michagl:
i’m confused? so there are monitors now that do 10bit colour modes?
Not “now”. They’re just dying out. VGA is an analog connection and CRTs are analog devices. They don’t care about bits.

Originally posted by michagl:
what is the last number by the way… overlay alpha blending?
It’s destination alpha. This is about framebuffer formats.

Originally posted by zeckensack:
[quote]Not “now”. They’re just dying out. VGA is an analog connection and CRTs are analog devices. They don’t care about bits.
[/QUOTE]So true. Nowadays everybody goes for these shiny LCDs, most of the time having only 6 bits per color channel, and only in the best cases of brightness/contrast…

Originally posted by ZbuffeR:
[quote]So true. Nowadays everybody goes for these shiny LCDs, most of the time having only 6 bits per color channel, and only in the best cases of brightness/contrast…
[/QUOTE]i’m more confused now.

what is meant by “dying out”? yeah crts are analog, but drivers must decode some memory to get the outgoing signal… and i’ve never heard of a greater than 8bit per channel encoding. and i believe the actual colours realizable by crt don’t really approach 8bits… or maybe its the human eye which does not approach 8bits?

but lcds don’t seem much better picture-wise… though it seems like their colour is more dependable across various devices. the lcd screens seem to have limited resolution modes, and smaller resolutions in general. the bigger models don’t seem to be much lighter than an equivalent crt. though my portable weighs in at 1.5 lbs in all, so i really don’t understand why the larger lcd monitors seem to be so heavy.

the dvi is a digital signal, which i figure is compressible, at least for streaming media like dvd movies and feeds. is there any equivalent for graphics cards compressing their output, or is that unreasonable?

so if 10bit encoding is ‘now’ dieing out? and common lcd screens do 6bits? i really don’t get it.

it sounds like gigacolor would up the colour resolution at the expense of alpha compositing. if the human eye can’t do 8bits, then what is the use of 10bits? unless maybe it gives you better control over adjusting brightness etc.

edit: ok, i think i get it now… dying out vs. dieing out. am i reading this right? still what good is 10bits really?

Originally posted by michagl:
[b]i’m more confused now.

what is meant by “dying out”?[/b]
I meant that more and more people are buying flat panel displays now and that CRTs are no longer “the” display technology.

yeah crts are analog, but drivers must decode some memory to get the outgoing signal…
Certainly not the drivers. The hardware does it. Wiki up “RAMDAC” if you’re curious. Due to the requirements for proper gamma correction, RAMDACs are regularly more precise than 8 bits per channel, and have been so for a while already, even if the supported framebuffer formats are not.
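
To put a rough number on why, here’s a quick back-of-the-envelope sketch (gamma 2.2 is just an example value, nothing Parhelia-specific): a gamma LUT that has to round its output to 8 bits collapses distinct input codes, while a 10-bit output keeps them apart.

[code]
/* Count how many distinct output codes survive a gamma-2.2 LUT when its
   output is rounded to 8 bits versus 10 bits. Pure arithmetic. */
#include <stdio.h>
#include <math.h>

static int count_distinct(int out_bits)
{
    int max_out  = (1 << out_bits) - 1;
    int distinct = 0, prev = -1;

    for (int in = 0; in < 256; ++in) {                  /* 8-bit framebuffer values  */
        double corrected = pow(in / 255.0, 1.0 / 2.2);  /* gamma-correct             */
        int out = (int)(corrected * max_out + 0.5);     /* quantize to LUT out width */
        if (out != prev) { ++distinct; prev = out; }
    }
    return distinct;
}

int main(void)
{
    printf("distinct output levels,  8-bit LUT output: %d\n", count_distinct(8));
    printf("distinct output levels, 10-bit LUT output: %d\n", count_distinct(10));
    return 0;
}
[/code]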

and i’ve never heard of a greater than 8bit per channel encoding.
Well, what Matrox calls Gigacolor is one such encoding, and that’s what this thread was and is about.
Other vendors support the same.

and i believe the actual colours realizable by crt don’t really approach 8bits… or maybe its the human eye which does not approach 8bits?
They do, and no.

But even if you couldn’t display more bits or see more bits, there would still be a use for more framebuffer precision. Frequent blending, for example, because it accumulates rounding errors.
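
A trivial sketch of what I mean (plain arithmetic, no OpenGL involved, and the numbers are purely illustrative): composite many dim, translucent layers over black, rounding the intermediate result to the framebuffer’s precision after every blend.

[code]
#include <stdio.h>

int main(void)
{
    const double src    = 100.0;       /* layer colour, on a 0..255 scale */
    const double alpha  = 1.0 / 256.0; /* very translucent layer          */
    const int    layers = 500;

    double exact = 0.0;   /* full-precision accumulation   */
    int    fb8   = 0;     /* 8-bit framebuffer, 0..255     */
    int    fb10  = 0;     /* 10-bit framebuffer, 0..1023   */

    for (int i = 0; i < layers; ++i) {
        exact = alpha * src + (1.0 - alpha) * exact;
        fb8   = (int)(alpha * src                    + (1.0 - alpha) * fb8  + 0.5);
        fb10  = (int)(alpha * src * (1023.0 / 255.0) + (1.0 - alpha) * fb10 + 0.5);
    }

    printf("float accumulation : %6.2f\n", exact);                  /* approaches 100 */
    printf("10-bit framebuffer : %6.2f\n", fb10 * 255.0 / 1023.0);  /* drifts, but usable */
    printf("8-bit framebuffer  : %6d\n",   fb8);                    /* stuck at 0: each step rounds away */
    return 0;
}
[/code]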

And all the fancy modern “HDR” and tone mapping techniques of course.

edit:

Originally posted by michagl:
edit: ok, i think i get it now… dying out vs. dieing out. am i reading this right?
Heh, I think so. As in “approaching death”, not “getting colorful”. Sorry, I’m not a native English speaker.

still what good is 10bits really?
It’s more than 8 bits!!!1 :frowning:

Originally posted by zeckensack:

[quote]Originally posted by michagl:
edit: ok, i think i get it now… dying out vs. dieing out. am i reading this right?
Heh, I think so. As in “approaching death”, not “getting colorful”. Sorry, I’m not a native English speaker.
[/QUOTE]Actually, “dying” is the correct spelling (and it’s not an American English vs. British English issue).

/A.B.

Just noticed someone asked me specifically for info on this. I don’t have any specifics, and I don’t know why anyone would expect me to either :-).

In general, 10-bit framebuffers might help even with an 8-bit display interface limitation, because there’s the possibility of applying gamma correction and other color space conversions before you drop to the lower precision, preserving better precision within the available 8-bit range. It has the potential to help even with a digital interface, but any 8-bit limit may still hurt. 10-bit to analog would be better in this regard if the RAMDACs etc. were up to it, always assuming your gamma LUT has sufficient output precision, especially at the low end (in both cases).
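
To put a rough number on that (a quick sketch; gamma 2.2 is picked arbitrarily and none of this is specific to any particular card): doing the gamma step at 10-bit precision and then dropping to 8 bits stays within half a level of the ideal result, while dropping to 8 bits first loses a lot at the dark end.

[code]
/* For every 10-bit source value, gamma-correct it (gamma 2.2) and send the
   result over an 8-bit link, either (a) gamma first at full precision, then
   quantize to 8 bits, or (b) quantize to 8 bits first, then gamma-correct.
   Errors are measured against the unquantized result, in 8-bit steps. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double worst_a = 0.0, worst_b = 0.0;

    for (int v10 = 0; v10 <= 1023; ++v10) {
        double ref = pow(v10 / 1023.0, 1.0 / 2.2) * 255.0;        /* ideal output, 8-bit scale */

        int a = (int)(ref + 0.5);                                 /* (a) gamma at 10 bits, then 8-bit link */

        int v8 = (int)(v10 * 255.0 / 1023.0 + 0.5);               /* (b) crush to 8 bits first...          */
        int b  = (int)(pow(v8 / 255.0, 1.0 / 2.2) * 255.0 + 0.5); /*     ...then gamma-correct             */

        if (fabs(a - ref) > worst_a) worst_a = fabs(a - ref);
        if (fabs(b - ref) > worst_b) worst_b = fabs(b - ref);
    }

    printf("worst error, gamma before the 8-bit step: %.2f levels\n", worst_a);  /* about half a level */
    printf("worst error, gamma after the 8-bit step : %.2f levels\n", worst_b);  /* many levels, at the dark end */
    return 0;
}
[/code]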

Aside from this there are all sorts of high dynamic range rendering approaches, and with them comes the concept of a transfer function as part of an “exposure” stage, including filters (faked or otherwise). This often happens before the backbuffer or texture buffer is displayed to the frontbuffer, which on Windows can have limitations of its own with non-muxed contiguous desktop display memory of limited precision. I assume there’s some way to build hardware to mux individual windows’ buffers, or you just go full screen. There’s a lot of stuff workstations took for granted that has been the dirty little unmentioned secret of a lot of compromised Windows rendering implementations for years.
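
For what such an “exposure” transfer function might look like in the simplest case, here is a sketch; the exponential operator is just one arbitrary choice for illustration, not something any particular hardware or paper prescribes.

[code]
/* Map a high dynamic range value down to an 8-bit display value using a
   simple exponential exposure curve, then gamma-correct for display. */
#include <stdio.h>
#include <math.h>

/* hdr: scene value (any non-negative float); exposure: user or auto control */
static unsigned char expose(float hdr, float exposure)
{
    float mapped = 1.0f - expf(-hdr * exposure);   /* compress [0, inf) into [0, 1) */
    float gamma  = powf(mapped, 1.0f / 2.2f);      /* gamma-correct for the display */
    return (unsigned char)(gamma * 255.0f + 0.5f);
}

int main(void)
{
    const float values[] = { 0.01f, 0.1f, 1.0f, 4.0f, 16.0f, 64.0f };
    for (int i = 0; i < 6; ++i)
        printf("hdr %6.2f -> display %3d\n", values[i], expose(values[i], 1.0f));
    return 0;
}
[/code]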