Getting a texture map of the Windows Desktop

I’d like to somehow have an interactive copy of the Windows desktop running on a polygon in my OpenGL-generated world. I’m not sure it’s even remotely doable, but can anyone think of a reasonable way to pull it off?

Peculiar thing to want to do… :slight_smile:

You can certainly get a device context for the desktop, copy it into a texture and map it onto your poly. But “interactive”? Desktops are big, and if the dimensions aren’t a power of 2 (e.g. 800x600) then you’ll have to scale the image or fudge the texcoords. Reading, creating and uploading the desktop texture every frame is going to kill your framerate.
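Roughly what I mean, as a sketch (the 800x600 size, the texture id and the pixel buffer are just placeholders, and this assumes the desktop bits are already in RGB order):

int texW = 1024, texH = 1024;   // next powers of two above 800x600
float sMax = 800.0f / texW;     // right texcoord for the quad
float tMax = 600.0f / texH;     // top texcoord for the quad

glBindTexture(GL_TEXTURE_2D, desktopTex);

// allocate the power-of-two texture once, with no data
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, texW, texH, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);

// every frame: drop the 800x600 grab into the lower-left corner
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 800, 600, GL_RGB, GL_UNSIGNED_BYTE, desktopPixels);

Then texture the quad from (0,0) to (sMax,tMax) so only the desktop part of the texture shows.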

That’s kind of what I figured.

The reason I wanted to do it is that I’m doing some research into 3D interfaces. The current incarnation uses head tracked shutter glasses and a Sensable Technologies Phantom haptic device for input/force output. I’ve got it set up so that there is a hexahedral working volume in which the user builds simple 3D models. It’s like you’ve got a box, with the near side being the computer screen, that you can work in. To illustrate the concept of anchoring UI components to the “walls and floor” of the working volume, I texture mapped a screen grab of my desktop onto the lower quad. It just got me thinking how cool it would be to be able to actually have your windows desktop running inside this virtual workspace.

So there’s your long-winded explanation of why I was considering doing such a wacky thing.


Hi Rob,

I’ve played around with a similar idea. Check out: http://www.ac.com/services/cstar/cstar_publications.html

and look for:
Augmented Workspace: The World as your Desktop
Kelly L. Dempski
First International Symposium on Handheld and Ubiquitous Computing (HUC '99), 27-29 September 1999, Karlsruhe, Germany

The version mentioned in the paper grabbed shots from one screen of a multi-monitor setup and rendered them on the other. The killer was the fact that you had to move the image from one video card, over the bus, and to the second card. When I captured small Notepad windows, I got reasonable framerates; anything larger was too slow.

However, I have very good reason to believe the following will work:

On the SGI 540 visual workstation (which will be unavailable soon), you should be able to capture a full desktop (or several windows if you work on a per window basis), use them as textures in your scene, and feed them out the video out port as NTSC signals to your glasses (through a converter if need be). Of course, your final output will be limited to the NTSC (or PAL) resolution, but the framerates will be decent. That’s the best thing I can think of, because on the 540, you never leave the video subsystem. If you know of any other solution that could use two pathways without going over the bus, let me know.

I have tried to copy the desktop DC to a texture, but I can’t get it to work. Do you know how to do it?

How did you try it? Did you use GetDesktopWindow(), GetDC(), then BitBlt() to copy?
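In outline, something like this (just a sketch; width and height stand for the desktop dimensions):

HWND hDesktopWnd = GetDesktopWindow();
HDC hDesktopDC = GetDC(hDesktopWnd);               // DC covering the whole desktop
HDC hCaptureDC = CreateCompatibleDC(hDesktopDC);   // memory DC to copy into
HBITMAP hBmp = CreateCompatibleBitmap(hDesktopDC, width, height);
HGDIOBJ hOld = SelectObject(hCaptureDC, hBmp);
BitBlt(hCaptureDC, 0, 0, width, height, hDesktopDC, 0, 0, SRCCOPY);
SelectObject(hCaptureDC, hOld);
ReleaseDC(hDesktopWnd, hDesktopDC);
// hBmp now holds the desktop image; read its bits and hand them to glTexImage2D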

This is how I made it work. Is it possible to copy the bits without having to swap the red and blue values (because of the bitmap format)?

hScrDC = CreateDC("DISPLAY", 0, 0, 0);    // DC for the whole screen
hMemDC = CreateCompatibleDC(hScrDC);      // memory DC to copy into

// the bitmap must be compatible with the screen DC, not the fresh memory DC,
// otherwise you get a 1-bit-per-pixel bitmap
hBitmap = CreateCompatibleBitmap(hScrDC, width, height);

hGDIOBJ = SelectObject(hMemDC, hBitmap);
BitBlt(hMemDC, 0, 0, width, height, hScrDC, 0, 0, SRCCOPY);
hBitmap = (HBITMAP)SelectObject(hMemDC, hGDIOBJ);

// second argument is a byte count: 4 bytes per pixel on a 32-bit desktop
GetBitmapBits(hBitmap, width * height * 4, lpBits);

// swap blue and red values

glTexImage2D(..., lpBits);

thanks
radde


>>Is it possible to copy the bits without having to swap red and blue values

Yes, see here: http://oss.sgi.com/projects/ogl-sample/registry/EXT/bgra.txt
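With that extension (assuming your driver exports it and the desktop is 32 bits per pixel, so GetBitmapBits gives you BGRA-ordered bytes), you can hand the bits to GL as-is, something like:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, lpBits);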

When I grab the desktop image into a texture and render it on a polygon, I notice the number of colors is reduced.

I have a 24-bit color picture as the desktop background, but the texture map appears to have 16-bit colors.

Does OpenGL work with 16-bit colors internally (my drivers maybe?), or might something be wrong with the copying?

radde

Yes, that’s possible.

Check your control panel. I know there is a switch for this on Radeon, GeForce, and maybe others as well. There might also be a switch for automatic texture compression; try to disable that too.

And specify an explicit internalformat parameter in the glTexImage2D call, like GL_RGB8 or GL_RGBA8, instead of 3 or 4.
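For example (width, height and lpBits being whatever you already use):

// "give me some RGB format" - the driver may pick a 16-bit one
glTexImage2D(GL_TEXTURE_2D, 0, 3, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, lpBits);

// explicitly ask for 8 bits per channel
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, lpBits);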