I’m not sure whether this is an appropriate topic for this forum, but I recently added a graphics card (a Zotac GeForce GT 710) to my desktop computer (an HP P6230F), which already had integrated graphics on the motherboard (alongside an AMD Phenom II X4 810 processor) in a configuration that supports OpenCL. My expectation was that I could keep using the integrated graphics to drive the monitor and use the new card only to run programs I’d written in OpenCL, which don’t necessarily have anything to do with graphics. But after I installed the card, Windows no longer recognized the integrated graphics as a display adapter (it doesn’t show up under “Display adapters” in Device Manager), and I had to plug my monitor into the graphics card instead of the motherboard. Oddly, when I run GPU Caps Viewer, it still lists both devices as available.
Is there some way to force Windows to use the integrated graphics for video output instead of the graphics card? I looked for a BIOS setting to do this but couldn’t find one. My OpenCL programs still run, but more slowly than they did before I added the graphics card (when they ran on the motherboard’s integrated CPU/GPU). I realize the GT 710 isn’t very fast by today’s standards, but I didn’t expect it to actually run more slowly than the host did.