I am desperate for help!!!

I have no idea what happened. My friend and I were playing games on LAN all weekend, and his computer started having graphical distortion in all his games. I have completely redone his computer and it is still messed up. I swapped his GeForce 4 for my old GeForce 4 and everything worked fine. I assumed his card was fried.
Then my computer started doing it several days after we were off the network, but it only screws up my OpenGL games. What has happened? No Call of Duty, Jedi Outcast, Quake 3, or Doom 3!
I need help!

Some details about your system would sure help, but I think it's either hardware or driver related.

Did you install a new 3D or chipset driver lately?

It could also be that your 3D card is slowly dying.

I have a P4 2.4 GHz CPU, a SOYO motherboard, 1024 MB of memory, and a Gainward GeForce FX 5950 Ultra. My friend has the same mainboard and CPU, and we both recently installed IDE drivers that were supposed to improve performance. I just got rid of mine, but it is still messed up. I also uninstalled a few new programs and it got slightly better. I am currently backing up some irreplaceable files so I can do a fresh install. I have never had any problems like this before. I built this computer, as well as my friend’s, about 2 years ago. I have no idea what’s wrong other than that it has to be OpenGL or driver related.

If you are referring to the Intel Application (de-)Accelerator drivers, I don’t recommend them anymore. They only seem to improve performance for P4s based on the Willamette core; on Northwoods I saw the transfer rate drop from 40 MB/s to 20 MB/s after I installed them. Unfortunately, you can’t really uninstall them.

So you are saying that you uninstalled some programs and the situation improved? That sounds odd. The only explanation I have is that perhaps the software you uninstalled had background processes running that ate up some CPU power. Perhaps not much, but maybe enough to raise the temperature inside your system by a degree or two. I know that sounds a little far-fetched, but I still think this is a temperature/overclocking problem that will lead to a complete failure sooner or later.

I was beginning to think that it was an overclocking problem, but 1. my video card was made for overclocking, and 2. I only recently started doing it. My temperature has always been pretty stable, as have my voltages.

I completely restored my computer, and just like my friend’s it was still messed up. I also noticed many people in forums with this same problem. I wonder if it is patchable?

By the way, the accelerator drivers were from VIA, and apparently have no noticeable effect on anything.

Originally posted by <g-man>:
1. my video card was made for overclocking

Sorry, but I doubt that. Your card may come with better cooling, but that only compensates for the increased heat build-up. The reason the chip produces more heat is that it draws more current, and that is just as lethal as the heat; it just takes longer before you see stuff falling apart.

This isn’t the place to debate overclocking, but everyone who does it should be aware of the consequences.

Originally posted by <g-man>:
I wonder if it is patchable?
I still think it’s your hardware, so no.

Originally posted by <g-man>:
I was beginning to think that it was an overclocking problem, but 1. my video card was made for overclocking, and 2. I only recently started doing it. My temperature has always been pretty stable, as have my voltages.
Why do people with really nice systems TRASH THEM by overclocking?

I never did understand that… well, unless you’ve got the extra money :rolleyes:

You should take a screenshot of the distortion and put it on the web somewhere. “Distortion” can mean anything.

I know this is not the place to say this, but I’m taking some heat here so I’ll just say it. I got the cheaper version of my card that did not come overclocked, but it can do about the same speeds. I have tested it as stable at speeds a lot higher than others I’ve seen, but I never keep it like that for any games. I am actually very conservative with my clock settings. And I know that’s not the problem, because my friend stays at stock speeds and has the same game problems.

My friend and I both completely redid our hard drives and the problem was still there, so there was no way it was software related. When I first loaded up Windows, the first thing I did was put my old drivers back on. Still a problem, even though they’ve always worked before. It is not a driver problem.

It suddenly occurred to me that my friend and I have two different video cards with the same problem, and (here’s the kick in the head) the same motherboard. I restored all my video card settings in the BIOS to their defaults and then enabled them one by one until I got the distortions back.

It was a setting called DBI output for AGP. I have no idea what it does, but I had it enabled for a long time with no problems. I’d actually had it disabled for a while, but when I was at my friend’s house and went into my BIOS to enable my onboard LAN, he told me to enable it, and he did his too.

That’s the last time I listen to him.

Anyway, if anybody knows what that setting does, I’d be interested to know. I am also going to see if disabling it works on his computer.

DBI stands for Dynamic Bus Inversion and is a means to improve signal quality at AGP 8x speeds. Unfortunately, not every chipset maker follows the spec 100% (VIA in particular is known for that).

I don’t want to go into the details, but this is from the AGP 3.0 spec:

2.1.5 Dynamic Bus Inversion

In order to mitigate the effects of simultaneous switching outputs, AGP3.0 adopts a scheme called Dynamic Bus Inversion (DBI) to limit the maximum number of simultaneous transitions on source synchronous data transfers. DBI impacts only AD[31:0] and is used during source synchronous and common clock transfers. Two new signals are defined to support DBI. DBI_LO and DBI_HI are used to implement DBI on AD[15:0] and AD[31:16] respectively. The scheme used to implement DBI on source synchronous transfers is as follows:

Whenever the number of bit transitions in AD[15:0] (or AD[31:16]) from one source synchronous period to the next exceeds eight, the entire field is inverted by the transmitter in order to limit the maximum transitions to eight. For example, if AD[15:0] changes from FF10 (hex) in source synchronous cycle A to 0000 (hex) in source synchronous cycle B, the DBI scheme is triggered in cycle B, thus inverting the AD[15:0] to produce FFFF (hex). In this example, the number of transitions without DBI is nine, while with DBI is seven. In order to signal to the receiver that the AD[15:0] are inverted in cycle B, DBI_LO will be asserted high. The same scheme is used on AD[31:16]. DBI_HI is used to signal the inversion. The receiver samples DBI_HI and DBI_LO to determine whether to invert AD[31:0] before using it.

Contiguous (back-to-back) DBI-enabled data transfers must continue the DBI encoding without break. The only break that may occur in the DBI encoding happens when more than eight data bits are high in a strobe group at the end of a transfer and transition low (terminating low, not driving) during the following idle cycle. In this case there would be greater than 8 bits switching. This case can create more switching noise. While there is sufficient time during the idle cycle for that noise to settle, the system designer must be careful to avoid excessive crosstalk and reduced signal integrity on other signals.

A similar scheme applies to common clock and Frame based (PCI) address and data transfers. In these instances, DBI applies to transitions from one common clock period to the next.

Implementation of DBI is required in the transmitter and receiver for both the Master (graphics chip) and the Target (core logic) when operating in 8X speed and in AGP V3.0 signaling mode. When doing Frame based PCI transfers or 4X speed transfers in the same signaling mode, DBI is optional in the transmitter but still required in the receiver.

DBI is not supported when in AGP2.0 or AGP1.0 signaling modes. Table 8 illustrates the application of DBI in various modes of operation.
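
If it helps to picture the encoding rule, here is a rough C sketch of the transition-counting logic for one 16-bit strobe group. This is just my own software model of the rule quoted above (the function names are made up for illustration); real DBI happens in silicon on the AD lines, not in code.

/* Simplified software model of the AGP 3.0 DBI encoding rule for one
 * 16-bit strobe group (AD[15:0] or AD[31:16]). Illustration only. */
#include <stdint.h>
#include <stdio.h>

/* Count how many of the 16 AD lines would toggle between two cycles. */
static int count_transitions(uint16_t prev, uint16_t next)
{
    uint16_t toggled = prev ^ next;
    int n = 0;
    while (toggled) {
        n += toggled & 1;
        toggled >>= 1;
    }
    return n;
}

/* Encode one strobe group: if more than eight lines would switch relative
 * to what is currently on the bus, invert the whole field and assert the
 * corresponding DBI signal (DBI_LO or DBI_HI) so the receiver re-inverts. */
static uint16_t dbi_encode(uint16_t prev_on_bus, uint16_t next, int *dbi_flag)
{
    if (count_transitions(prev_on_bus, next) > 8) {
        *dbi_flag = 1;
        return (uint16_t)~next;
    }
    *dbi_flag = 0;
    return next;
}

int main(void)
{
    /* The spec's own example: FF10 -> 0000 would be nine transitions,
     * so the transmitter drives FFFF with DBI asserted instead. */
    int dbi;
    uint16_t sent = dbi_encode(0xFF10, 0x0000, &dbi);
    printf("sent %04X, DBI=%d, transitions=%d\n",
           sent, dbi, count_transitions(0xFF10, sent));
    return 0;
}

Run against the spec’s example, it reports that FFFF goes out with DBI asserted and only seven lines toggling instead of nine.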

Even if I ignore the fact that you lied to us by saying you got a 5950 and then admitting that you have a lower-clocked version (that would be a 5900 or 5900XT), it’s still a hardware-related problem like I suspected right from the start. I won’t even start a rant about people randomly changing BIOS settings without having a clue what they mean; all I have to say is:
sigh

I did not lie. Gainward had two 5950 Ultras. The expensive one was tested and shipped overclocked, while I got the cheaper one they did not overclock. From what I’ve read, the only difference between them is that Gainward takes some of the cards and, through their testing, adds a preset overclock that’s covered under warranty. Other than that they are built the same. My clocks are 475/950, it stays very cool, and it’s the only single-slot solution.

Also, I read about that setting before I enabled it a long time ago and just forgot. I am really busy with college right now and do not have time to dig all this stuff up, so I figured I would get help from some guys like you.

Thanks for your help. I just don’t care for people who jump to conclusions, so you can keep your sighing to yourself.

As long as you believe your own story, everything is cool dude! :rolleyes:

I like you. You remind me of me!