I’ve got a question about memory usage on video cards. I see all these reviews of the latest and greatest consumer video cards (usually GeForce2) comparing the 64MB versions to the 32MB versions.
Why do the 64MB versions score higher frame rates at higher resolutions? Since the cards score basically the same at low resolutions, it doesn’t seem like the 32MB version is running out of memory there. And at higher resolutions the textures aren’t taking up any more room (despite what so many reviews have incorrectly stated). The only thing I can think of is that the larger frame buffer requirements at high resolutions are pushing the 32MB card over its limit. But I just don’t think that nearly every game on the market is so close to the 32MB ceiling that the extra few megabytes of frame buffer would start slowing the card down at higher resolutions.
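To put some rough numbers on it, here’s the back-of-the-envelope math I’ve been doing, as a quick C sketch. I’m assuming double buffering plus a Z buffer at the same bit depth, and ignoring whatever else the driver might allocate, so treat these as ballpark figures:

#include <stdio.h>

int main(void)
{
    /* Resolutions the reviews typically benchmark at. */
    struct { int w, h; } res[] = {
        { 640, 480 }, { 800, 600 }, { 1024, 768 },
        { 1280, 1024 }, { 1600, 1200 }
    };
    /* Bits per pixel; assuming color and Z share the same depth. */
    int depths[] = { 16, 32 };
    int i, j;

    for (j = 0; j < 2; j++) {
        for (i = 0; i < 5; i++) {
            /* front buffer + back buffer + Z buffer = 3 surfaces */
            double mb = (double)res[i].w * res[i].h
                      * (depths[j] / 8) * 3
                      / (1024.0 * 1024.0);
            printf("%4dx%-4d @ %2d bpp: %5.1f MB\n",
                   res[i].w, res[i].h, depths[j], mb);
        }
    }
    return 0;
}

By that math, even 1600x1200 at 32 bpp only eats about 22MB of buffers, which should still leave room on a 32MB card, so I’m obviously missing something.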
I understand this probably isn’t the proper place to post this question, but I didn’t know where else to find a bunch of graphics guys who might know the answer. Plus I hope other people here will benefit from whatever answer turns up.
I just started thinking about this the other day and it’s driving me crazy trying to come up with an answer, so any ideas on the matter would be appreciated.