How to compute the transfer time over AGP 8X?

I have an AGP 8X graphics card. When I use glReadPixels to read back a 512×512 floating-point buffer, it takes 22.8 ms.
But AGP 8X has a peak transfer rate of 2.134 GB/s, so the read should not take more than about 2 ms: 512×512×16 bytes / 2,134,000,000 B/s ≈ 0.00197 s ≈ 1.97 ms.
Why does this happen?
Any reply will be appreciated.

Readback speed depends a lot on your graphics hardware.

The GeForce 6 can read back at about 1 GB/s; previous-generation GeForces manage about 200 MB/s.

The latest-generation ATI cards read back at about 200 MB/s; the previous generation was about 80 MB/s.

The optimal format for readback is BGRA (GL_BGRA with GL_UNSIGNED_BYTE).
Reading floating-point buffers may not be as well optimised in the driver.
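A minimal sketch of a BGRA readback, assuming a current GL context, a 512×512 framebuffer region, and GL 1.2+ for GL_BGRA (the sizes are illustrative):

```c
/* Read the framebuffer as 8-bit BGRA, which drivers commonly fast-path.
 * Assumes a current GL context; 512x512 matches the buffer in question. */
GLubyte *pixels = (GLubyte *)malloc(512 * 512 * 4);

glPixelStorei(GL_PACK_ALIGNMENT, 4);  /* rows packed to 4-byte boundaries */
glReadPixels(0, 0, 512, 512, GL_BGRA, GL_UNSIGNED_BYTE, pixels);
```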

You will get a little extra performance if you use the PBO (ARB_pixel_buffer_object) or PDR (NV_pixel_data_range) extensions.
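A sketch of readback through a pixel buffer object, assuming a current GL context and that the ARB_pixel_buffer_object entry points have been loaded (e.g. via an extension loader). With a pack PBO bound, glReadPixels returns without waiting for the copy, and the data is fetched later when the buffer is mapped:

```c
/* Asynchronous readback via ARB_pixel_buffer_object (sketch). */
GLuint pbo;
glGenBuffers(1, &pbo);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
glBufferData(GL_PIXEL_PACK_BUFFER, 512 * 512 * 4, NULL, GL_STREAM_READ);

/* With a pack PBO bound, the last argument is an offset into the PBO,
 * and the call can return before the transfer completes. */
glReadPixels(0, 0, 512, 512, GL_BGRA, GL_UNSIGNED_BYTE, (void *)0);

/* ...do other CPU work here, then map the buffer to access the data... */
void *data = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
/* use data */
glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
```

Overlapping the readback with other CPU work is where the extra performance comes from; a single synchronous map immediately after the read gains little.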

There are a lot of posts about readback performance in this forum, try using the search.

Readback on AGP is OFTEN very limited, because it OFTEN uses PCI cycles, not AGP cycles. If you want fast readback, you want a PCI-Express motherboard and PCI-Express graphics card.