Reading a technical reference manual for a GPU and I’ve come across the terms “pitch” and “stride”. I’m not familiar with these terms, and the definitions I find online seem to differ.
For example, the manual I’m reading says “each mipmap level should have a pitch of at least 32 bytes”.
Can anyone clarify the meaning of these terms?
Stride is the clearer and more commonly used term. It means the number of bytes in a row of pixels, including any padding. For example, an image 510 pixels wide might be padded (rounded up) to 512 pixels; the stride would then be 512 × the bytes per pixel of the image’s format. Stride can also be expressed in pixels, in which case it would simply be 512 pixels.
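As a rough sketch of that calculation (the 510 → 512 rounding and the 4-byte RGBA format are illustrative assumptions, not from the manual):

```python
def stride_bytes(width_px: int, bytes_per_pixel: int, align_px: int) -> int:
    """Stride in bytes: width rounded up to a pixel alignment, times bytes per pixel."""
    padded_width_px = (width_px + align_px - 1) // align_px * align_px
    return padded_width_px * bytes_per_pixel

# The example from above: a 510-pixel-wide image padded to 512 pixels,
# assuming a hypothetical 4-byte RGBA format:
print(stride_bytes(510, 4, align_px=512))  # 512 px * 4 B/px = 2048 bytes
```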
Pitch is a little more vague, but it is typically interchangeable with stride. Since your reference specifies it in bytes, it’s safe to assume it isn’t being measured in pixels. So the reference likely means the number of bytes in a row plus padding; the only alternative would be the number of unpadded bytes, i.e. image width × bytes per pixel.
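Reading the quoted requirement (“a pitch of at least 32 bytes”) as a byte alignment on each row, a minimal sketch might look like this (the function name and the 8-bit single-channel format below are my own illustrative assumptions):

```python
def pitch_bytes(width_px: int, bytes_per_pixel: int, pitch_align: int = 32) -> int:
    """Row pitch: unpadded row bytes rounded up to the required byte alignment."""
    row_bytes = width_px * bytes_per_pixel
    return (row_bytes + pitch_align - 1) // pitch_align * pitch_align

# A 1x1 mipmap level of an 8-bit single-channel format needs only 1 byte per row,
# but under the quoted requirement its pitch would still be 32 bytes:
print(pitch_bytes(1, 1))  # -> 32
```

This is also why small mipmap levels can waste space: every row is rounded up to the alignment boundary regardless of how few pixels it holds.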