Encoding vertex position to texture

Hi,

I want to encode vertex positions into a texture (I am doing it in a shader), so I'll end up with a texture containing the positions of the vertices that are on the screen.

I know that when I encode a normal, I do it like this:
Color.xyz = (normal + float3(1.0,1.0,1.0)) / 2.0;

and it works fine… but what about the positions?
Is this sufficient:
oColor2 = position;

I think not, because the vertex position is not within [0,1], but I can't think of a way to do this.

I'd encode positions into an RGBA16F or RGBA32F texture format (floating-point formats allow values outside [0,1]).
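
As a rough illustration (not code from the original post; the struct and semantic names are placeholders), with a floating-point render target the position can be written out as-is in an MRT pixel shader:

struct PSOutput
{
    float4 color    : COLOR0;   // regular shaded color
    float4 normal   : COLOR1;   // encoded normal, as in the earlier snippet
    float4 position : COLOR2;   // world- or eye-space position, stored directly
};

PSOutput main(float3 worldPos : TEXCOORD0, float3 normal : TEXCOORD1)
{
    PSOutput o;
    o.color    = float4(1.0, 1.0, 1.0, 1.0);                          // placeholder shading
    o.normal   = float4((normal + float3(1.0, 1.0, 1.0)) / 2.0, 0.0); // [-1,1] -> [0,1]
    o.position = float4(worldPos, 1.0);                               // no remap needed with RGBA16F/32F
    return o;
}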

Alternatively, if you still want to use a fixed-point format, you can pass the mesh's bounding box into the shader and scale the vertex positions into [0,1] according to it.
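
A minimal sketch of that bounding-box approach (boxMin and boxExtent are uniforms you would have to supply yourself; they are not part of the original post):

// Encode: remap world-space position into [0,1] using the mesh bounding box
float3 encoded = (worldPos - boxMin) / boxExtent;   // boxExtent = boxMax - boxMin
oColor2 = float4(saturate(encoded), 1.0);

// Decode later (e.g. in the lighting pass):
float3 worldPos = storedValue.xyz * boxExtent + boxMin;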

For my small deferred renderer I use RGBA16F and that works just fine. I think encoding positions into fixed-point buffers would not be accurate enough and would be quite painful to handle.

DmitryM and mokafolio are correct. In case you want to make a deferred shading renderer (or something like this), consider not storing the vertex positions in a texture at all. In my project I reconstruct the eye-space position dynamically from the depth texture (and a few other things). I found that it's faster that way.
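
Not the exact code from that project, but the usual idea looks roughly like this in a full-screen lighting pass (depthTex, uv, tanHalfFovY and aspect are assumed inputs, and the depth is assumed to be linear eye-space depth; exact conventions depend on your engine):

// Reconstruct eye-space position from the depth texture
float depth   = tex2D(depthTex, uv).r;                  // linear eye-space depth for this pixel
float2 ndc    = uv * 2.0 - 1.0;                         // screen position in [-1,1]
float3 viewRay = float3(ndc.x * tanHalfFovY * aspect,   // ray through this pixel in eye space
                        ndc.y * tanHalfFovY,
                        -1.0);
float3 eyePos = viewRay * depth;                        // position = ray scaled by depth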
