texture baking space

I’m trying to write a simple application for baking a texture from a paint buffer. Right now I have a mesh, a mesh texture, and a paint texture. When I render the mesh, the mesh shader looks up the mesh texture and then, based on the screen position of the fragment, looks up the paint texture value. I then composite the paint lookup over the mesh lookup.
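In case it helps to have something concrete inline, here’s roughly what the runtime fragment shader does (a simplified sketch, not my actual code; the uniform and varying names like meshTex, paintTex, and viewportSize are placeholders):

    // Simplified sketch of the runtime compositing pass (GLSL 120 style).
    uniform sampler2D meshTex;    // the mesh's own texture
    uniform sampler2D paintTex;   // screen-sized paint buffer
    uniform vec2 viewportSize;    // viewport width/height in pixels

    varying vec2 vUV;             // mesh UVs from the vertex shader

    void main()
    {
        vec4 meshColor  = texture2D(meshTex, vUV);
        // The paint buffer is indexed by where the fragment lands on screen.
        vec2 screenUV   = gl_FragCoord.xy / viewportSize;
        vec4 paintColor = texture2D(paintTex, screenUV);
        // Composite the paint over the mesh texture.
        gl_FragColor = mix(meshColor, paintColor, paintColor.a);
    }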

Here’s a screenshot with nothing in the paint buffer and just the mesh texture.

Here’s a screenshot with something in the paint buffer composited over the mesh texture.

So that all works great, but I’d like to bake the paint texture into my mesh texture. Right now I send the mesh’s UVs down as the position, with an ortho projection over (0,1)x(0,1), so I’m actually rasterizing everything in texture space. The mesh texture lookup is just that same position. The problem I’m having is computing the screen-space position of the fragment under the original camera projection, to figure out where to sample the paint texture. I’m passing the bake shader my original camera projection matrices and the object-space position, so the vertex shader can send the fragment shader the normalized device coordinates of the fragment (again under my original camera projection) to do the lookup, but it’s coming out wrong.
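To make that concrete, here’s roughly what the bake-pass vertex shader is doing (again a simplified sketch with placeholder names like cameraProjection, cameraView, and model; my real code is in the pastie link below):

    // Simplified sketch of the bake-pass vertex shader (GLSL 120 style).
    uniform mat4 cameraProjection; // original camera projection
    uniform mat4 cameraView;       // original camera view
    uniform mat4 model;            // object transform

    attribute vec3 position;       // original object-space position
    attribute vec2 uv;             // mesh UVs, used as the output position

    varying vec2 vUV;
    varying vec4 vScreenPos;       // clip-space position under the original camera

    void main()
    {
        vUV = uv;
        // Where this fragment would have landed on screen with the original camera.
        // The fragment shader divides by w and remaps from [-1,1] to [0,1]
        // before sampling the paint buffer.
        vScreenPos = cameraProjection * cameraView * model * vec4(position, 1.0);
        // Rasterize in texture space: equivalent to the ortho over (0,1)x(0,1).
        gl_Position = vec4(uv * 2.0 - 1.0, 0.0, 1.0);
    }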

Here’s what the bake is generating when I render half of the output using the paint texture and the screen position I’ve derived.

I would expect that dividing line to be right down the middle.

Am I calculating the screen position incorrectly in my vertex shader? Or am I going about this in a fundamentally wrong way?

edit: Wow, the forum really did an interesting job on my shader code, so here’s my code on pastie.org instead.

http://pastie.org/9292414