deinterlacer in fragment shader?

hi,

this is my second post here… :stuck_out_tongue:

i have two textures being the even and odd fields of a video capture signal, and i want to deinterlace them using a fragment shader.

does anybody have any pointers on this. i suspect it’s been done before…

cheers

Depends on how you are rendering.

If you are rendering 1:1 (a fullscreen quad in a video-resolution-sized window), use gl_FragCoord.y as an indicator for odd or even lines, then blend the two textures accordingly.
Actually there is no need for a shader here: just blend the two textures, making sure there is the right offset in the texture coordinates.
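The gl_FragCoord.y parity idea can be sketched like this. This is only a sketch for the 1:1 case, and the uniform names (evenField, oddField) are made up for illustration:

```glsl
// Assumes a 1:1 fullscreen quad, so one fragment row maps to one video line.
// "evenField"/"oddField" are hypothetical names for the two field textures.
uniform sampler2D evenField; // lines 0, 2, 4, ...
uniform sampler2D oddField;  // lines 1, 3, 5, ...
varying vec2 texCoord;

void main()
{
    // Is this output row odd or even? mod() on the window-space row index.
    float parity = mod(floor(gl_FragCoord.y), 2.0);
    vec4 even = texture2D(evenField, texCoord);
    vec4 odd  = texture2D(oddField,  texCoord);
    // Weave: pick the sample from whichever field owns this line.
    gl_FragColor = mix(even, odd, parity);
}
```

Since parity is exactly 0.0 or 1.0, mix() acts as a selector here rather than a blend.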

But a lot of other procedures can be called "deinterlacing"…

hi, yeah, i’m rendering a texture in a quad of indeterminate size in a window of indeterminate size.

i was hoping to use weave and bob simultaneously, and blend the result based on a difference between the two fields. a simple motion compensation i guess.

i’d like to work with absolute pixel offsets into the two source textures (video fields) in the vertex shader, and generate coordinates that would deliver interpolated woven or bobbed texels in the fragment shader, which i would blend using some kind of motion-detection value calculated from the fields, maybe passed as a varying from the vertex shader.

sound feasible?
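The weave-plus-bob idea above can be sketched in a fragment shader roughly like this. This is a minimal illustration of the blending logic, not a tuned deinterlacer; the uniform names, the luminance-difference motion measure, and the smoothstep thresholds are all assumptions:

```glsl
// Sketch of a motion-adaptive weave/bob blend. Field texture names and
// the motion thresholds are placeholders, not from any particular source.
uniform sampler2D evenField;
uniform sampler2D oddField;
varying vec2 texCoord;

void main()
{
    vec4 even = texture2D(evenField, texCoord);
    vec4 odd  = texture2D(oddField,  texCoord);

    // Weave: take the line from whichever field owns this output row.
    float parity = mod(floor(gl_FragCoord.y), 2.0);
    vec4 weave = mix(even, odd, parity);

    // Bob (simplified): average the two fields, i.e. vertical interpolation.
    vec4 bob = 0.5 * (even + odd);

    // Crude motion measure: luminance difference between the fields.
    float diff = abs(dot(even.rgb - odd.rgb, vec3(0.299, 0.587, 0.114)));
    float motion = smoothstep(0.02, 0.1, diff); // thresholds are guesses

    // Static areas weave (full vertical detail), moving areas bob (no combing).
    gl_FragColor = mix(weave, bob, motion);
}
```

A per-pixel difference in the fragment shader is more selective than a single varying from the vertex shader, since motion is rarely uniform across a whole quad.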

I always work with separate fields, so I never had to do this myself. But it sounds like you have a plan…

Deinterlacing and “indeterminate” size does not make much sense to me, though.

Why not go for progressive with double the framerate? Nothing looks better.

Everything depends on the output. If you need deinterlaced output, then deinterlace the source and use it. If you have to deliver interlaced output, then use the stencil buffer and render everything twice, just changing the stencil operation between passes.
If your source and window sizes are different, then you have to deinterlace the frame in an offscreen buffer (FBO) and then use it as usual.

There are many ways to deinterlace a frame, but you have to choose a good one and implement it in a fragment shader.

One of the best deinterlacers is http://avisynth.org/mediawiki/TempGaussMC

def:
“indeterminate” size … yeah lol. the source texture is fixed size, but the rendered size is variable at run time, both in terms of the screen resolution and the quad size, so i want to deinterlace the source in the pipeline using absolute texel coordinates (GL_NEAREST?) and then let opengl scale it by interpolation. i’m not trying to deinterlace to actual screen pixels, just the source texture prior to scaling.

i can use field rate sometimes, but i’m rendering to a proprietary display device which can’t always manage 50fps. the render context itself is actually a pbo.


yooyo:

“If your source and window sizes are different, then you have to deinterlace the frame in an offscreen buffer (FBO) and then use it as usual.”

i suspect there’s no way around this, or at least it’s probably the easiest way… i haven’t used FBOs yet (i’m pretty new to opengl), but i think i’ll take that approach. i was hoping there might be a way to do it all in one shader, mapping two separate field textures in the vertex shader to one virtual texture in the fragment shader by fiddling with gl_MultiTexCoord0 etc., and maybe passing a field delta value (motion compensation) as a varying, so the fragment shader can select between weave and bob.

“There are many ways to deinterlace a frame, but you have to choose a good one and implement it in a fragment shader.”

yup. do you know of any published GLSL source for deinterlacing? i’m hoping to implement some convolution algorithms in shaders as well, so an algorithm like TempGaussMC might appeal.

cheers for all the ideas!