I’m attempting to implement a water ripple effect on the iPhone with the following algorithm:
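The algorithm itself isn't reproduced here, but the usual choice for this effect is the classic two-buffer height-field simulation, so here is a sketch under that assumption (buffer size, damping factor, and names are illustrative, not from the original post):

```c
#define W 32            /* illustrative buffer size; real code would match the grid */
#define H 32

/* Two height buffers: src holds the current frame's heights, dst holds the
 * previous frame's. Each step overwrites dst with the next frame, then the
 * caller swaps the roles of the two buffers. */
static float buf1[H][W], buf2[H][W];

static void ripple_step(float (*src)[W], float (*dst)[W])
{
    for (int y = 1; y < H - 1; ++y)
        for (int x = 1; x < W - 1; ++x) {
            /* New height = (sum of the four neighbours) / 2 minus the height
             * from two frames ago; this makes disturbances propagate outward
             * as expanding rings. */
            float h = (src[y][x - 1] + src[y][x + 1] +
                       src[y - 1][x] + src[y + 1][x]) / 2.0f
                      - dst[y][x];
            dst[y][x] = h * 0.98f;  /* damping so the ripples die out */
        }
}
```

Each frame you'd call `ripple_step(buf2, buf1)`, use the freshly written buffer for rendering, then swap which buffer is `src` and which is `dst`.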
I’ve implemented this effect by mapping a background texture onto a grid of vertices, each offset by the ‘height’ value generated by the above algorithm. Unfortunately, for the effect to look good the grid must match the background texture’s dimensions, and that just isn’t feasible performance-wise: a full-screen iPhone app would need a 480x320 grid of vertices, which creates a massive number of polys.
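The vertex-displacement version described above can be sketched like this; the coarse 33x25 grid and screen dimensions are my own illustrative choices, not from the original post:

```c
#define VW 33   /* hypothetical coarse grid: 33x25 vertices instead of 480x320 */
#define VH 25

/* Build an interleaved x,y,z array: x and y span the screen, z is the
 * simulated ripple height. glVertexPointer(3, GL_FLOAT, 0, verts) can then
 * draw the grid as triangle strips under OpenGL ES 1.1. */
void build_vertices(const float *height /* VH*VW */, float *verts /* VH*VW*3 */)
{
    for (int y = 0; y < VH; ++y)
        for (int x = 0; x < VW; ++x) {
            float *v = &verts[(y * VW + x) * 3];
            v[0] = 480.0f * x / (VW - 1);   /* screen-space x */
            v[1] = 320.0f * y / (VH - 1);   /* screen-space y */
            v[2] = height[y * VW + x];      /* displaced by the simulated height */
        }
}
```

A 33x25 grid is 1,536 triangles versus the roughly 300,000 a per-pixel 480x320 grid would need, which is why matching the texture resolution is impractical.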
An alternative solution is to use the calculated ‘height’ value as an XY pixel offset (as is done in the first example above). This is where I need help: how can I efficiently offset the background texture’s pixels by the calculated offsets? Is it possible to manipulate the framebuffer directly, assuming the background texture has been rendered into it? Alternatively, I could read back the texture’s pixels and rebuild a second texture from that data plus the pixel offsets, but I assume that would be extremely slow. Shaders came to mind, but those aren’t available in OpenGL ES 1.1, correct?
Is there a way to specify per-pixel UV offsets when rendering a texture in OpenGL ES 1.1? Is there an alternative way to achieve this effect?
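For what it's worth, ES 1.1 has no per-pixel programmability, but a common workaround is to keep the geometry flat and instead perturb each vertex's texture coordinates by the local height gradient on a modest grid; the offsets are interpolated across each triangle, which approximates per-pixel refraction. A sketch, where the grid size and `strength` factor are illustrative assumptions:

```c
#define GRID_W 33   /* hypothetical grid: 32x24 quads, 33x25 vertices */
#define GRID_H 25

/* Fill texcoords so each vertex samples the background texture at its grid
 * position plus an offset proportional to the local height gradient.
 * heights is GRID_H x GRID_W; texcoords receives 2 floats per vertex. */
void fill_texcoords(const float *heights, float *texcoords, float strength)
{
    for (int y = 0; y < GRID_H; ++y)
        for (int x = 0; x < GRID_W; ++x) {
            /* Central differences, clamped at the grid edges. */
            int xm = x > 0 ? x - 1 : x, xp = x < GRID_W - 1 ? x + 1 : x;
            int ym = y > 0 ? y - 1 : y, yp = y < GRID_H - 1 ? y + 1 : y;
            float dx = heights[y * GRID_W + xp] - heights[y * GRID_W + xm];
            float dy = heights[yp * GRID_W + x] - heights[ym * GRID_W + x];
            float *t = &texcoords[(y * GRID_W + x) * 2];
            t[0] = (float)x / (GRID_W - 1) + dx * strength;
            t[1] = (float)y / (GRID_H - 1) + dy * strength;
        }
}
```

Each frame you'd step the simulation, call `fill_texcoords`, then point `glTexCoordPointer(2, GL_FLOAT, 0, texcoords)` at the array and draw the grid with `glDrawElements` — all fixed-function ES 1.1 calls, no shaders required.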
Any help would be greatly appreciated!