This is my first post. I work as a composer, but I make my own software, and I'm now digging into OpenGL, with iOS as my primary concern.
I'm looking to create a simple shader effect, but I'm not sure whether the idea is technically feasible on iOS-class hardware.
My aim is to simulate color bleeding (radiosity) in configurations of cubes, using texture maps whose gradients are calculated from the color, distance, and angle of one cube relative to the next. The colors of the cubes are solid and known, so rays don't need to bounce off the cubes; rather, I am projecting the colors of the nearest cubes onto each cube, or calculating points relative to the surrounding cubes from which to render gradients into a texture map.
There would likely be at most 20 cubes contributing to each cube's texture map, and as many as 500 cubes rendered at a time.
Is this possible on iPhones and similar devices, or would generating that many maps in real time not work? I don't have a feel for what this kind of hardware can do.
Another approach that occurred to me, though I don't know whether OpenGL offers the tools: render the cubes from an orthographic camera facing them, blur the image heavily, and then either project the blurred data back onto the cubes or incorporate it as lighting and shadow by some other method.
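For the blur step, I picture something like a separable box blur run over the orthographic render (in practice this would be two fragment-shader passes over a render target; the sketch below is just the horizontal pass done on the CPU, on a single-channel image, with my own invented function name):

```c
/* One horizontal pass of a separable box blur over a single-channel
   image stored row-major in src (w * h floats). Edge pixels average
   only the in-bounds samples. A vertical pass with the same logic,
   swapping x and y, completes the blur. */
static void box_blur_h(const float *src, float *dst, int w, int h, int radius)
{
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            int count = 0;
            for (int k = -radius; k <= radius; ++k) {
                int xx = x + k;
                if (xx < 0 || xx >= w) continue; /* skip out-of-bounds taps */
                sum += src[y * w + xx];
                ++count;
            }
            dst[y * w + x] = sum / (float)count;
        }
    }
}
```

On the GPU side I assume this would mean rendering the cubes to a texture and ping-ponging between two render targets, one pass per blur direction, before sampling the result back onto the cube faces.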
Any thoughts on what functions I might look into for either of these solutions? Any better ideas? I am looking for any method of faking ambient occlusion and color bleed by taking advantage of the constraints of boxes and known solid colors.
Any advice would be greatly appreciated.