Comparing images

Hi everyone!

I have an original renderer written against the OpenGL ES API, and I have rewritten it in Vulkan. I ran the same model and data through both programs, but because the final images differ in size and the camera and other settings differ in the code, I found it hard to compare the two rendered images. Does anyone have ideas on how to compare them? I want to make sure my Vulkan program is doing the same thing as the OpenGL ES program.

Thank you for your help!

In general, the light reflected by a surface depends on the direction of the observer (relative to the surface), the incoming light, and the properties of the surface itself (its material, if you like). So if you have two images of a scene taken from different viewpoints, you can’t expect “corresponding pixels” to have the same color values. And that ignores the problem of even determining which pixel in one image corresponds to (one or more) pixel(s) in the other. If you know the camera position and orientation for both images, you can do some triangulation to find corresponding points on the image plane, but as mentioned, a pixel in one image can map to multiple pixels in the other, so that doesn’t fully solve the problem.

If you can arrange to take same size images from the same camera position and orientation in both applications, this becomes a much more tractable problem because you can subtract pixel values and see where (and by how much) your new implementation differs from your reference implementation.
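To illustrate the subtraction approach, here is a minimal sketch in Python with NumPy. It assumes both renders have been read back as same-size 8-bit RGB arrays (e.g. via Pillow’s `Image.open`); the function name, the `tolerance` parameter, and the synthetic test data are my own inventions for the example. A small per-channel tolerance is allowed because the two pipelines may legitimately round differently by an LSB or two:

```python
import numpy as np

def compare_renders(ref, test, tolerance=2):
    """Pixel-wise comparison of two same-size uint8 RGB renders.

    ref, test: arrays of shape (H, W, 3), e.g. from
    np.asarray(Image.open(path).convert("RGB")).
    tolerance: per-channel difference (in 8-bit steps) to ignore,
    to absorb rounding differences between the two pipelines.
    """
    # Widen to a signed type before subtracting so uint8 doesn't wrap around.
    diff = np.abs(ref.astype(np.int16) - test.astype(np.int16))
    # A pixel mismatches if any channel exceeds the tolerance.
    mismatched = (diff > tolerance).any(axis=-1)
    return {
        "max_diff": int(diff.max()),
        "mean_diff": float(diff.mean()),
        "mismatched_pixels": int(mismatched.sum()),
        "diff_map": mismatched,  # boolean mask; visualize it to localize errors
    }

# Synthetic example: identical 4x4 images except one pixel.
ref = np.zeros((4, 4, 3), dtype=np.uint8)
test = ref.copy()
test[1, 2] = (10, 0, 0)
report = compare_renders(ref, test)
# report["mismatched_pixels"] is 1 and report["max_diff"] is 10
```

Saving the boolean `diff_map` (or the raw `diff`) as an image is often the quickest way to see whether a discrepancy is global (e.g. a gamma/colorspace mismatch) or localized (e.g. one object shaded differently).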

You need to control the camera position. But once you sync that up, this approach works well…

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.