I’m not sure if this is the right place to post…
I’ve come across some strange behaviour on ATi cards. If the resolution of a CLImage2D is exactly 960x540 and the channel data type is CL_UNSIGNED_INT8 or CL_UNORM_INT8, the texture reads return wrong color values. At other resolutions everything works as expected.
This behaviour was reproducible on a FirePro V7900 and an HD 6900.
On an NVIDIA GT 555M and a GT 630 the output is as expected. The operating system was Win 7 (64-bit) and the application was compiled in 32-bit mode. The newest ATi drivers and APP SDK were used.
Could this be a driver bug?
Please let me know if you need more information.
Thanks in advance
The example can be found here: OpenCL Texture Read Example - Pastebin.com
In this example the kernel only reads from the texture and writes the output to a float buffer. I hope I’m not doing something wrong here…
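For anyone who doesn’t want to open the pastebin: the setup is roughly along these lines (a hypothetical minimal sketch, not the exact pastebin code — names and the `width` parameter are assumptions):

```c
// Hypothetical minimal kernel: read one texel and copy it to a float buffer.
__kernel void read_texture(__read_only image2d_t img,
                           __global float4 *out,
                           int width)
{
    const sampler_t smp = CLK_NORMALIZED_COORDS_FALSE |
                          CLK_ADDRESS_CLAMP |
                          CLK_FILTER_NEAREST;
    int2 pos = (int2)(get_global_id(0), get_global_id(1));

    // CL_UNSIGNED_INT8 images must be read with read_imageui;
    // CL_UNORM_INT8 images would use read_imagef instead.
    uint4 px = read_imageui(img, smp, pos);

    out[pos.y * width + pos.x] = convert_float4(px);
}
```

With a work size of 960x540 (one work-item per texel) this is where I see the wrong values on the ATi cards.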