[OS X, CoreImage] How to bind a CIImage to an OpenCL image2d?


I guess a lot of you are coding on a Mac, so I’ve got a simple question for you.

I’ve got a CIImage (Core Image framework) which already lives in GPU memory. So far, I copy the CIImage into an NSBitmapImageRep in CPU memory and then upload it back to GPU memory as an OpenCL image2d.
Yes, it sounds silly, but I don’t know how to do it otherwise :slight_smile:
How can I create (or bind) an OpenCL image2d directly from a CIImage?

I know that a CIImage can be rendered into an OpenGL texture, an OpenGL texture can be backed by an IOSurface, and an IOSurface can be bound to an OpenCL image2d.
Has anyone already tried something like this, or does anyone have a pointer to some sample code?

Thanks for your help!


I would recommend the following:

  1. First of all, you need to create a CL context from a CGLShareGroup. You can get the CGLShareGroup by doing:

// Share group of the current GL context
CGLShareGroupObj sharegroup = CGLGetShareGroup((CGLContextObj)[[NSOpenGLContext currentContext] CGLContextObj]);

// Tell CL to share objects with that GL share group
cl_context_properties akProperties[] = { CL_CONTEXT_PROPERTY_USE_CGL_SHAREGROUP_APPLE, (cl_context_properties)sharegroup, 0 };

// With a share group property, Apple's CL can pick the devices itself,
// so no device list is passed here
context = clCreateContext(akProperties, 0, NULL, clLogMessagesToStdoutAPPLE, NULL, &err);

  2. Then render the CIImage into a GL texture.

  3. Then create the CL image from the GL texture using the CL API call clCreateFromGLTexture2D (assuming it is a 2D texture).

If you want to use IOSurfaces, then you can use the clCreateImageFromIOSurface2DAPPLE API in Lion, which creates a CL image directly from an IOSurface.

Thanks, Affie,

What you propose works well :slight_smile:
Since I wanted to process the CIImage in OpenCL, I first rendered the CIImage into an OpenGL texture (using an FBO) and then created my CL image from that OpenGL texture.
The whole process seems to take about 2 ms, which is very good.

I could not solve this without your help.
Thank you very much.