Issues with color space and sRGB encoding: user error or implementation problem?

I am adding an option to my app to create an HDR framebuffer. From the list of formats returned by vkGetPhysicalDeviceSurfaceFormatsKHR, I selected one with

format = VK_FORMAT_R16G16B16A16_SFLOAT

and used the colorSpace indicated by the matching entry in the returned formats list

colorSpace = VK_COLOR_SPACE_SRGB_NONLINEAR_KHR

The rendering is very dark. It is clear that the fragment shader output is being encoded to sRGB, as would be expected given the stated colorSpace, but is not being decoded for display. At one point I inadvertently selected format VK_FORMAT_B8G8R8A8_UNORM which has the same colorSpace. I got the same very dark result. For SDR work I normally select VK_FORMAT_B8G8R8A8_SRGB which has the same colorSpace and everything looks fine.

For HDR I then tried VK_FORMAT_R16G16B16A16_SFLOAT and VK_COLOR_SPACE_DISPLAY_P3_LINEAR_EXT. The rendering was as expected.

It is my understanding that the colorSpace field of the returned formats list describes how the WSI/System will interpret the surface contents. That is obviously not happening. Am I doing something stupid or misunderstanding the spec or is this an implementation bug? I’m using MoltenVK.

In my code I set

swapchainCI.imageFormat = colorFormat;
swapchainCI.imageColorSpace = colorSpace;

where swapchainCI is the VkSwapchainCreateInfoKHR struct and colorFormat and colorSpace are the values taken from the formats list.
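For reference, the selection step can be sketched as below. The helper name and preference list are my own; the struct and enum values mirror vulkan_core.h only so the snippet compiles standalone — real code should simply include <vulkan/vulkan.h> and pass the array returned by vkGetPhysicalDeviceSurfaceFormatsKHR.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Minimal mirrors of the Vulkan types/constants so this sketch stands alone;
// in real code, include <vulkan/vulkan.h> instead.
using VkFormat = uint32_t;
using VkColorSpaceKHR = uint32_t;
constexpr VkFormat VK_FORMAT_B8G8R8A8_SRGB = 50;        // value from vulkan_core.h
constexpr VkFormat VK_FORMAT_R16G16B16A16_SFLOAT = 97;  // value from vulkan_core.h
constexpr VkColorSpaceKHR VK_COLOR_SPACE_SRGB_NONLINEAR_KHR = 0;
constexpr VkColorSpaceKHR VK_COLOR_SPACE_DISPLAY_P3_LINEAR_EXT = 1000104003;

struct VkSurfaceFormatKHR { VkFormat format; VkColorSpaceKHR colorSpace; };

// Return the first advertised (format, colorSpace) pair that matches an
// ordered preference list, falling back to the first advertised entry.
VkSurfaceFormatKHR pick_surface_format(
    const std::vector<VkSurfaceFormatKHR>& available,
    const std::vector<VkSurfaceFormatKHR>& preferred) {
    for (const auto& want : preferred)
        for (const auto& have : available)
            if (have.format == want.format && have.colorSpace == want.colorSpace)
                return have;
    return available.front();
}
```

The key point is that only pairs actually returned by vkGetPhysicalDeviceSurfaceFormatsKHR are candidates; the preference list just orders them.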


That’s actually not how that works. The colorspace for the display is about how the data will be interpreted. But the format you use when rendering determines what the data is.

So your format says that it will just take the linear RGB values it is fed with no changes, but then the display colorspace says that it will interpret these values as sRGB.

If you want the display to use the sRGB colorspace, the format should do so as well. Also, if you want HDR output to the display, it’s best to use a linear colorspace for the display. sRGB just isn’t helping much there (especially since there is no floating-point format with sRGB conversion).
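Since no float format gets automatic sRGB conversion, the encode has to happen in the shader. For concreteness, here is the sRGB transfer function pair (per IEC 61966-2-1) as a CPU-side sketch; the function names are mine:

```cpp
#include <cassert>
#include <cmath>

// sRGB OETF ("linear to sRGB"): what a *_SRGB render target applies in
// hardware on store, and what a shader must apply manually for float targets.
float srgb_encode(float linear) {
    if (linear <= 0.0031308f)
        return 12.92f * linear;
    return 1.055f * std::pow(linear, 1.0f / 2.4f) - 0.055f;
}

// Inverse ("sRGB to linear"): what the presentation side effectively applies
// when it interprets the surface as SRGB_NONLINEAR. Feeding it values that
// are already linear is exactly what darkens the image.
float srgb_decode(float encoded) {
    if (encoded <= 0.04045f)
        return encoded / 12.92f;
    return std::pow((encoded + 0.055f) / 1.055f, 2.4f);
}
```

For example, srgb_decode(0.5) ≈ 0.214 — a linear mid-grey that gets re-decoded as if it were sRGB ends up at roughly a fifth of its intended brightness, which matches the "very dark" symptom.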


Alfonse is correct.

  • format controls whether the GPU performs sRGB conversion on writes; colorSpace only tells the presentation engine how to interpret the surface's pixels.

  • Verify the (format, colorSpace) pair actually in use; reading back the presented swapchain image will prove whether conversion happened.

  • Your options: pick an *_SRGB UNORM format for automatic OETF encoding, manually encode in the shader before present, or use a float format with a linear HDR colorspace plus VK_EXT_hdr_metadata for true HDR.


TL;DR: Mac caveats, not fun. Below I've copied some notes from our engine's render device for Mac. Hope it helps your investigation.

  • MoltenVK → Metal mapping: MoltenVK maps Vulkan formats to MTLPixelFormat. Metal only does hardware sRGB conversion for UNORM sRGB pixel formats (e.g., BGRA8Unorm_sRGB); float formats will not get automatic sRGB OETF.

  • Verify mapping: log the Vulkan format returned by vkGetPhysicalDeviceSurfaceFormatsKHR and check MoltenVK’s format-mapping (or its runtime logs) to confirm the actual MTLPixelFormat.

  • CAMetalLayer / window: ensure the layer’s colorspace and extended-dynamic-range flag are set (enable EDR on the layer/window) — wrong layer settings cause compositor re-interpretation or tone‑mapping.

  • System/display HDR: macOS display HDR must be enabled and the display profile/support present; otherwise the compositor may clamp or tone‑map your output.

  • HDR metadata: check for and set VK_EXT_hdr_metadata (and confirm MoltenVK exposes it) so the compositor gets peak luminance/primaries and avoids implicit tonemapping.

  • Test isolation: run a minimal native Metal test that writes known linear values to verify whether the issue is MoltenVK or the OS compositor.

  • Blending/encode order: do linear-space blending in-shader and only encode to sRGB at final store/blit; encoding earlier breaks blending correctness.

  • Runtime checks: enumerate device extensions and surface formats at runtime (don’t assume); if *_SRGB UNORM isn’t offered, plan for manual OETF or a final blit to a UNORM_sRGB target.
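The last bullet's fallback decision can be sketched like this (names and structure are my own; the format values mirror vulkan_core.h so the snippet compiles standalone — real code should include <vulkan/vulkan.h>):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Values from vulkan_core.h, mirrored here so the sketch stands alone.
using VkFormat = uint32_t;
constexpr VkFormat VK_FORMAT_R8G8B8A8_SRGB = 43;
constexpr VkFormat VK_FORMAT_B8G8R8A8_SRGB = 50;

enum class EncodeStrategy { HardwareSRGB, ManualOETF };

// If the surface advertises an sRGB UNORM format, let fixed-function
// encoding apply the OETF; otherwise fall back to encoding in the fragment
// shader (or a final blit to a UNORM_sRGB target).
EncodeStrategy choose_encode_strategy(const std::vector<VkFormat>& advertised) {
    for (VkFormat f : advertised)
        if (f == VK_FORMAT_R8G8B8A8_SRGB || f == VK_FORMAT_B8G8R8A8_SRGB)
            return EncodeStrategy::HardwareSRGB;
    return EncodeStrategy::ManualOETF;
}
```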

Ahh! I was mistaken about the reason for the dark images. They happen when the FS output is not encoded to sRGB but is interpreted as if it were, and so gets decoded back to "linear". Thanks for correcting my momentary lapse of reason - as Pink Floyd sang.

Thank you @p3nGu1nZz for this incredible information dump especially the tips about setting EDR and system/display HDR.

I have two further questions:

  1. Is there any guarantee that the implementation will convert from the selected colorSpace to that of the display being used? For example, I selected a surface with colorSpace VK_COLOR_SPACE_DISPLAY_P3_LINEAR_EXT. If the user's display is, say, AdobeRGB, will the implementation do the necessary color transformation?
  2. If the answer to no. 1 is no, is there a way to query the display's color space so an appropriate surface can be selected?

Having a single colorSpace enum that encompasses both the primaries and the transfer function is, I feel, unfortunate. It makes it harder to find one with a LINEAR transform. I’ll have to manually create a list of those with a LINEAR transform then check each found colorSpace against that list.
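That manually maintained list might look like the following sketch. The function is my own; the enum values are mirrored from VK_EXT_swapchain_colorspace in vulkan_core.h so the snippet compiles standalone (real code should include <vulkan/vulkan.h> and keep the list in sync with any new colorspace extensions):

```cpp
#include <cassert>
#include <cstdint>

using VkColorSpaceKHR = uint32_t;
// Values from VK_EXT_swapchain_colorspace in vulkan_core.h.
constexpr VkColorSpaceKHR VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT = 1000104002;
constexpr VkColorSpaceKHR VK_COLOR_SPACE_DISPLAY_P3_LINEAR_EXT    = 1000104003;
constexpr VkColorSpaceKHR VK_COLOR_SPACE_BT709_LINEAR_EXT         = 1000104005;
constexpr VkColorSpaceKHR VK_COLOR_SPACE_BT2020_LINEAR_EXT        = 1000104007;
constexpr VkColorSpaceKHR VK_COLOR_SPACE_ADOBERGB_LINEAR_EXT      = 1000104011;

// True when the colorSpace uses a linear transfer function. Note the
// primaries still differ between these entries; only the transfer
// function is linear.
bool has_linear_transfer(VkColorSpaceKHR cs) {
    switch (cs) {
        case VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT:
        case VK_COLOR_SPACE_DISPLAY_P3_LINEAR_EXT:
        case VK_COLOR_SPACE_BT709_LINEAR_EXT:
        case VK_COLOR_SPACE_BT2020_LINEAR_EXT:
        case VK_COLOR_SPACE_ADOBERGB_LINEAR_EXT:
            return true;
        default:
            return false;
    }
}
```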

System/display HDR: macOS display HDR must be enabled and the display profile/support present; otherwise the compositor may clamp or tone‑map your output.

How do you enable this or check it is enabled? All I see in System Settings is either Preset or Color profile. My built in display preset is “Apple XDR Display” so that looks fine.

something like this

std::vector<VkExtensionProperties> GPUCapabilityDetector::get_supported_extensions(
    VkPhysicalDevice physical_device) {

    const auto& vk_functions = get_vulkan_global_functions();
    std::vector<VkExtensionProperties> extensions;
    if (!physical_device || !vk_functions.loaded) {
        return extensions;
    }

    // First call: query only the extension count.
    uint32_t extension_count = 0;
    if (vk_functions.vkEnumerateDeviceExtensionProperties(physical_device, nullptr, &extension_count, nullptr) != VK_SUCCESS ||
        extension_count == 0) {
        return extensions;
    }

    // Second call: fill the vector now that the count is known.
    extensions.resize(extension_count);
    if (vk_functions.vkEnumerateDeviceExtensionProperties(physical_device, nullptr, &extension_count, extensions.data()) != VK_SUCCESS) {
        return std::vector<VkExtensionProperties>{};
    }

    return extensions;
}

Thanks, but I was asking about enabling HDR on the macOS display, nothing to do with Vulkan as far as I can see. I am familiar with how to obtain the list of supported Vulkan extensions.

I am still looking for answers to the “two further questions” I asked.