AMD OpenGL and HDR display problem

Hi,
I have been using OpenGL to display HDR content, following the explanation from NVIDIA:

It works great, but only on NVIDIA GPUs.

Using the same method: specify WGL_PIXEL_TYPE_ARB = WGL_TYPE_RGBA_FLOAT_ARB with 16 bits per color channel (WGL_RED_BITS_ARB = 16, WGL_GREEN_BITS_ARB = 16, WGL_BLUE_BITS_ARB = 16), roughly as in the sketch below.
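A minimal sketch of the pixel format selection, assuming a dummy context was already created and the WGL_ARB_pixel_format / WGL_ARB_pixel_format_float entry points were loaded with wglGetProcAddress (the function name and parameter names here are just placeholders):

```cpp
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

int ChooseFP16PixelFormat(HDC hdc, PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_FLOAT_ARB, // float (scRGB) back buffer
        WGL_RED_BITS_ARB,       16,
        WGL_GREEN_BITS_ARB,     16,
        WGL_BLUE_BITS_ARB,      16,
        WGL_ALPHA_BITS_ARB,     16,
        0                                                // end of attribute list
    };

    int  pixelFormat = 0;
    UINT numFormats  = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &pixelFormat, &numFormats)
        || numFormats == 0)
    {
        return 0; // driver reported no matching FP16 format
    }
    return pixelFormat;
}
```

The returned format is then passed through DescribePixelFormat/SetPixelFormat and the real context is created as usual.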

On AMD GPUs this displays an SDR image.
That is to say, the fragment shader output is clamped to 1.0, while on NVIDIA GPUs values up to ~25.0 (or 10,000 nits, as I understand it) are allowed and displayed correctly.
This is on the same TV (LG B9) and the same OS (Windows 10).
Note that other apps, like Chrome, display HDR content correctly on AMD GPUs, and so do DirectX test apps.

I have tried a bunch of different AMD GPUs, driver settings, texture formats, pixel types, etc., with no luck.
I have also read through https://gpuopen.com/ for clues, with no luck.
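For reference, this is the kind of check that should show whether the AMD driver is silently handing back an 8-bit UNORM format instead of the requested FP16 one (again just a sketch, assuming the wglGetPixelFormatAttribivARB pointer is already loaded):

```cpp
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>
#include <cstdio>

void DumpPixelFormat(HDC hdc, int pixelFormat,
                     PFNWGLGETPIXELFORMATATTRIBIVARBPROC wglGetPixelFormatAttribivARB)
{
    // Ask the driver what the chosen format actually is, not what was requested.
    const int queries[] = {
        WGL_PIXEL_TYPE_ARB,
        WGL_RED_BITS_ARB, WGL_GREEN_BITS_ARB, WGL_BLUE_BITS_ARB, WGL_ALPHA_BITS_ARB
    };
    int values[5] = {};

    if (wglGetPixelFormatAttribivARB(hdc, pixelFormat, 0, 5, queries, values))
    {
        std::printf("pixel type = %s, bits = R%d G%d B%d A%d\n",
                    values[0] == WGL_TYPE_RGBA_FLOAT_ARB ? "RGBA_FLOAT" : "RGBA (unorm)",
                    values[1], values[2], values[3], values[4]);
    }
}
```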

Does anyone have an idea, or an example of how to create a proper OpenGL HDR context/configuration?