Our project uses images with bit depths higher than 8 bits, typically 10-bit. These are stored as 16-bit PNGs in the Display P3 color space (so 1024 levels per channel).

I am trying to show these images in a browser using WebGL2, so far with no luck. I know Chrome can do it, as I have some test images that reveal an extended colour range on my MacBook's Retina screen (but not on an 8-bit external monitor).

Here's the test image: https://webkit.org/blog-files/color-gamut/Webkit-logo-P3.png (Source: https://webkit.org/blog/6682/improving-color-on-the-web/)

If you're using an 8-bit screen and hardware, the test image will look entirely red. If you have a high-bit-depth monitor, you'll see a faint WebKit logo. Despite my high-bit-depth monitor showing the logo detail in Chrome, a WebGL quad with this texture applied looks flat red.
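As an aside, whether the display itself claims a wide gamut can be probed independently of WebGL with the standard CSS `color-gamut` media query via `window.matchMedia`. A small sketch (the function name is mine), bearing in mind this reports gamut, not bit depth:

```javascript
// Report the widest color gamut the current display claims to support,
// using the CSS `color-gamut` media query.
function displayGamut() {
  if (window.matchMedia('(color-gamut: rec2020)').matches) return 'rec2020';
  if (window.matchMedia('(color-gamut: p3)').matches) return 'p3';
  if (window.matchMedia('(color-gamut: srgb)').matches) return 'srgb';
  return 'unknown';
}
```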

My research has shown that WebGL/OpenGL does offer support for floating-point textures and high bit depths, at least when drawing to a render target.

What I want to achieve is simple: use a high-bit-depth texture in WebGL, applied to an on-screen quad. Here's how I am loading the texture:

var texture = gl.createTexture();
gl.activeTexture(gl.TEXTURE0 + 0);
gl.bindTexture(gl.TEXTURE_2D, texture);

// Store as a 16-bit float
var texInternalFormat = gl.RGBA16F;
var texFormat = gl.RGBA16F;
var texType = gl.FLOAT;

var image = new Image();
image.src = "10bitTest.png";
image.addEventListener('load', function() {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, texInternalFormat,
                  texFormat, texType, image);
});

This fails with

 WebGL: INVALID_ENUM: texImage2D: invalid format

If I change texFormat to gl.RGBA, it renders the quad, but plain red, without the extended colours.

I'm wondering if it's possible at all, although Chrome can do it, so I am still holding out hope.

  • Unless RENDERBUFFER_INTERNAL_FORMAT has more than 8 bits per color (see this and this), there is no chance of getting this to work.
    – lvella
    Jun 9, 2020 at 15:28
  • RGBA16F is not a valid format. The correct format for an internal format of RGBA16F is RGBA (as you already noticed). The format only specifies the incoming data (image), but has nothing to do with how the texture is stored or rendered.
    – BDL
    Jun 9, 2020 at 15:29
  • @lvella isn't RENDERBUFFER_INTERNAL_FORMAT for rendering to textures? I am just trying to render to the screen.
    – sipickles
    Jun 9, 2020 at 16:39
  • Note: "extended colours" (as you write) have nothing to do with higher bit depths. Higher bit depths tend to come with extended colours, but for many years we used DCI-P3 or AdobeRGB screens fed 8-bit data (hardware calibrated, so the panels could do much more, just as modern good 10-bit screens have 14- or 16-bit capabilities); the two are orthogonal. So I'm not sure you are looking at the right part of the problem.
    Jun 10, 2020 at 7:08
  • @sipi did you find a workaround for this? By the way, here's the official list of possible combinations of texture internal format, format and type: khronos.org/registry/webgl/specs/latest/2.0/…
    – Fappaz
    Sep 4, 2020 at 3:36
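To illustrate BDL's point: the WebGL2 spec pairs the internal format RGBA16F with format RGBA and type HALF_FLOAT or FLOAT. A minimal sketch of an upload that avoids the INVALID_ENUM (the function name is mine; this does not by itself solve the 8-bit canvas problem discussed below):

```javascript
// Upload an image into an RGBA16F texture using a format/type pair
// the WebGL2 spec actually allows for that internal format.
function createHalfFloatTexture(gl, image) {
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0,
                gl.RGBA16F,     // internal (storage) format
                gl.RGBA,        // format of the incoming data
                gl.HALF_FLOAT,  // type of the incoming data
                image);
  // No mipmaps generated here, so use a non-mipmapped filter.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  return tex;
}
```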

2 Answers



  1. You cannot (as of June 2020) create a canvas that is more than 8 bits per channel in any browser. There are proposals, but none have shipped.

  2. You cannot load > 8-bit-per-channel images into WebGL via img or ImageBitmap. There are no tests that data > 8 bits makes it into the textures.

You can load a > 8-bit-per-channel image into a texture if you parse and load the image yourself in JavaScript, but then you fall back to problem #1: you cannot display it except by drawing it into an 8-bit-per-channel canvas. You could pull the data back out into JavaScript, generate a 16-bit image blob, get a URL for the blob, add an img tag using that URL, and pray the browser supports drawing it with > 8 bits per channel.
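A sketch of the parse-it-yourself route described above, assuming you have already decoded the PNG's 16-bit RGBA samples into a Uint16Array with a JavaScript PNG decoder (the decoding itself is not shown, and the helper names are mine): normalize each sample, repack it as half-float bits, and upload with the RGBA/HALF_FLOAT combination.

```javascript
// Convert a float in roughly [0, 1] to its IEEE 754 half-float bit
// pattern (denormals flushed to zero, overflow clamped to infinity).
function floatToHalf(val) {
  var f32 = new Float32Array(1);
  var u32 = new Uint32Array(f32.buffer);
  f32[0] = val;
  var x = u32[0];
  var sign = (x >>> 16) & 0x8000;
  var exp = ((x >>> 23) & 0xff) - 127 + 15;
  if (exp <= 0) return sign;            // too small: signed zero
  if (exp >= 31) return sign | 0x7c00;  // too large: infinity
  return sign | (exp << 10) | ((x & 0x7fffff) >>> 13);
}

// `samples` is RGBA data with 16-bit channels (0..65535), already
// decoded from the PNG by your own JavaScript decoder.
function uploadSixteenBitSamples(gl, samples, width, height) {
  var half = new Uint16Array(samples.length);
  for (var i = 0; i < samples.length; i++) {
    half[i] = floatToHalf(samples[i] / 65535);
  }
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA16F, width, height, 0,
                gl.RGBA, gl.HALF_FLOAT, half);
  return tex;
}
```

As the answer says, this only gets the data into the texture; the canvas you draw it into is still 8 bits per channel.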

  • You said "You can load a > 8bit per channel image into a texture if you parse and load the image yourself in JavaScript"... do you know how this could be done to a frame of a video stream?
    – Fappaz
    Sep 4, 2020 at 3:39
  • Also, in regards to loading high-depth images into WebGL, isn't this what this Chromium issue (fixed in 2019) is supposed to support? bugs.chromium.org/p/chromium/issues/detail?id=624436#c76
    – Fappaz
    Sep 4, 2020 at 3:41
  • They hacked that in, AFAIK; it's not part of the spec. There are conformance tests, and AFAIK they don't test 16-bit images. What formats a browser supports is up to the browser (for example, Safari doesn't support webm or webp as of 9/2020). And if there isn't a conformance test, you can count on it not working. The conformance tests check, for example, that an 8-bit PNG's binary data makes it into a texture perfectly, even though by default the browser might color-correct; other formats are not tested for binary equality, AFAIK.
    – gman
    Sep 4, 2020 at 8:42
  • So, while you can upload any image and ask for FLOAT data, you can't be sure the data wasn't upscaled from 8 bits and lossy; that is, unless they add a conformance test that requires it. The best you can do is test: make a 1-to-4-pixel float image, upload it as a FLOAT texture, then render and read the results in such a way that you can tell if data was lost. Same with 16-bit images.
    – gman
    Sep 4, 2020 at 8:42
  • Thanks for clarifying it. CG is not really my field of expertise, so I'm struggling to read temperatures from a thermal camera, which streams frames as 16-bit images. With the popularity of high-bit-depth devices nowadays, it was quite a surprise to find out that browsers are still stuck with 8 bits per channel.
    – Fappaz
    Sep 5, 2020 at 22:47
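gman's "the best you could do is test" suggestion can be sketched for the float-upload case roughly as follows. This assumes a WebGL2 context and the EXT_color_buffer_float extension (needed to make a float texture renderable and readable); testing whether a browser's 16-bit image decode survives would be a separate, analogous test.

```javascript
// Upload a value that cannot survive 8-bit quantization (1/1024),
// then read it back through a framebuffer to see if precision held.
function floatTexelSurvives(gl) {
  if (!gl.getExtension('EXT_color_buffer_float')) return false;

  var src = new Float32Array([1 / 1024, 0, 0, 1]);
  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA16F, 1, 1, 0,
                gl.RGBA, gl.FLOAT, src);

  var fb = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);

  var out = new Float32Array(4);
  gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.FLOAT, out);

  // 1/1024 is exactly representable as a half float; if it comes back
  // intact, more than 8 bits of precision made it through.
  return Math.abs(out[0] - 1 / 1024) < 1e-6;
}
```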

For the record, this is not currently possible. Recent activity looks promising.

