Is there a way to pass 24-bit Bayer tiles to the GPU using OpenGL?


I'm developing a camera interface in C#.

I have an HDR camera that generates 24-bit HDR raw images.
The raw image buffer is a byte array (byte[]) with 3 bytes per Bayer tile.

I'm looking for a way to pass this buffer to the GPU as a texture using OpenTK (a C# wrapper for OpenGL), then demosaic the raw pixels into 24-bit RGB and finally tonemap the result down to 8-bit RGB.
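One idea I have sketched out (untested, and I may well have the wrong format enums) is to upload the buffer as an unnormalized integer texture, so that each 3-byte Bayer tile lands in one RGB8UI texel. Something like this, assuming OpenGL 3.0+ and a tightly packed buffer:

GL.BindTexture(TextureTarget.Texture2D, this.handle);
// Rows are width * 3 bytes, which may not be 4-byte aligned
GL.PixelStore(PixelStoreParameter.UnpackAlignment, 1);
GL.TexImage2D(TextureTarget.Texture2D, 0,
    PixelInternalFormat.Rgb8ui,                     // unnormalized integer texture, 3 bytes per texel
    width, height, 0,
    PixelFormat.RgbInteger, PixelType.UnsignedByte, // one Bayer tile -> one uvec3 texel
    imageBuffer);
// Integer textures must use nearest filtering
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Nearest);
GL.BindTexture(TextureTarget.Texture2D, 0);

I don't know whether this is the intended way to handle 24-bit samples, which is the main thing I'm asking about.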

I have also seen some example code that uses the Luminance pixel format:

GL.BindTexture(TextureTarget.Texture2D, this.handle);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Luminance, width, height, 0, PixelFormat.Luminance, PixelType.UnsignedByte, imageBuffer);
GL.BindTexture(TextureTarget.Texture2D, 0);

But I'm not sure how a fragment shader would sample a luminance texture and turn it into a color pixel, or whether this is even the right approach for 24-bit samples.
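If the integer-texture upload sketched above is valid, my guess is that the fragment shader would sample it through a usampler2D and reassemble the 24-bit value from the three bytes of each texel, roughly like this (the byte order is a guess on my part, and the shader source here is just a C# string for OpenTK):

string fragmentSrc = @"
    #version 330 core
    uniform usampler2D rawTex;
    out vec4 fragColor;

    void main()
    {
        // Fetch one Bayer tile (three raw bytes) without any filtering
        uvec3 b = texelFetch(rawTex, ivec2(gl_FragCoord.xy), 0).rgb;
        // Reassemble the 24-bit sample; assumes least significant byte first
        uint raw24 = (b.b << 16u) | (b.g << 8u) | b.r;
        // Normalize to [0, 1]; demosaicing and tonemapping would go here
        float intensity = float(raw24) / 16777215.0;
        fragColor = vec4(vec3(intensity), 1.0);
    }";

I don't know if this is how it is normally done, or whether the Luminance route above can achieve the same thing.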

I also tried building an integer array (int[]) from the byte buffer by combining every 3 bytes into one int on the CPU, but then I run into the same problem when passing that array to the GPU as a texture: I don't know which pixel format applies here.
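What I had in mind for that route (again untested) was packing each tile into a uint and uploading it as a single-channel 32-bit integer texture, something like:

// Pack each 3-byte Bayer tile into one uint (assumes little-endian byte order in the buffer)
uint[] packed = new uint[width * height];
for (int i = 0; i < packed.Length; i++)
{
    int o = i * 3;
    packed[i] = (uint)(imageBuffer[o] | (imageBuffer[o + 1] << 8) | (imageBuffer[o + 2] << 16));
}

GL.BindTexture(TextureTarget.Texture2D, this.handle);
GL.TexImage2D(TextureTarget.Texture2D, 0,
    PixelInternalFormat.R32ui,                      // one 32-bit unsigned integer per texel
    width, height, 0,
    PixelFormat.RedInteger, PixelType.UnsignedInt,  // would be sampled as a usampler2D
    packed);
GL.BindTexture(TextureTarget.Texture2D, 0);

But I'm not sure whether doing the packing on the CPU is worth it compared to uploading the raw bytes directly.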

Would anyone be able to point me in the right direction?
