How to use bitwise operators in GLSL

So my issue is a bit weird, and I'm hoping I can get some help with it.

The game I'm porting uses a boolean array to determine whether a tile is illuminated.

In the original source code, the game would apply a dithered pattern over areas that aren't illuminated or are out of the player's view:

[Screenshot: old light system]

What I originally wanted to do was send the boolean array directly to the shader, but upon learning that wouldn't be easy to manage, I took another approach: converting the light values into a pair of integers via the following function:

public void rebuildLightmap() {
    // Gather the lit state of every tile in this chunk into a flat array.
    boolean[] storage = new boolean[64];
    for (int z = 0; z < SIZE; z++) {
        for (int x = 0; x < SIZE; x++) {
            storage[z * SIZE + x] = this.dungeonFloor.isLit(x + (xPos * SIZE), z + (zPos * SIZE));
        }
    }

    // Pack the 64 booleans into two 32-bit ints, MSB-first: tile 0 lands in
    // bit 31 of lightMap[0], tile 32 in bit 31 of lightMap[1], and so on.
    // The shift amount is derived from j so that the bit position advances on
    // every iteration, not only for lit tiles.
    for (int i = 0; i < 2; i++) {
        lightMap[i] = 0;
        for (int j = 0; j < 32; j++) {
            if (storage[i * 32 + j]) {
                lightMap[i] |= 1 << (31 - j);
            }
        }
    }
}

From there, I was planning on sending the lightmap integers to the shader as a pair of ints, then using bitwise operations to check the relevant bits and shade each tile accordingly.
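
A minimal sketch of that upload with LWJGL, assuming a linked program handle programId and a shader uniform declared as uniform ivec2 lightMap; (both names are placeholders, not from the original code):

// Hypothetical upload step (org.lwjgl.opengl.GL20): with the shader program
// bound, send the two packed ints as a single ivec2 uniform.
int loc = GL20.glGetUniformLocation(programId, "lightMap");
GL20.glUniform2i(loc, lightMap[0], lightMap[1]);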

However, when I tried compiling the shader, it told me that the '&' operator was undefined for integers in GLSL, which goes against examples I've seen on Shadertoy and in discussions in the OpenGL community.

My question is: how do I perform bitwise AND operations in GLSL shaders? I'm using #version 400 core, if it helps.

As some additional info, I'm using a chunk-based system for rendering the tiles, where the map is broken up into 8x8 tile sections and loaded into a frame buffer, like so:

[Screenshot: chunk]

My main attempt was to build a lightmap out of a pair of integers, pass the integers to GLSL, and then do the following:

  1. Figure out the "index" of the tile in the lightmaps through the equation (z*4+x).
  2. Bit-shift the relevant lightmap by the index.
  3. Apply the bitwise AND operation to this new value (see the GLSL sketch after this list). This is where the issue arises, because GLSL throws an error on my usage of '&', claiming the operator is undefined.
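
For reference, a minimal GLSL sketch of those steps, assuming the packed ints arrive as an ivec2 uniform and the tile coordinate within the 8x8 chunk is available (both names are placeholders); the bit order matches the MSB-first packing in the Java code above:

#version 400 core

uniform ivec2 lightMap;   // the two packed 32-bit halves (placeholder name)
flat in ivec2 tileCoord;  // tile (x, z) within the 8x8 chunk (placeholder)

bool isTileLit(ivec2 t) {
    int index = t.y * 8 + t.x;      // 0..63, matching z * SIZE + x in the Java code
    int word  = index >> 5;         // which of the two packed ints
    int bit   = 31 - (index & 31);  // MSB-first, matching the packing loop
    return ((lightMap[word] >> bit) & 1) == 1;  // '&' on ints is valid in #version 400 core
}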

1 Answer

Answer by goober:

I'm unfamiliar with the intricacies of LWJGL (or even your program), but I think the crux of the algorithm is something like this:

On the GLSL side, integer bitwise operators are core from #version 130 onward, so they should already be available under #version 400 core; on older versions you would enable them with:

#extension GL_EXT_gpu_shader4 : enable

On the Java side, the packing can be done in a single pass:

public void rebuildLightmap() {
    int[] storage = new int[2];
    for (int z = 0; z < SIZE; z++) {
        for (int x = 0; x < SIZE; x++) {
            int index = z * SIZE + x;
            int result = this.dungeonFloor.isLit(x + (xPos * SIZE), z + (zPos * SIZE));
            // index >> 5 picks which 32-bit int, index & 31 picks the bit within it
            storage[index >> 5] |= result << (index & 31);
        }
    }

    // The packed ints are the lightmap itself; no second copy loop is needed.
    lightMap[0] = storage[0];
    lightMap[1] = storage[1];
}

You just need to ensure dungeonFloor.isLit returns either 0 or 1.

The trick is that instead of sending a full 64-bit integer, you send two 32-bit integers: the upper bits of the tile index (index >> 5, i.e. index / 32) select the lower or upper half, and the lower bits (index & 31, i.e. index % 32) select the bit within that half.
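
On the GLSL side, the matching lookup might look like this (a sketch reusing the placeholder ivec2 uniform from the question; note the bit order here is LSB-first to match the |= packing above, which is the reverse of the question's MSB-first loop):

#version 400 core

uniform ivec2 lightMap;  // the two packed halves (placeholder name)

bool isTileLit(int index) {   // index = z * 8 + x, in 0..63
    int word = index >> 5;    // index / 32: lower or upper half
    int bit  = index & 31;    // index % 32: bit within that half
    return ((lightMap[word] >> bit) & 1) != 0;
}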