gpu.js (WebGL?) float32 issue


I'm probably missing something obvious, but I'm experimenting with gpu.js and getting some strange results. I just want to make sure I'm not doing something obviously stupid (which is likely).

I'm not sure whether this is an issue with what I'm doing, or with the way calculations are performed by gpu.js when using WebGL.

I create a new GPU and new kernel:

const gpu = new GPU();
const test = gpu.createKernel(function () { 
    return 255 + 
        (255 * 256) + 
        (255 * 256 * 256) + 
        (255 * 256 * 256 * 256); 
}).setOutput([1]);

const res = test();

This gives me a result of 4294967296 (returned in a Float32Array).

If I run the same calculation from the console, I get a result of 4294967295.
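
For reference, here is the same expression evaluated as plain JavaScript (where every number is a 64-bit double):

255 + (255 * 256) + (255 * 256 * 256) + (255 * 256 * 256 * 256)
// 4294967295, i.e. 0xFFFFFFFF, or 2^32 - 1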

BEST ANSWER

An IEEE 754 single-precision (32-bit) floating-point value has a 24-bit significand (23 stored bits plus an implicit leading 1) and 8 exponent bits.

4294967295 is 0xFFFFFFFF (as an integer), which cannot be stored in a 32-bit float at full accuracy, because the significand only has 24 bits.
4294967296 is 0x100000000 (as an integer), which can be stored exactly in a 32-bit float, because it is 2^32, a power of two; its encoding is 0x4F800000.
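
Both claims can be checked from any JavaScript console: Math.fround rounds a number to the nearest single-precision value, and a Uint32Array view over the same buffer exposes the raw float32 bits.

// Rounding the exact integer to the nearest float32 lands on 2^32,
// which is exactly what the kernel returns:
Math.fround(4294967295); // 4294967296

// The bit pattern of 4294967296 stored as a float32:
const f32 = new Float32Array([4294967296]);
new Uint32Array(f32.buffer)[0].toString(16); // "4f800000"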

By comparison, an IEEE 754 double-precision (64-bit) floating-point value has a 53-bit significand and 11 exponent bits.
A 64-bit floating-point value can therefore store 4294967295 exactly (encoded as 0x41EFFFFFFFE00000), which is why the console, where every JavaScript number is a double, gives the exact result.
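
The same kind of check shows the double-precision encoding, using a BigUint64Array view for the raw float64 bits:

const f64 = new Float64Array([4294967295]);
f64[0] === 4294967295; // true -- stored exactly
new BigUint64Array(f64.buffer)[0].toString(16); // "41efffffffe00000"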