Using the bevy engine I have made an octree that I am serialising and writing to a buffer:
#[derive(Serialize)]
pub struct Octree {
    pub root: [u32; 3],
    pub width: u32,
    pub leaves: Vec<Leaf>,
}

#[derive(Serialize)]
pub struct Leaf {
    voxel: OctreeVoxel,
    children: [u32; 8],
}

#[derive(Serialize)]
pub struct OctreeVoxel {
    pub id: u32,
    pub color: [u32; 3],
}
I am creating a buffer like this:
render_device.create_buffer(&wgpu::BufferDescriptor {
    label: None,
    size: 9_000_000,
    usage: wgpu::BufferUsages::STORAGE
        | wgpu::BufferUsages::COPY_SRC
        | wgpu::BufferUsages::COPY_DST,
    mapped_at_creation: false,
})
I am serialising the data with bincode: bincode::serialize(&octree).unwrap()
and writing it to the buffer like this: render_queue.write_buffer(&buffer, 0, data);
Here is how I read the data in GPU land (WGSL):
let vox_photon = vec3<u32>(128u + 5u, 128u + 5u, 128u + 5u);
var root = octree.root;
var width = octree.width;
var node = octree.leaves[0];
var next_index = 0u;
var exit = 0u;
while next_index != U32MAX && exit < 100u {
    let i = get_leaf(root, vox_photon);
    node = octree.leaves[next_index];
    next_index = node.children[i];
    root = get_new_root(i, root, width);
    width = width / 2u;
    exit += 1u;
}
if node.voxel.id != 0u {
    let color = node.voxel.color;
    return vec4<f32>(f32(color[0]), f32(color[1]), f32(color[2]), 255.0);
}
The unexpected behaviour is that the voxel id of the node the traversal ends up with is u32::MAX when it should be 3, and the colour should be (0, 255, 0) but comes out as (255, 0, 0).
So it seems to me like there is some shift in the data. I am quite clueless as to what could cause that or how to fix it.
The problem was that bincode was not serialising the data in a layout the GPU can understand: its encoding (for example, the length prefix it writes before a Vec) does not match the byte layout the WGSL structs expect, so every field the shader reads is shifted.
The way I'm doing things now, which works, is:
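One approach that produces a layout the shader can read directly is to hand-pack every field into little-endian u32 words, with no length prefix, so the bytes line up word-for-word with the WGSL structs. The sketch below is an illustration of that idea, not necessarily the exact fix used here; it assumes the WGSL structs declare their three-component fields as array<u32, 3> (or three scalar u32 fields), since vec3<u32> has 16-byte alignment in WGSL and would require explicit padding.

```rust
// Sketch: hand-pack the octree into raw little-endian u32 words whose
// layout mirrors the WGSL structs (assumed: array<u32, 3> fields, no
// vec3<u32>, so every field has 4-byte alignment and there is no padding).

pub struct OctreeVoxel {
    pub id: u32,
    pub color: [u32; 3],
}

pub struct Leaf {
    pub voxel: OctreeVoxel,
    pub children: [u32; 8],
}

pub struct Octree {
    pub root: [u32; 3],
    pub width: u32,
    pub leaves: Vec<Leaf>,
}

impl Octree {
    /// Flatten the tree into bytes: root (3 words), width (1 word),
    /// then 12 words per leaf (id, color[3], children[8]).
    /// Crucially, there is no Vec length prefix in the output.
    pub fn to_gpu_bytes(&self) -> Vec<u8> {
        let mut words: Vec<u32> = Vec::with_capacity(4 + self.leaves.len() * 12);
        words.extend_from_slice(&self.root);
        words.push(self.width);
        for leaf in &self.leaves {
            words.push(leaf.voxel.id);
            words.extend_from_slice(&leaf.voxel.color);
            words.extend_from_slice(&leaf.children);
        }
        words.iter().flat_map(|w| w.to_le_bytes()).collect()
    }
}

fn main() {
    let tree = Octree {
        root: [0, 0, 0],
        width: 256,
        leaves: vec![Leaf {
            voxel: OctreeVoxel { id: 3, color: [0, 255, 0] },
            children: [u32::MAX; 8],
        }],
    };
    let bytes = tree.to_gpu_bytes();
    // 4 header words + 12 words for the single leaf = 16 words = 64 bytes.
    assert_eq!(bytes.len(), 64);
    // The first leaf's id sits immediately after the 16-byte header,
    // exactly where the WGSL struct expects it.
    assert_eq!(&bytes[16..20], &3u32.to_le_bytes());
}
```

The result can be uploaded with render_queue.write_buffer(&buffer, 0, &tree.to_gpu_bytes()). The bytemuck crate with #[repr(C)] Pod structs is a common alternative for the fixed-size parts, but the Vec<Leaf> still has to be flattened into a contiguous byte slice one way or another.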