I am looking for an elegant and hopefully Bevy-esque way of rendering to a `wgpu::Texture`. The reason is that I'm implementing a WebXR library, and the `WebXRFramebuffer` must be rendered to in immersive XR.
```rust
let framebuffer = /* get framebuffer from web_sys */;

// Wrap the external framebuffer in a wgpu::Texture via the GLES HAL backend.
let texture: wgpu::Texture = unsafe {
    device.create_texture_from_hal::<wgpu_hal::gles::Api>(
        wgpu_hal::gles::Texture {
            inner: wgpu_hal::gles::TextureInner::ExternalFramebuffer {
                inner: framebuffer,
                ...
```
My question is: once I have created this `wgpu::Texture`, is there a way to either:

- set it as the main pass texture of the Bevy engine, or
- render the cameras to a `bevy::Image` and blit that to the `wgpu::Texture`?
I've seen examples like the Superconductor engine doing a lot of low-level wgpu work to achieve this, but it feels like there should be a simpler way with recent Bevy features like the render graph and camera render targets.
Rendering to a `TextureView` is supported as of Bevy 0.11; see the PR for more info. Here's an excerpt from my `web_xr` crate that sets the `TextureView` from a `gl_layer` as a target that cameras can render to. Look here for creating the camera and here for applying the texture view.
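For reference, a minimal sketch of what that wiring looks like with the Bevy 0.11 API: register the external view in the `ManualTextureViews` resource under a handle, then point a camera's `RenderTarget::TextureView` at that handle. The function `make_webxr_texture_view` is a placeholder for however you obtain the `TextureView` and its size from the XR framebuffer; the handle id is arbitrary.

```rust
use bevy::prelude::*;
use bevy::render::camera::{
    ManualTextureView, ManualTextureViewHandle, ManualTextureViews, RenderTarget,
};

// Any id works; it only needs to be unique within ManualTextureViews.
const XR_TEXTURE_VIEW: ManualTextureViewHandle = ManualTextureViewHandle(4242);

fn setup_xr_camera(
    mut commands: Commands,
    mut manual_views: ResMut<ManualTextureViews>,
) {
    // Placeholder: build a TextureView (plus its pixel size) from the
    // external WebXR framebuffer texture, e.g. the one created above
    // with create_texture_from_hal.
    let (texture_view, size): (bevy::render::render_resource::TextureView, UVec2) =
        make_webxr_texture_view();

    // Register the external view under our handle...
    manual_views.insert(
        XR_TEXTURE_VIEW,
        ManualTextureView::with_default_format(texture_view, size),
    );

    // ...and render a camera into it instead of the window surface.
    commands.spawn(Camera3dBundle {
        camera: Camera {
            target: RenderTarget::TextureView(XR_TEXTURE_VIEW),
            ..default()
        },
        ..default()
    });
}
```

Because the view is looked up by handle each frame, you can also swap the entry in `ManualTextureViews` when the XR session hands you a new framebuffer, without touching the camera entity.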