How do you ensure background is cleared correctly in WebGPU?


Is there a magic flag I need to set to get correct clearing of the background when rendering with WebGPU? My render works fine except that, whatever settings I use, I see the garbage that was last displayed in the browser window instead of the background color. (If I set a non-zero clearColor in the attachment, it keeps accumulating that color until it maxes out.)

I am using WebGPU via the Emscripten C++ interface, running Chrome Canary on Windows (Version 90.0.4407.0 (Official Build) canary (64-bit)).

My frame render looks like this (called from requestAnimationFrame on the js side):

// Acquire the current backbuffer view for this frame.
WGPUSwapChain swapChain                 = _pWindow->swapChain();
WGPUTextureView backbufferView          = wgpuSwapChainGetCurrentTextureView(swapChain);
// Describe a single color attachment that clears the backbuffer before drawing.
WGPURenderPassDescriptor renderpassInfo = {};
WGPURenderPassColorAttachmentDescriptor colorAttachment = {};
{
    colorAttachment.attachment            = backbufferView;
    colorAttachment.resolveTarget         = nullptr;
    colorAttachment.clearColor            = { 0.0f, 0.0f, 0.0f, 0.0f };
    colorAttachment.loadOp                = WGPULoadOp_Clear;
    colorAttachment.storeOp               = WGPUStoreOp_Store;
    renderpassInfo.colorAttachmentCount   = 1;
    renderpassInfo.colorAttachments       = &colorAttachment;
    renderpassInfo.depthStencilAttachment = nullptr;
}
WGPUCommandBuffer commands;
{
    // Record the render pass into a command buffer.
    WGPUCommandEncoder encoder = wgpuDeviceCreateCommandEncoder(_device, nullptr);

    WGPURenderPassEncoder pass = wgpuCommandEncoderBeginRenderPass(encoder, &renderpassInfo);
    wgpuRenderPassEncoderSetPipeline(pass, _pipeline);
    wgpuRenderPassEncoderSetVertexBuffer(pass, 0, _vb, 0, 0);
    wgpuRenderPassEncoderDraw(pass, 3, 1, 0, 0);
    wgpuRenderPassEncoderEndPass(pass);
    wgpuRenderPassEncoderRelease(pass);

    
    commands = wgpuCommandEncoderFinish(encoder, nullptr);
    wgpuCommandEncoderRelease(encoder);
}

// Submit the frame and release the per-frame objects.
wgpuQueueSubmit(_queue, 1, &commands);
wgpuCommandBufferRelease(commands);
wgpuTextureViewRelease(backbufferView);
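
For context, this is driven once per frame from requestAnimationFrame on the JS side. An equivalent driver written on the C++ side with Emscripten's html5.h animation-frame API might look like the sketch below; renderFrame() is a hypothetical wrapper around the frame code above.

#include <emscripten/html5.h>

extern void renderFrame();  // hypothetical wrapper around the frame code above

static EM_BOOL onAnimationFrame(double /*timeMs*/, void* /*userData*/)
{
    renderFrame();
    return EM_TRUE;  // keep the browser calling us every frame
}

void startRenderLoop()
{
    // Equivalent to a recursive requestAnimationFrame loop on the JS side.
    emscripten_request_animation_frame_loop(onAnimationFrame, nullptr);
}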

The pipeline is set up with these settings:

WGPURenderPipelineDescriptor descriptor = {};
WGPUBlendDescriptor blendDescriptor           = {};
blendDescriptor.operation                     = WGPUBlendOperation_Add;
blendDescriptor.srcFactor                     = WGPUBlendFactor_One;
blendDescriptor.dstFactor                     = WGPUBlendFactor_Zero;
WGPUColorStateDescriptor colorStateDescriptor = {};
colorStateDescriptor.format                   = _colorFormat;
colorStateDescriptor.alphaBlend               = blendDescriptor;
colorStateDescriptor.colorBlend               = blendDescriptor;
colorStateDescriptor.writeMask                = WGPUColorWriteMask_All;

descriptor.colorStateCount = 1;
descriptor.colorStates     = &colorStateDescriptor;

Is there a setting I've missed, or is this a Canary bug?

1 Answer

I had a zero alpha value, and that was messing it up :( Changing the alpha to one in the clearColor field works fine:

colorAttachment.clearColor            = { 0.0f, 0.0f, 0.0f, 1.0f };
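
For completeness, a sketch of the corrected attachment setup, identical to the question except for the alpha component:

// Same attachment setup as in the question, but clearing to opaque black.
WGPURenderPassColorAttachmentDescriptor colorAttachment = {};
colorAttachment.attachment    = backbufferView;
colorAttachment.resolveTarget = nullptr;
colorAttachment.clearColor    = { 0.0f, 0.0f, 0.0f, 1.0f };  // alpha = 1.0 instead of 0.0
colorAttachment.loadOp        = WGPULoadOp_Clear;
colorAttachment.storeOp       = WGPUStoreOp_Store;

With the clear color fully opaque, the previously displayed contents no longer show through behind the render.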