Converting 32-bit data into 8-bit data for 3D textures


I am trying to save a 3D DXGI_FORMAT_R8G8B8A8_UNORM texture to a file, then load that texture back afterwards. This works for my 3D DXGI_FORMAT_R32G32B32A32_FLOAT texture, which saves and loads correctly. However, it is not working for the 8-bit texture: only part of the data is loaded back, and repeated save/load cycles eventually result in a null texture.

Both textures are 4x4x4. Here are the first 4 (out of 64) lines of the file when outputting the 32-bit texture:

44b5 54be 8098 453e b48d d4bd 0000 413f
f2ec d83e 574a a93e 468c 56bf 0080 633f
ee9c 823e 3c22 1b3f 9465 3a3f 0000 383f
6512 ecbd 1a02 3a3c 86cc e8bc 0000 b33e

And this loads back fine; I am using std::ifstream and std::ofstream to read/write.

The RGBA8 texture's file looks similar and fills 16 lines, which is to be expected, as that is a quarter of the data size (64 texels x 4 bytes = 256 bytes, versus 64 x 16 = 1024 bytes for the float texture).

But when loading, it appears to only partially load the data, resulting in a partially filled texture. As the data is displayed as big-endian hex, I believe the partial loading is producing the incorrect colours.
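To rule endianness in or out, I was thinking of dumping the buffer one byte at a time in file order, so the grouping used by the hex viewer cannot reorder anything. A rough sketch (DumpBytes is a hypothetical helper, not from my real code):

    #include <cstdint>
    #include <cstdio>

    // Hypothetical: print a buffer byte-by-byte in file order, so the
    // output cannot be reordered by how a viewer groups multi-byte words.
    void DumpBytes(const std::uint8_t* data, std::size_t size)
    {
        for (std::size_t i = 0; i < size; ++i)
            std::printf("%02x%s", data[i], (i % 16 == 15) ? "\n" : " ");
    }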

I need a way to save out and load in the 8-bit data correctly.

I am reading each texel of the texture and saving it to a file:

    textureQueryResult.CopyToReadBuffer(deviceContext);
    const void* result = textureQueryResult.OpenReadBuffer(deviceContext);

    if (result)
    {
        std::ofstream ostrm(filename_utf8, std::ofstream::binary);
        ostrm.write(reinterpret_cast<const char*>(result), bufferSize);
    }
    textureQueryResult.CloseReadBuffer(deviceContext);
    return true;
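For reference, bufferSize is intended to be the tightly packed size of the whole texture. A sketch of that intent (ByteSizeOfFormatElement is my own helper, used again in the loading code below):

    // Tightly packed size: 4 x 4 x 4 texels * bytes per texel
    // (4 for DXGI_FORMAT_R8G8B8A8_UNORM, 16 for DXGI_FORMAT_R32G32B32A32_FLOAT)
    size_t bufferSize = size_t(width) * length * depth
                        * ByteSizeOfFormatElement(dxgi_format);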

And I can see the values are correct in the read buffer: when I cast the result to float, I can see all the values I expect. But when saved out, it only appears to be saving a quarter of them, as if they are packed into the result, so I need to know how to unpack it.
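This is how I would expect to unpack the mapped data if it really is tightly packed RGBA8: four consecutive bytes per texel, each mapped from 0-255 back to 0-1. A rough sketch (DumpRgba8AsFloats, data, and texelCount are placeholder names, not from my real code):

    #include <cstdint>
    #include <cstddef>

    // Interpret a tightly packed RGBA8 buffer texel by texel.
    // Each texel is four consecutive bytes: R, G, B, A.
    void DumpRgba8AsFloats(const void* data, std::size_t texelCount)
    {
        const std::uint8_t* bytes = static_cast<const std::uint8_t*>(data);
        for (std::size_t i = 0; i < texelCount; ++i) // 4*4*4 = 64 texels here
        {
            float r = bytes[i * 4 + 0] / 255.0f;
            float g = bytes[i * 4 + 1] / 255.0f;
            float b = bytes[i * 4 + 2] / 255.0f;
            float a = bytes[i * 4 + 3] / 255.0f;
            // ...inspect or print r, g, b, a...
        }
    }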

Here is part of my loading code, where I specify the format for each texture:

    int byteSize = ByteSizeOfFormatElement(dxgi_format); // 4 for RGBA8, 16 for RGBA32
    tdesc.Width = tc->w;        // 4
    tdesc.Height = tc->l;       // 4
    tdesc.Format = dxgi_format; // DXGI_FORMAT_R8G8B8A8_UNORM for my broken texture
    tdesc.Depth = tc->d;        // 4
    tdesc.MipLevels = tc->mips; // 1
    dim = 3;
    tdesc.Usage = D3D11_USAGE_DEFAULT;
    tdesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_UNORDERED_ACCESS;
    tdesc.CPUAccessFlags = 0;
    tdesc.MiscFlags = 0;

    std::ifstream istrm(pFilePathUtf8, std::ios::binary);
    //istrm.seekg(0, std::ios::end);
    long bufferLength = width * length * depth * byteSize; // ensuring buffer size is as expected
    //istrm.seekg(0, std::ios::beg);

    char* buffer = new char[bufferLength];
    istrm.read(buffer, bufferLength);

    D3D11_SUBRESOURCE_DATA srd;
    srd.pSysMem = buffer;
    srd.SysMemPitch = tc->w * byteSize;
    srd.SysMemSlicePitch = tc->w * tc->l * byteSize;
    V_CHECK(CreateTexture3D(&tdesc, &srd, (ID3D11Texture3D**)(&texture)));
    delete[] buffer;
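One check I am planning to add (a sketch, not verbatim from my code) is to verify the read actually filled the buffer, in case the file is shorter than expected:

    istrm.read(buffer, bufferLength);
    if (istrm.gcount() != bufferLength)
    {
        // The file holds fewer than width*length*depth*byteSize bytes,
        // so the texture would be created from partly uninitialized memory.
    }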

I think what I need is a way for my hexadecimal (big-endian) data to be converted to 8-bit? Apologies for my rambling and confusion.

Thank you!
