I am trying to create a function that loads texture files in the .DDS format. The problem seems to occur when my program tries to detect the FourCC format of the file: the comparison succeeds with none of FOURCC_DXT1, FOURCC_DXT3, or FOURCC_DXT5. I have tried many textures, some of which are DXT1 according to the source I took them from, but the problem persists.
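For reference, these are the headers and FourCC constants the function relies on. I define the constants the usual way, with the same values the opengl-tutorial.org loader uses (the "DXTn" character codes packed as little-endian 32-bit integers), and I get the GL_COMPRESSED_* tokens from GLEW:

#include <cstdio>
#include <cstdlib>
#include <iostream>
#include <GL/glew.h> // exposes the GL_COMPRESSED_RGBA_S3TC_DXTn_EXT tokens

// "DXT1", "DXT3", "DXT5" packed as little-endian 32-bit integers
#define FOURCC_DXT1 0x31545844
#define FOURCC_DXT3 0x33545844
#define FOURCC_DXT5 0x35545844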
GLuint loadDDS(const char * imagepath) {
    unsigned char header[124];

    FILE *fp;
    // open the texture file
    fopen_s(&fp, imagepath, "rb");
    if (fp == NULL) return 0;
    std::cout << "file: " << fp << '\n';

    // get the surface description
    fread(&header, 124, 1, fp);

    // header data
    unsigned int height      = *(unsigned int*)&(header[8]);
    unsigned int width       = *(unsigned int*)&(header[12]);
    unsigned int linearSize  = *(unsigned int*)&(header[16]);
    unsigned int mipMapCount = *(unsigned int*)&(header[24]);
    unsigned int fourCC      = *(unsigned int*)&(header[80]);

    // file data
    unsigned char * buffer;
    unsigned int bufsize;
    // how big is it going to be, including all mipmaps?
    bufsize = mipMapCount > 1 ? linearSize * 2 : linearSize;
    buffer = (unsigned char*)malloc(bufsize * sizeof(unsigned char));
    fread(buffer, 1, bufsize, fp);
    // close the file pointer
    fclose(fp);

    unsigned int components = (fourCC == FOURCC_DXT1) ? 3 : 4; // (currently unused)
    unsigned int format;
    // here is where the problem occurs: the switch falls through to the default case.
    // At this point fourCC has the value 4.
    switch (fourCC)
    {
    case FOURCC_DXT1:
        format = GL_COMPRESSED_RGBA_S3TC_DXT1_EXT;
        break;
    case FOURCC_DXT3:
        format = GL_COMPRESSED_RGBA_S3TC_DXT3_EXT;
        break;
    case FOURCC_DXT5:
        format = GL_COMPRESSED_RGBA_S3TC_DXT5_EXT;
        break;
    default:
        std::cout << "Error when reading the texture header: could not match the FourCC code" << '\n';
        free(buffer);
        return 0;
    }

    // create one OpenGL texture
    GLuint textureID;
    glGenTextures(1, &textureID);
    // "bind" the newly created texture: all future texture functions will modify it
    glBindTexture(GL_TEXTURE_2D, textureID);

    unsigned int blockSize = (format == GL_COMPRESSED_RGBA_S3TC_DXT1_EXT) ? 8 : 16;
    unsigned int offset = 0;
    // upload the mipmaps, level by level
    for (unsigned int level = 0; level < mipMapCount && (width || height); ++level)
    {
        unsigned int size = ((width + 3) / 4) * ((height + 3) / 4) * blockSize;
        glCompressedTexImage2D(GL_TEXTURE_2D, level, format, width, height,
            0, size, buffer + offset);
        offset += size;
        width  /= 2;
        height /= 2;
    }
    free(buffer);
    return textureID;
}
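For completeness, this is how I call the loader (the path is just an example):

GLuint texture = loadDDS("textures/stone.dds");
if (texture == 0)
    std::cout << "texture load failed\n";
glBindTexture(GL_TEXTURE_2D, texture);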
I am too lazy to debug your code, and as this type of question is off topic here, do not take this as an answer (it is more like a comment, but as a comment it would be neither readable nor possible to post).
Here is what I use for loading DDS into my OpenGL engine:
It's C++/VCL based, so you can extract/port it to detect the file format... I wrote this quite a long time ago; if I remember correctly, the only things used from the VCL are Bitmap, file access, and AnsiString. AnsiString is a self-allocating string variable capable of string arithmetic; beware that its letters are accessed starting from index 1. The Bitmap use is described here, and file access is straightforward, so porting it should not pose any problem.
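For example, the indexing difference you have to account for when porting (std::string being the obvious portable replacement; a hypothetical two-liner just to illustrate the 1-based access):

#include <vcl.h>    // VCL AnsiString (C++Builder)
#include <string>

AnsiString a = "DDS ";   // VCL string: 1-based, a[1] == 'D'
std::string s = "DDS ";  // standard string: 0-based, s[0] == 'D'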
So either port/use this, or compare it with your detection to locate your bug... As you can see, I do not support all the formats, but it's a working starting point.
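If you want a quick way to compare against your detection, here is a minimal plain-C++ sketch of just the header parsing (no VCL, no GL upload; the function name and structure are mine, and the offsets follow the documented DDS layout, where the 4-byte "DDS " magic comes before the 124-byte header):

#include <cstdio>
#include <cstring>
#include <cstdint>

// returns the FourCC of a DDS file, or 0 on failure
// usage: if (ddsFourCC("test.dds") == FOURCC_DXT1) ...
std::uint32_t ddsFourCC(const char *path)
{
    FILE *f = std::fopen(path, "rb");
    if (!f) return 0;
    unsigned char magic[4];
    unsigned char header[124];
    // a DDS file starts with the 4-byte magic "DDS ", then the 124-byte header
    if (std::fread(magic, 1, 4, f) != 4 ||
        std::memcmp(magic, "DDS ", 4) != 0 ||
        std::fread(header, 1, 124, f) != 124)
    {
        std::fclose(f);
        return 0;
    }
    std::fclose(f);
    std::uint32_t fourCC;
    // ddspf.dwFourCC sits at offset 80 inside the header (offset 84 in the file)
    std::memcpy(&fourCC, header + 80, 4);
    return fourCC;
}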