I've come across a piece of code that I find suspicious; it's used extensively in most of the classes of a project.
It's an overload of new and delete that looks like this:
void* MyObject::operator new ( size_t size )
{
return ( size == 0 ? NULL : new char[size] ) ;
}
void* MyObject::operator new[] ( size_t size )
{
return ( size == 0 ? NULL : new char[size] ) ;
}
void MyObject::operator delete( void *p )
{
char* l_tmp = (char*)p;
delete[] l_tmp;
}
void MyObject::operator delete[]( void *p )
{
char* l_tmp = (char*)p;
delete[] l_tmp;
}
Can this be harmful in any way with regard to memory, speed, or stability?
Does it make any sense?
In most standards-compliant C++ settings, new should throw std::bad_alloc rather than return nullptr. It's unclear without a lot more context whether changing this behavior for this specific class is actually desirable in your case, or if the code was just written by someone who hates exceptions; however, the fact that the original author wrote NULL rather than nullptr is circumstantial evidence marking it as "old school" in a way that makes me suspect the latter.
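Note also why returning NULL matters here: because these overloads are not declared noexcept, the compiler-generated new-expression does not null-check their result, so a NULL return means the MyObject constructor runs on a null pointer. For comparison, here is a minimal sketch of what conforming class-specific overloads could look like if the intent is simply to route allocations through the global operators; the class name MyObject is taken from your snippet, and the forwarding bodies are illustrative, not anything your project necessarily needs:

#include <cstddef> // std::size_t
#include <new>     // std::bad_alloc, ::operator new/delete

class MyObject
{
public:
    void* operator new(std::size_t size)
    {
        // ::operator new throws std::bad_alloc on failure and returns a
        // valid non-null pointer even for size == 0, so no special case
        // is needed here.
        return ::operator new(size);
    }

    void* operator new[](std::size_t size)
    {
        return ::operator new[](size);
    }

    void operator delete(void* p) noexcept
    {
        ::operator delete(p);
    }

    void operator delete[](void* p) noexcept
    {
        ::operator delete[](p);
    }
};

Of course, overloads that merely forward like this add nothing over the defaults; if the project doesn't actually need custom allocation behavior, they can usually be removed altogether.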