I use the !ObjSize command to get the true size of an object. For example, when I run the command below, it tells me that the size of the object at address 00000003a275f218 is 0x18, which translates to 24 in decimal.
0:000> !ObjSize 00000003a275f218
sizeof(00000003a275f218) = 24 (0x18) bytes
So far so good. But when I run the same command on another object, there seems to be a discrepancy between the hex and decimal sizes.

So the size in hex is 0xafbde200. When I convert it to decimal with my calculator, I get 2948456960, whereas the command output shows the decimal size as -1346510336. Can someone help me understand why the sizes differ?
It's a bug in SOS. If you look at the source code of the !ObjSize implementation, you'll find that the size is printed with %d as the format specifier, which is for signed decimal integers. It should be %u for unsigned decimal integers instead, since obviously an object can't occupy a negative amount of memory. If you know how to use Git, you can provide a patch.
You can use the ? (evaluate expression) command in WinDbg to see the unsigned value:
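For example (an illustrative session transcript; the exact output formatting may vary by WinDbg version):

0:000> ? 0xafbde200
Evaluate expression: 2948456960 = 00000000`afbde200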