Apart from the maximum amount of data they can store, what is the implementation/storage difference between `varbinary(8000)` and `varbinary(max)` in SQL Server 2019?

The official docs just say to use `varbinary(max)` for column data larger than 8000 bytes, without going into any detail about how, for example, a theoretical `varbinary(9999)` (if one could create it) would differ from `varbinary(max)`.
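To make the comparison concrete, here is a minimal T-SQL sketch of the two declarations in question (table and column names are made up for illustration); the metadata in `sys.columns` already hints that the engine treats them differently, reporting `max_length = 8000` for the sized column but `-1` for the `max` column:

```sql
-- Hypothetical tables, one column each, to compare the two declarations.
CREATE TABLE dbo.FixedCap (Payload varbinary(8000) NULL);
CREATE TABLE dbo.LobCap   (Payload varbinary(max)  NULL);

-- Inspect how SQL Server records the length of each column:
SELECT OBJECT_NAME(c.object_id) AS table_name,
       c.name                   AS column_name,
       c.max_length             -- 8000 for FixedCap, -1 for LobCap
FROM sys.columns AS c
WHERE c.name = N'Payload';
```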