Specific Effects of using varbinary vs nvarchar for storing images


I have seen the statement that if you are storing images in a column in a SQL Server database, you should use varbinary instead of nvarchar. It makes 100% sense to store binary data as varbinary, and I do not question it.

But I have not seen any statements as to what problems might occur if you use nvarchar(max).

My dilemma is that I have inherited a database solution that is in production and stores images in an nvarchar column. For now it is working, and I'll need a little more than 'best practices' to push this change through our development process as a priority change. I need something to point to along the lines of 'bad things A, B and C will happen if we don't make this change', but I have been unable to find such statements.

So does anyone have any concrete examples of bad things that have happened when storing images as nvarchar(max) instead of varbinary(max)?

So far the nvarchar(max) solution is working. I am looking for what could potentially go wrong if I do not switch it to varbinary(max).

Answer by Martin Smith:

If the source binary data is an odd number of bytes, then round-tripping through nvarchar will add an extra 0x00 byte (Fiddle). This may be the most promising avenue to explore, but if the solution "is working" I guess this doesn't cause a problem for you for some reason (either you only supply source data with an even number of bytes, or the additional byte doesn't cause an issue).
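For instance, a minimal repro of that padding behaviour (with illustrative values of my own, not taken from the Fiddle) might look like this:

```sql
-- Round trip an odd and an even number of bytes through nvarchar(max).
-- nvarchar is UTF-16, so every character occupies two bytes; an
-- odd-length value gets its final byte pair padded with 0x00 on the way in.
DECLARE @odd  varbinary(max) = 0x414243;    -- 3 bytes
DECLARE @even varbinary(max) = 0x41424344;  -- 4 bytes

SELECT
    CONVERT(varbinary(max), CONVERT(nvarchar(max), @odd))  AS odd_roundtrip,  -- 0x41424300: one byte longer
    CONVERT(varbinary(max), CONVERT(nvarchar(max), @even)) AS even_roundtrip; -- 0x41424344: unchanged
```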

Apart from that, the values should round-trip within SQL Server unchanged for all 65,536 possible byte pairs (Fiddle).
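A sketch of how such an exhaustive check could be written (this assumes sys.all_objects as a convenient row source; any tally table would do, and it is not necessarily how the Fiddle does it):

```sql
-- Generate every possible 2-byte value, round trip each one through
-- nvarchar(1), and count the values that come back changed.
-- Per the claim above, this should report 0 mismatches.
WITH n AS (
    SELECT TOP (65536)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS i
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
),
pairs AS (
    SELECT CONVERT(binary(2), i) AS pair FROM n   -- 0x0000 .. 0xFFFF
)
SELECT COUNT(*) AS mismatches
FROM pairs
WHERE CONVERT(varbinary(2), CONVERT(nvarchar(1), pair)) <> pair;
```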

So the only remaining issue would be if someone were to take the string, change its text encoding (which could happen very easily and implicitly if the string itself is being passed outside the database), and then convert that back to binary. As the solution "is working", presumably no one is in fact doing that, though.
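To make that failure mode concrete, here is one in-database variant of it: letting the "image string" slip through varchar (a single-byte code page) and back. The exact replacement characters depend on your collation's code page; the values in the comments assume a typical code-page-1252 collation.

```sql
-- An implicit trip through varchar changes the encoding: any UTF-16
-- code unit with no equivalent in the code page is replaced (often by '?'),
-- silently corrupting the underlying image bytes.
DECLARE @img varbinary(max) = 0x3412;  -- arbitrary byte pair = U+1234 in UTF-16LE
DECLARE @s   nvarchar(max)  = CONVERT(nvarchar(max), @img);

DECLARE @leaked varchar(max) = @s;     -- e.g. a non-Unicode column, variable or client

SELECT
    CONVERT(varbinary(max), @s)                              AS original,  -- 0x3412
    CONVERT(varbinary(max), CONVERT(nvarchar(max), @leaked)) AS corrupted; -- e.g. 0x3F00 ('?')
```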