I know you already have good answers, but I think SQL_VARIANT_PROPERTY is being misunderstood here.
You use SQL_VARIANT_PROPERTY on a value (a column or a variable) and specify which metadata property you want from it. However, if the value is NULL, it does not tell you much.
E.g.:
declare @Start date = getdate()
      , @End datetime = getdate()
      , @Int int = 1
      , @DateNull date;

select sql_variant_property(@Start, 'BaseType')
     , sql_variant_property(@End, 'BaseType')
     , sql_variant_property(@Int, 'BaseType')
     , sql_variant_property(@DateNull, 'BaseType');
This will return three base types and a NULL. NULL handling is a big part of SQL. A lot of people, me included, sometimes want to replace a NULL with a value that represents it, and at other times they don't care. SQL_VARIANT_PROPERTY only answers 'BaseType' for a populated value; otherwise it returns NULL. As far as I know, that is because SQL is effectively saying: "You have no data here, so nothing needs to be defined or take up storage."
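To make that concrete, here is a minimal sketch (reusing the @DateNull variable declared above) showing that once the NULL is replaced with an actual value, SQL_VARIANT_PROPERTY has something to report again:

-- on the raw NULL variable, the property itself comes back NULL
select sql_variant_property(@DateNull, 'BaseType');                     -- NULL

-- wrap it in isnull so the expression carries a value;
-- now a base type comes back ('date', inherited from @DateNull)
select sql_variant_property(isnull(@DateNull, getdate()), 'BaseType');  -- date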
Usually I will write isnull(@thing, 0) when working with integers where I explicitly want the dataset to show zeros for unknowns. In other cases I want the user to know a NULL occurred, and I do something like isnull(@thing, 'not present') for the report. You can also use coalesce to chain a series of fallbacks: coalesce(@thing, @otherthing, @yetotherthing, 'unknown'). A short sketch of both patterns follows.
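For illustration only (the variable names here are placeholders, not from your code):

declare @thing int = null
      , @otherthing int = null
      , @yetotherthing int = 42;

-- force unknowns to zero so they participate in arithmetic/aggregation
select isnull(@thing, 0) as thing_or_zero;

-- surface the NULL to the reader of a report instead of hiding it
select isnull(cast(@thing as varchar(20)), 'not present') as thing_for_report;

-- coalesce walks the list and returns the first non-NULL value;
-- cast to a common type if you want a string fallback like 'unknown'
select coalesce(cast(@thing as varchar(20)),
                cast(@otherthing as varchar(20)),
                cast(@yetotherthing as varchar(20)),
                'unknown') as first_known_value;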
I think that in the code you are looking at, someone is converting something where I'm not sure you really need to. The column's data type in the table already supports what it needs to hold, and SQL will not reserve storage for it when it is NULL. So the need to change it seems arbitrary, IMHO. I know you can manage storage when you expect a lot of NULLs with the SPARSE column attribute, introduced, I believe, in SQL Server 2008. But I don't see the point of casting something that consumes almost nothing into something bigger.
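In case it helps, a minimal, hypothetical example of declaring a sparse column (the table and column names are made up):

create table dbo.Example
(
    Id        int identity(1,1) primary key,
    -- SPARSE uses no storage for NULLs, at the cost of a little
    -- extra overhead on the rows that do carry a value
    RareValue int sparse null
);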