Understanding MS SQL Server Date Types
01-11-2019
Question
Consider the following:
declare @dt datetime, @dt2 datetime2, @d date
set @dt = '2013-01-01'
set @dt2 = '2013-01-01'
set @d = '2013-01-01'
select convert(varbinary, @dt) as dt,
convert(varbinary, @dt2) as dt2,
convert(varbinary, @d) as d
Output:
dt                 dt2                  d
------------------ -------------------- --------
0x0000A13900000000 0x07000000000094360B 0x94360B
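For reference, those datetime bytes can be decoded directly. This is just a sketch assuming the widely documented layout (the first four bytes are a signed day count relative to 1900-01-01, the last four are 1/300-second clock ticks since midnight):
declare @dt datetime = '2013-01-01'
declare @raw binary(8) = convert(binary(8), @dt)
select cast(substring(@raw, 1, 4) as int) as days_since_1900,      -- 41273 = 0x0000A139
       cast(substring(@raw, 5, 4) as int) as ticks_since_midnight  -- 0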
Now, I already understand from the documentation that datetime has a smaller range, starting at 1753-01-01, while datetime2 and date use 0001-01-01 as their start date.
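A quick sanity check of those ranges (a sketch; the second assignment fails because 0001-01-01 is below datetime's minimum of 1753-01-01):
declare @d1 date = '0001-01-01'        -- fine: within date's range
declare @dt1 datetime = '0001-01-01'   -- fails: out-of-range value for datetime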
What I don't understand, though, is that datetime appears to be little-endian while datetime2 and date are big-endian. If that's the case, how can they even be properly sortable?
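To make the sortability concern concrete, here is a sketch (the byte images are assumptions extrapolated from the output above): ordering by the date column itself is chronological, even though a bytewise ordering of its varbinary image would not be, because the low-order byte comes first.
declare @t table (d date)
insert into @t values ('0001-01-31'), ('0001-09-14')  -- day 30 and day 256
select d, convert(varbinary(3), d) as d_bytes          -- 0x1E0000 and 0x000100
from @t
order by d            -- chronological: 0001-01-31 first
-- order by d_bytes   -- bytewise: 0x000100 would sort first, i.e. the later date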
Consider if I want to know how many integer days are represented by a date value. You would think you could do this:
declare @d date
set @d = '0001-01-31'
select cast(convert(varbinary, @d) as int)
But due to the endianness, you get 1966080 days!
To get the correct result of 30 days, you have to reverse it:
select cast(convert(varbinary,reverse(convert(varbinary, @d))) as int)
Or, of course you can do this:
select datediff(d,'0001-01-01', @d)
But that means internally somewhere it is reversing the bytes anyway.
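Indeed, the same reversal recovers the day count from the datetime2 bytes in the first snippet, assuming the layout that output suggests (one precision byte, five time bytes for the default datetime2(7), then three date bytes with the low-order byte first), and it agrees with datediff:
declare @dt2 datetime2 = '2013-01-01'
declare @raw varbinary(9) = convert(varbinary(9), @dt2)
-- last three bytes are the date portion: 0x94360B reversed is 0x0B3694 = 734868
select cast(convert(varbinary(3), reverse(substring(@raw, 7, 3))) as int) as days_since_0001,
       datediff(d, '0001-01-01', @dt2) as check_value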
So why did they switch endianness?
I only care because I'm working on a custom UDT in SQLCLR, and the binary order of the bytes does seem to matter there, but these built-in types seem much more flexible. Does SQL Server have something internal where each type gets to provide its own sorting algorithm? And if so, is there a way I can tap into that for my custom UDT?
See also a related (but different) question on Stack Overflow.
No correct solution