C# DateTime subtraction and time zones

I have this line of code:

 double seconds = new DateTime(2006,7,6,12,1,0,DateTimeKind.Local).Subtract(new DateTime(1970,1,1,0,0,0,DateTimeKind.Local)).TotalSeconds;

This gave me a number I didn't expect, so I tried the following:

 double seconds = new DateTime(2006,7,6,12,1,0,DateTimeKind.Local).Subtract(new DateTime(1970,1,1,0,0,0,DateTimeKind.Utc)).TotalSeconds;

(The difference is that in one case I use local time for the epoch, and in the other UTC.) Interestingly, both give me the same value, and I don't know why. My time zone is GMT-6, so DateTimeKind.Local really ought to change the result.

Thanks in advance!


The MSDN DateTimeKind page ( http://msdn.microsoft.com/en-us/library/shx7s921.aspx ) says:

The members of the DateTimeKind enumeration are used in conversion operations between local time and coordinated universal time (UTC), but not in comparison or in arithmetic operations. For more information about time conversions, see Convert time between time zones.
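This is exactly what the question is running into: `DateTime.Subtract` compares raw ticks and ignores `Kind` entirely. A minimal sketch using the values from the question (the class name `KindDemo` is mine) shows that swapping the epoch's `Kind` between `Local` and `Utc` changes nothing:

```csharp
using System;

class KindDemo
{
    static void Main()
    {
        // The same wall-clock instants as in the question, differing only in Kind.
        var localEpoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Local);
        var utcEpoch   = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        var t = new DateTime(2006, 7, 6, 12, 1, 0, DateTimeKind.Local);

        // Subtract operates on Ticks only; Kind plays no part,
        // so both differences come out identical.
        double a = t.Subtract(localEpoch).TotalSeconds;
        double b = t.Subtract(utcEpoch).TotalSeconds;

        Console.WriteLine(a == b); // True
        Console.WriteLine(a);      // 1152187260
    }
}
```

To actually get a time-zone-aware difference, both operands have to be converted to UTC first, as the answers below do.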

The page recommends using TimeZoneInfo.ConvertTimeToUtc for conversions.

So, based on this, the code should probably be changed to:

 double seconds = new DateTime(2006,7,6,12,1,0,DateTimeKind.Local).Subtract(TimeZoneInfo.ConvertTimeToUtc(new DateTime(1970,1,1,0,0,0,DateTimeKind.Local))).TotalSeconds;

Try the following:

 namespace ConsoleApplication1
 {
     using System;

     class Program
     {
         static void Main( string[] args )
         {
             var laterDate = new DateTime( 2006, 7, 6, 12, 1, 0 );
             var earlyDate = new DateTime( 1970, 1, 1, 0, 0, 0 );

             // ToUniversalTime converts both values to UTC before subtracting,
             // so the difference accounts for the local time zone offset.
             var diff = laterDate.ToUniversalTime().Subtract( earlyDate.ToUniversalTime() );
             var seconds = diff.TotalSeconds;
         }
     }
 }
