I'm at a dead end. It makes no sense to me. The following code:
import java.text.SimpleDateFormat;
import java.util.Date;

// Two epoch-millisecond values exactly one hour (3,600,000 ms) apart
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
long lFirst = 1383460981000L;
long lSecond = 1383464581000L;
System.out.println(lFirst);
System.out.println(lSecond);
Date first = new Date(lFirst);
Date second = new Date(lSecond);
System.out.println(sdf.format(first));
System.out.println(sdf.format(second));
System.out.println(first.getTime());
System.out.println(second.getTime());
System.out.println("Diff" + (first.getTime() - second.getTime()));
System.out.println("Hours diff: " + (((float) (second.getTime() - first.getTime())) / 1000f / 60f / 60f));
outputs the following result:
1383460981000
1383464581000
2013-11-03 01:43:01.000
2013-11-03 01:43:01.000
1383460981000
1383464581000
Diff-3600000
Hours diff: 1.0
How can these two different long values produce exactly the same formatted date? I came across this while moving data from one kind of database to another and checking the results. I could not make sense of the validation errors I was seeing, so I wrote this small piece of code to compare the values, and it confirms that the two longs really are different. I am willing to accept that there is some time-zone handling going on in my databases, but that should not be an issue in this code example, since it operates on raw millisecond values.
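One quick way to check whether time zones are involved on the Java side (just a diagnostic sketch, reusing the two longs from above) is to format the same millisecond values twice: once with the default time zone and a 'z' in the pattern so the zone/DST abbreviation is printed, and once with the formatter pinned to UTC, where no DST transitions exist:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

long lFirst = 1383460981000L;
long lSecond = 1383464581000L;

// Default zone, with 'z' appended so the zone/DST abbreviation shows up
SimpleDateFormat local = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS z");

// Same pattern, but pinned to UTC, which has no DST rules
SimpleDateFormat utc = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS z");
utc.setTimeZone(TimeZone.getTimeZone("UTC"));

System.out.println("local: " + local.format(new Date(lFirst)));
System.out.println("local: " + local.format(new Date(lSecond)));
System.out.println("utc:   " + utc.format(new Date(lFirst)));
System.out.println("utc:   " + utc.format(new Date(lSecond)));

If the two UTC lines come out an hour apart while the local lines still show the same wall-clock time (differing only in the zone abbreviation), then the two values really are different instants and only their local rendering collapses them.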