vickirk's solution is not bad, but it has a time zone problem, which is why you end up one hour short.
I believe BST here stands for British Summer Time, which is GMT+0100. Now, java.util.Date and its subclasses store the number of milliseconds since midnight, January 1, 1970 GMT. The time zone does not come into play until you stringify a date/time via toString(), and for that they use your local time zone, which appears to be BST. This means that what is really stored in these objects is:
```
java.util.Date: Mon May 09 23:00:00 GMT 2011
java.sql.Time:  02:58:44 GMT
```
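You can verify this yourself with a minimal sketch; the class name and the pinned Europe/London zone are my assumptions, chosen so the output matches the values above:

```java
import java.sql.Time;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class InternalValueDemo {
    public static void main(String[] args) throws Exception {
        // Assumption: pin the default zone to Europe/London so the
        // output matches the BST values from the question.
        TimeZone.setDefault(TimeZone.getTimeZone("Europe/London"));

        // Both strings are parsed in the local zone, so toString()
        // shows BST while the internal value is GMT milliseconds.
        Date date = new SimpleDateFormat("yyyy-MM-dd").parse("2011-05-10");
        Time time = Time.valueOf("03:58:44");

        // Tue May 10 00:00:00 BST 2011 = 1304982000000 (May 09 23:00:00 GMT)
        System.out.println(date + " = " + date.getTime());
        // 03:58:44 = 10724000 (02:58:44 GMT)
        System.out.println(time + " = " + time.getTime());
    }
}
```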
When you add the internal values (which getTime() retrieves), as vickirk suggested, you get a date whose internal value is Tue May 10 01:58:44 GMT 2011, which prints as Tue May 10 02:58:44 BST 2011 when stringified.
So the explanation for the missing hour is that the time zone offset is applied twice when you stringify the values separately, but only once after adding them, because you now stringify only once. Put another way: adding the internal value of the point in time 03:58:44 BST is equivalent to adding a time interval of only 2h 58m 44s.
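Here is that effect reproduced in code, a sketch under the same assumptions as above:

```java
import java.sql.Time;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class NaiveMergeDemo {
    public static void main(String[] args) throws Exception {
        TimeZone.setDefault(TimeZone.getTimeZone("Europe/London")); // assumption
        Date date = new SimpleDateFormat("yyyy-MM-dd").parse("2011-05-10");
        Time time = Time.valueOf("03:58:44");

        // Naive merge: each internal value already had the zone offset
        // subtracted during parsing, but stringifying adds it back only once.
        Date naive = new Date(date.getTime() + time.getTime());
        // Internally Tue May 10 01:58:44 GMT 2011, printed as
        // Tue May 10 02:58:44 BST 2011 -- one hour short.
        System.out.println(naive);
    }
}
```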
So, to get the 3h 58m 44s time interval encoded in the java.sql.Time, you must compensate for the time zone offset manually. You can do this by parsing the time string "00:00:00" with java.sql.Time, which yields an internal value of -3600000, equivalent to December 31, 1969 23:00:00 GMT, i.e. one hour before the epoch. That is the negated time zone offset.
```java
import java.sql.Time;
import java.util.Date;

public static Date mergeDate(Date date, Time time) {
    // Time.valueOf("00:00:00") parses in the local zone, so its internal
    // value is the negated zone offset (-3600000 for BST); negate it back.
    long tzoffset = -(Time.valueOf("00:00:00").getTime());
    return new Date(date.getTime() + time.getTime() + tzoffset);
}
```
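Used with the values from the earlier sketches (again assuming the Europe/London zone and the same imports), this yields the expected point in time:

```java
Date date = new SimpleDateFormat("yyyy-MM-dd").parse("2011-05-10");
Time time = Time.valueOf("03:58:44");
System.out.println(mergeDate(date, time)); // Tue May 10 03:58:44 BST 2011
```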
Of course, all this dirty hacking is only necessary because you insist on interpreting the Time value as a time interval, while it really is a point in time.
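For comparison, if you treat both values as points in time, you can avoid the offset arithmetic entirely by copying calendar fields. This is my own sketch, not part of the original suggestion:

```java
import java.sql.Time;
import java.util.Calendar;
import java.util.Date;

// Offset-free alternative (assumption): copy the time-of-day fields
// in the local zone instead of adding raw millisecond values.
public static Date mergeWithCalendar(Date date, Time time) {
    Calendar d = Calendar.getInstance();
    d.setTime(date);
    Calendar t = Calendar.getInstance();
    t.setTime(time);
    d.set(Calendar.HOUR_OF_DAY, t.get(Calendar.HOUR_OF_DAY));
    d.set(Calendar.MINUTE, t.get(Calendar.MINUTE));
    d.set(Calendar.SECOND, t.get(Calendar.SECOND));
    d.set(Calendar.MILLISECOND, 0);
    return d.getTime();
}
```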