I have a taxi database with two datetime columns, 'BookedDateTime' and 'PickupDateTime'. The client wants to know the average waiting time from the moment a taxi is booked to the moment the driver actually picks up the client.
The database contains rows spanning a couple of months. The goal is to write a query that returns the average per day.
So a super simple example:
BookedDateTime          | PickupDateTime
2014-06-09 12:48:00.000 | 2014-06-09 12:45:00.000
2014-06-09 12:52:00.000 | 2014-06-09 12:58:00.000
2014-06-10 20:23:00.000 | 2014-06-10 20:28:00.000
2014-06-10 22:13:00.000 | 2014-06-10 22:13:00.000
2014-06-09: ((-3 + 6) / 2) = average 00:01:30.000 (1.5 minutes)
2014-06-10: ((5 + 0) / 2) = average 00:02:30.000 (2.5 minutes)
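I imagine the query would look something like the following, though I'm not sure it's right. This is only a sketch: `Bookings` is a placeholder table name, and I'm assuming SQL Server (the `.000` millisecond format suggests it):

```sql
-- Group rows by the calendar day of the booking and average the
-- booking-to-pickup gap in seconds (negative gaps included as-is).
SELECT
    CAST(BookedDateTime AS DATE)                          AS BookingDay,
    AVG(DATEDIFF(SECOND, BookedDateTime, PickupDateTime)) AS AvgWaitSeconds
FROM Bookings
GROUP BY CAST(BookedDateTime AS DATE)
ORDER BY BookingDay;
```

Averaging in seconds rather than minutes should avoid losing sub-minute precision; since `AVG` over an integer column truncates, `AVG(CAST(DATEDIFF(...) AS FLOAT))` would presumably be needed to keep fractional seconds.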
Is this possible in SQL, or do I need to do the number crunching in code (i.e. C#)?
Any pointers would be greatly appreciated.