Calculating travel time between 5k+ addresses using available geographic information APIs

I am working on a transport model and need to build a travel time matrix between 5,000 points. Is there a free, reasonably reliable way to calculate the travel time between all my nodes?

I think Google Maps has a limit on the number of requests/hits that I would quickly reach.

EDIT

I would like to use an API, for example Google Maps or similar, that incorporates road data such as the number of lanes, speeds, road type, etc.

EDIT 2

Please keep in mind that open map data is incomplete and not available in all jurisdictions outside the United States.

+6
maps google-maps geolocation gis
5 answers

The Google Directions API is limited to 2,500 calls per day. In addition, the terms of service stipulate that you must use the service "in conjunction with displaying the results on a Google map".

You might be interested in OpenTripPlanner, a project under development that can perform multimodal routing, and Graphserver, on which OpenTripPlanner is built.

One approach would be to use OpenStreetMap data with Graphserver to generate shortest-path trees from each node, as sketched below.
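A minimal sketch of the shortest-path-tree idea, assuming you have already converted OpenStreetMap data into a weighted graph (with Graphserver, osmnx, or your own importer); it uses networkx rather than Graphserver's own API, and the `travel_time` edge attribute is an assumption:

```python
# Build a travel-time matrix by running one shortest-path tree per origin.
# Assumes edges carry a 'travel_time' weight in seconds.
import networkx as nx

def travel_time_matrix(graph: nx.DiGraph, nodes: list) -> dict:
    """Return {origin: {destination: seconds}} for the given node list."""
    matrix = {}
    for origin in nodes:
        # One Dijkstra call yields the whole shortest-path tree from this origin,
        # so 5,000 origins means 5,000 runs rather than ~12.5 million pair queries.
        lengths = nx.single_source_dijkstra_path_length(
            graph, origin, weight="travel_time"
        )
        matrix[origin] = {dest: lengths[dest] for dest in nodes if dest in lengths}
    return matrix
```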

+6

With all 12,502,500 pairwise connections, I'm sure you will hit some kind of limit if you try to use Google Maps for every one of them. How accurate do the results need to be, and how long are the trips?

I would try to create a rough map with speeds on it (for example, mark interstates as fast, yadda yadda), then use some software to calculate how long it takes to get from point to point. You could think of it as an electrical resistance problem, where you are trying to calculate the resistance from point to point across a plane with variable resistance (interstates as wires, lakes as open circuits ...).
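A rough sketch of that idea, assuming you rasterize the area into a coarse grid and assign each cell an approximate speed; the cell size, speed values, and names are all illustrative, not from any particular library:

```python
# Dijkstra over a speed grid: cost of crossing a cell is cell_size / speed,
# cells with speed 0 (lakes) act like open circuits.
import heapq

CELL_KM = 5.0  # grid resolution, an assumption

def grid_travel_time(speeds_kmh, start, goal):
    """speeds_kmh: 2D list of rough speeds; start/goal are (row, col) cells."""
    rows, cols = len(speeds_kmh), len(speeds_kmh[0])
    best = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return t  # hours
        if t > best.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and speeds_kmh[nr][nc] > 0:
                nt = t + CELL_KM / speeds_kmh[nr][nc]
                if nt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return None  # unreachable
```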

+1

If you really need all these routes accurately calculated and stored in your database, it sounds like (and I suspect) you will have to spend money to get them. As you can imagine, this data is expensive to produce, so providers expect to be paid for it.

I would question the problem a bit:

  • Do you really need all ~12.5 million distances in the database? What if you asked Google for them as needed and then cached them (if allowed)? See the sketch after this list. I have had web applications where, because traffic grew slowly, I was able to use free services early on to test the idea.
  • Do you really need all 5,000 points? Or could you pick the 100 most important ones and have a more tractable problem?
  • Perhaps there is a hybrid approach where you store exact times between large cities and use rougher estimates over shorter distances.
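A minimal sketch of the "ask as needed and cache" idea from the first bullet; `fetch_travel_time` is a placeholder for whichever API you end up calling, and caching is only appropriate if that API's terms of service allow storing the results:

```python
# Cache travel times in SQLite and only hit the external API on a cache miss.
import sqlite3

conn = sqlite3.connect("travel_times.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS times (origin TEXT, dest TEXT, seconds REAL, "
    "PRIMARY KEY (origin, dest))"
)

def cached_travel_time(origin: str, dest: str, fetch_travel_time) -> float:
    row = conn.execute(
        "SELECT seconds FROM times WHERE origin = ? AND dest = ?", (origin, dest)
    ).fetchone()
    if row:
        return row[0]
    seconds = fetch_travel_time(origin, dest)  # hits the external API
    conn.execute("INSERT INTO times VALUES (?, ?, ?)", (origin, dest, seconds))
    conn.commit()
    return seconds
```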

Again, I really don't know what your problem is, but maybe thinking a little outside the box will help you find an easier solution.

+1

You may need to resort to heuristics. Perhaps you can estimate travel time based on several factors, such as geometric distance and some features of the start and end points (urban vs. rural, country, ...). You could fetch a handful of real travel times, fit your parameters on that subset, and see how well you can predict the others. My guess would be, for example, that travel time approaches a linear relationship with distance as the distance grows.
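A small sketch of that heuristic: query real travel times for a sample of pairs, fit time ≈ a + b · distance, and use the fit for the rest. The sample values and function names are illustrative only:

```python
import math
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

# distances (km) and measured travel times (minutes) for a sampled subset of pairs
sample_dist = np.array([12.0, 48.0, 150.0, 320.0, 610.0])
sample_time = np.array([22.0, 55.0, 120.0, 230.0, 420.0])

b, a = np.polyfit(sample_dist, sample_time, 1)  # slope, intercept

def predict_minutes(lat1, lon1, lat2, lon2):
    """Predicted travel time from the fitted linear model."""
    return a + b * haversine_km(lat1, lon1, lat2, lon2)
```

You can then check the fit against a held-out set of real queries to see whether a linear model is good enough for your accuracy needs.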

I know this is messy, but hey, you are trying to estimate 12.5 million data points (or however many :)

You can also gradually incorporate knowledge from already-retrieved "real" travel times by finding known points close to the ones you are looking up:

  • find the points StartApprox and EndApprox nearest to the start and end positions for which you already have the travel time between StartApprox and EndApprox
  • calculate the distances StartError and EndError between start and StartApprox, and between end and EndApprox
  • if StartError + EndError > Distance(StartApprox, EndApprox) * 0.10 (or whatever your threshold is), fetch the travel time via the API (and store it); otherwise use the known travel time plus an overhead based on StartError + EndError

(If you have 100 addresses in NY and 100 in SF, all the values will be more or less the same (i.e. the difference between them is probably smaller than the uncertainty of these predictions), and this approach keeps you from issuing 10,000 requests where 1 will do.)
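A sketch of this incremental scheme, assuming travel times are stored in seconds; the names (`known_times`, `nearest_known`, `api_travel_time`, `AVG_SPEED_KMH`) are illustrative placeholders for your own lookup, API call, and distance functions:

```python
THRESHOLD = 0.10        # relative error threshold from the list above
AVG_SPEED_KMH = 60.0    # rough speed used to convert endpoint error into time

def travel_time(start, end, known_times, nearest_known, api_travel_time, dist):
    start_approx = nearest_known(start)   # nearest point with stored results
    end_approx = nearest_known(end)
    start_err = dist(start, start_approx)
    end_err = dist(end, end_approx)
    if start_err + end_err > THRESHOLD * dist(start_approx, end_approx):
        t = api_travel_time(start, end)    # accurate, but costs a request
        known_times[(start, end)] = t
        return t
    # close enough: reuse the stored time plus a small overhead for the offsets
    overhead_hours = (start_err + end_err) / AVG_SPEED_KMH
    return known_times[(start_approx, end_approx)] + overhead_hours * 3600
```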

+1

Many GIS software packages have routing algorithms, if you have the data ... and transport network data can be quite expensive.

There are several other open-source options for route planning. Is this something you need to do repeatedly, or a one-time process? Could it be broken down into smaller subsets of points? Perhaps you could use several routing sources and segment the data points into sets small enough for each routing engine.

Here are a few other options from a quick Google search: Wikipedia, Route66, Cargo Miles.

-1
