Best way to log API calls, per minute / hour

We use reverse geocoding in our Rails web service and are running into quota issues with Google's geocoder via Geokit. We are also introducing SimpleGeo, and I want to track how many requests per minute / hour we make.

Any suggestions for tracking reverse geocoding calls?

Our code would look something like the method below. How would you do it?

  • Add a custom log file and process it in the background daily
  • Use some super-fantastic gem I don't know about that makes quotas and rate-tracking easy
  • Insert a record into the database for each call and run the aggregation queries there

Note: I don't need real-time data; I just want to know what our typical and peak requests per hour are (and the total number of requests per month).

  def use_simplegeo(lat, lng)
    SimpleGeo::Client.set_credentials(SIMPLE_GEO_OAUTHTOKEN, SIMPLE_GEO_OAUTHSECRET)

    # maybe do logging/tracking here?

    nearby_address = SimpleGeo::Client.get_nearby_address(lat, lng)

    located_location = LocatedLocation.new
    located_location.city    = nearby_address[:place_name]
    located_location.county  = nearby_address[:county_name]
    located_location.state   = nearby_address[:state_code]
    located_location.country = nearby_address[:country]
    located_location
  end

Thanks!

ruby-on-rails reverse-geocoding
1 answer

The first part of this answer doesn't address the question you asked, but it may be useful if you haven't considered it before.

Have you looked at not doing the reverse geocoding from your server (i.e. through Geokit), but instead having it done by the client? In other words, some JavaScript loaded into the user's browser makes the Google geocoding API calls on behalf of your service.

If your application can support this approach, this has several advantages:

  • You sidestep the quota problem, because each of your distributed users has their own daily quota and doesn't consume yours.
  • You don't spend your own server resources on the lookups.

If you still want to log geocoder requests and are concerned about the performance impact on your main application database, you can consider one of the following options:

  • Simply create a separate database (or databases) for the write-intensive logging and write to it synchronously. It could be relational, but MongoDB or Redis might work as well.
  • Write to the file system (with a custom logger) and then cron the files in batches into a structured, queryable store later. The store could be external, for example on Amazon S3, if that works better.
  • Simply write a record to SimpleGeo every time you do a geocode, and add your own metadata to those records to tie them to your own models.
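The Redis option maps naturally onto this problem: INCR a key bucketed by hour and let EXPIRE age old buckets out. A sketch of that idea, assuming the key scheme below (my own invention); the in-memory stub only stands in for a real client so the example is self-contained, since the redis gem's `Redis.new` responds to the same `incr`/`expire` calls:

```ruby
# Stand-in for a Redis client; swap in Redis.new from the redis gem
# in production.
class FakeRedis
  def initialize
    @data = Hash.new(0)
  end

  def incr(key)
    @data[key] += 1
  end

  def expire(key, seconds)
    # no-op in the stub; real Redis would age the key out
  end

  def get(key)
    @data[key]
  end
end

REDIS = FakeRedis.new
MONTH_IN_SECONDS = 31 * 24 * 3600

# Count one geocoder call against the current hour's bucket.
def track_geocode_call(provider, now = Time.now.utc)
  key = "geocode:#{provider}:#{now.strftime('%Y-%m-%d-%H')}"
  REDIS.incr(key)
  REDIS.expire(key, MONTH_IN_SECONDS) # keep roughly a month of hourly buckets
  key
end
```

Typical and peak hourly rates are then a scan over the month's hourly keys, with no load on the main application database.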
