How to proceed with query automation using Import.io

I have successfully created a query using Import.io's Extractor tool. It does exactly what I want it to do; however, I now need to run it once or twice a day. Is Import.io intended to be used purely as an API, with logic such as data storage and scheduling (running queries several times a day) built into my own application, or does the Import.io service itself provide scheduled queries and long-term storage of results?

I am happy to build a Laravel or Rails application that makes the API requests and stores the data elsewhere, but if doing so means reinventing a wheel they already provide tools for, I would rather not.

+4
2 answers

Thanks for using the new forum! Yes, we have moved to Stack Overflow to make the most of the community atmosphere.

Import.io is currently not able to schedule crawls. However, this is something we plan to release in the near future.

In the meantime, you can set up a cron job that runs your query on whatever schedule you specify.
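
For illustration, here is a minimal sketch of what such a cron-driven script might look like. This is not official Import.io tooling: the API key, extractor ID, target URL, file paths, and run times are all placeholders, and the endpoint format is the one quoted in the second answer below.

# Example crontab entry (edit with `crontab -e`) that runs the script
# at 06:00 and 18:00 every day:
#   0 6,18 * * * /usr/bin/python3 /opt/importio/run_query.py

import json
import time
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"            # placeholder
EXTRACTOR_ID = "YOUR_EXTRACTOR_ID"  # placeholder
TARGET_URL = "https://example.com/page-to-extract"

# Build the live-query URL (format taken from the second answer below).
params = urllib.parse.urlencode({"_apikey": API_KEY, "url": TARGET_URL})
endpoint = f"https://extraction.import.io/query/extractor/{EXTRACTOR_ID}?{params}"

# Fetch the extractor's results as JSON.
with urllib.request.urlopen(endpoint) as resp:
    data = json.load(resp)

# Simple long-term storage: append each run as one timestamped JSON line.
with open("/opt/importio/results.jsonl", "a") as out:
    out.write(json.dumps({"fetched_at": time.time(), "data": data}) + "\n")

Appending to a JSON-lines file keeps every run's results, which covers the long-term-storage part of the question without any extra infrastructure; swapping in a database insert instead is straightforward.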

+5

Another option, if you are using the free version, is to use a CI tool such as Travis or Jenkins to schedule your API scripts. You can query extractors live, so you do not need to run them manually each time. Each run consumes one request from your quota.

The live query URL has the following format:

https://extraction.import.io/query/extractor/extractor_id?_apikey=apikey&url=url

You can then write a script that calls import.io on whatever schedule you like and does whatever you need with the results.
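
As a rough sketch of such a script (the extractor ID, API key, and target URL below are placeholders), here is one way to build that URL safely and fetch the results with Python's standard library; the try/except makes it behave well as a scheduled CI job:

import json
import sys
import urllib.error
import urllib.parse
import urllib.request

def query_extractor(extractor_id, api_key, target_url):
    """Run a live query against an extractor and return the parsed JSON."""
    params = urllib.parse.urlencode({"_apikey": api_key, "url": target_url})
    url = f"https://extraction.import.io/query/extractor/{extractor_id}?{params}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    try:
        result = query_extractor("extractor_id", "apikey", "https://example.com")
    except urllib.error.HTTPError as err:  # e.g. a bad key or an exhausted quota
        sys.exit(f"Query failed: {err}")   # non-zero exit marks the CI job as failed
    print(json.dumps(result, indent=2))

Using urlencode rather than pasting the raw target URL into the query string matters here, because the url parameter is itself a URL and must be percent-encoded.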

0
