I have two scripts: scraper.py and db_control.py. In scraper.py, I have something like this:
```python
...
def scrap(category, field, pages, search, use_proxy, proxy_file):
    ...
    loop = asyncio.get_event_loop()
    to_do = [get_pages(url, params, conngen) for url in urls]
    wait_coro = asyncio.wait(to_do)
    res, _ = loop.run_until_complete(wait_coro)
    ...
    loop.close()
    return [x.result() for x in res]
...
```
And in db_control.py:
```python
from scraper import scrap
...
while new < 15:
    data = scrap(category, field, pages, search, use_proxy, proxy_file)
    ...
...
```
In theory, the scraper should be started again and again until a sufficient amount of data has been received. But when `new` is not immediately > 15, this error occurs:
```
File "/usr/lib/python3.4/asyncio/base_events.py", line 293, in run_until_complete
    self._check_closed()
File "/usr/lib/python3.4/asyncio/base_events.py", line 265, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
```
The scripts work fine if I run `scrap()` only once, so I think the problem is with re-obtaining the loop via `loop = asyncio.get_event_loop()` after it has been closed. I tried this one, but nothing changed. How can I fix this? Of course, these are just fragments of my code; if you think the problem may be elsewhere, the full code is available here.
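To make the pattern concrete, here is a stripped-down, self-contained sketch of what I believe is going on (`fetch` and `scrap_once` are hypothetical stand-ins for my `get_pages`/`scrap`): after `loop.close()`, `asyncio.get_event_loop()` still returns that same closed loop, so the second call fails; creating a fresh loop with `asyncio.new_event_loop()` before each batch does not raise.

```python
import asyncio

async def fetch(i):
    # Stand-in for get_pages(): trivially returns its argument.
    await asyncio.sleep(0)
    return i

def scrap_once(loop):
    # Mirrors scrap(): run a batch of tasks to completion, then close the loop.
    tasks = [loop.create_task(fetch(i)) for i in range(3)]
    done, _ = loop.run_until_complete(asyncio.wait(tasks))
    loop.close()
    return sorted(t.result() for t in done)

# A fresh loop per call avoids "RuntimeError: Event loop is closed".
results = []
for _ in range(2):
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    results.append(scrap_once(loop))

print(results)
```

If `asyncio.get_event_loop()` were used instead of `asyncio.new_event_loop()` inside the `for` loop, the second iteration would reproduce the exact traceback above.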