Asynchronous coroutine method chain

I want to implement method chaining, but not for ordinary methods - for asynchronous coroutines.

    import asyncio

    class Browser:
        @asyncio.coroutine
        def go(self):
            # some actions
            return self

        @asyncio.coroutine
        def click(self):
            # some actions
            return self

The "intuitive" way to call the chain will not work, because the only method returns a coroutine (generator), not self:

    @asyncio.coroutine
    def main():
        br = yield from Browser().go().click()  # this will fail

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
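
To make the failure concrete: under the generator-based @asyncio.coroutine decorator, go() returns a generator object rather than a Browser, so the chained call breaks before the event loop runs anything. A minimal illustration:

    step = Browser().go()  # a generator object, not a Browser instance
    step.click()           # raises AttributeError - the generator has no 'click' attribute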

The correct way to call the chain:

    br = yield from (yield from Browser().go()).click()

But it looks ugly and becomes unreadable when the chain grows.
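
For example, with a hypothetical third call in the chain the nesting already piles up:

    br = yield from (yield from (yield from Browser().go()).click()).go()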

Is there any way to make this better? Any ideas are welcome.

2 answers

I came up with a solution that does roughly what is needed. The idea is to wrap Browser() in a proxy that uses __getattr__ and __call__ to record each action (an attribute access or a call) and return itself so it can catch the next one. Once all the actions are collected, the yield from on the wrapper is intercepted by __iter__, which replays the whole queue.

    import asyncio

    def chain(obj):
        """
        Enables coroutine chains for obj.

        Usage:
            text = yield from chain(obj).go().click().attr

        Note: returns not a coroutine, but an object that can be used with yield from.
        """
        class Chain:
            _obj = obj
            _queue = []

            # Collect each getattr or call into the queue:
            def __getattr__(self, name):
                Chain._queue.append({'type': 'getattr', 'name': name})
                return self

            def __call__(self, *args, **kwargs):
                Chain._queue.append({'type': 'call', 'params': [args, kwargs]})
                return self

            # On iteration, process the queue:
            def __iter__(self):
                res = Chain._obj
                while Chain._queue:
                    action = Chain._queue.pop(0)
                    if action['type'] == 'getattr':
                        res = getattr(res, action['name'])
                    elif action['type'] == 'call':
                        args, kwargs = action['params']
                        res = res(*args, **kwargs)
                    if asyncio.iscoroutine(res):
                        res = yield from res
                return res

        return Chain()

Usage:

    class Browser:
        @asyncio.coroutine
        def go(self):
            print('go')
            return self

        @asyncio.coroutine
        def click(self):
            print('click')
            return self

        def text(self):
            print('text')
            return 5

    @asyncio.coroutine
    def main():
        text = yield from chain(Browser()).go().click().go().text()
        print(text)

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

Output:

    go
    click
    go
    text
    5

Note that chain() does not return a real coroutine, but an object that can be used like a coroutine with yield from. We need to wrap the result of chain() to get a normal coroutine that can be passed to any asyncio function that expects one:

    @asyncio.coroutine
    def chain_to_coro(chain):
        return (yield from chain)

    @asyncio.coroutine
    def main():
        ch = chain(Browser()).go().click().go().text()
        coro = chain_to_coro(ch)
        results = yield from asyncio.gather(*[coro], return_exceptions=True)
        print(results)

Output:

    go
    click
    go
    text
    [5]
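
For reference (not part of the original answer): on modern Python (3.8+), where native async/await has replaced the generator-based @asyncio.coroutine API, the same queue-and-replay idea can be expressed with __await__ instead of __iter__. A rough sketch, assuming Browser is rewritten with async def methods:

    import asyncio

    def chain(obj):
        class Chain:
            _queue = []

            def __getattr__(self, name):
                Chain._queue.append(('getattr', name))
                return self

            def __call__(self, *args, **kwargs):
                Chain._queue.append(('call', args, kwargs))
                return self

            def __await__(self):
                # Replay the recorded actions, awaiting intermediate coroutines.
                return self._replay().__await__()

            async def _replay(self):
                res = obj
                while Chain._queue:
                    action = Chain._queue.pop(0)
                    if action[0] == 'getattr':
                        res = getattr(res, action[1])
                    else:
                        res = res(*action[1], **action[2])
                    if asyncio.iscoroutine(res):
                        res = await res
                return res

        return Chain()

    class Browser:
        async def go(self):
            print('go')
            return self

        async def click(self):
            print('click')
            return self

        def text(self):
            print('text')
            return 5

    async def main():
        print(await chain(Browser()).go().click().text())  # prints: go, click, text, then 5

    asyncio.run(main())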

It's still not exactly pretty, but you can implement a chain function that improves things a bit:

    import asyncio

    @asyncio.coroutine
    def chain(obj, *funcs):
        for f, *args in funcs:
            meth = getattr(obj, f)  # Look up the method on the object
            obj = yield from meth(*args)
        return obj

    class Browser:
        @asyncio.coroutine
        def go(self, x, y):
            return self

        @asyncio.coroutine
        def click(self):
            return self

    @asyncio.coroutine
    def main():
        # br = yield from (yield from Browser().go(3, 4)).click()
        br = yield from chain(Browser(), ("go", 3, 4), ("click",))

    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

The idea is to pass tuples of the form (method_name, arg1, arg2, ..., argN) to the chain function rather than actually chaining the bound method calls. If you don't need to support passing arguments to any of the methods in the chain, you can simply pass the method names directly, as sketched below.
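
If none of the methods take arguments, the helper can be stripped down to accept bare method names (a sketch along the lines the answer suggests, not the answer's own code):

    @asyncio.coroutine
    def chain(obj, *names):
        for name in names:
            obj = yield from getattr(obj, name)()
        return obj

    # e.g. br = yield from chain(Browser(), "go", "click")
    # (assuming here that go() and click() take no arguments)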


Source: https://habr.com/ru/post/1213402/
