Structured Programming and Python Generators?

Update: What I really wanted was greenlets.


Note: This question changed a bit as people answered and made me "raise the stakes", since my trivial examples had trivial solutions; rather than continue to mutate it here, I will re-ask the question when it is clearer in my head, per Alex's suggestion.


Python generators are a great thing, but how can I easily factor one into pieces (structured programming)? I really want PEP 380, or at least something comparable to it in syntactic overhead, in existing Python (e.g. 2.6).

As an example (admittedly a silly one), take the following:

    def sillyGenerator():
        for i in xrange(10):
            yield i*i
        for i in xrange(12):
            yield i*i
        for i in xrange(8):
            yield i*i

Being an ardent believer in DRY, I see a repeated pattern here, so I factor it out into its own function:

    def quadraticRange(n):
        for i in xrange(n):
            yield i*i

    def sillyGenerator():
        quadraticRange(10)
        quadraticRange(12)
        quadraticRange(8)

... which of course does not work. The parent generator has to iterate over the new function's results in a loop, re-yielding them:

    def sillyGenerator():
        for i in quadraticRange(10):
            yield i
        for i in quadraticRange(12):
            yield i
        for i in quadraticRange(8):
            yield i

... which is even longer than before!
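To make the workaround above concrete, here is a runnable Python 3 sketch of it (using range in place of xrange, and smaller arguments for brevity):

```python
def quadraticRange(n):
    for i in range(n):  # range instead of xrange so this runs on Python 3
        yield i*i

def sillyGenerator():
    # the verbose "two-line wrapper" workaround from the question
    for i in quadraticRange(2):
        yield i
    for i in quadraticRange(3):
        yield i

print(list(sillyGenerator()))  # [0, 1, 0, 1, 4]
```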

If I want to move part of a generator into a function, I always need this rather verbose two-line wrapper to call it. It gets worse if I want to support send():

    def sillyGeneratorRevisited():
        g = subgenerator()
        v = None
        try:
            while True:
                v = yield g.send(v)
        except StopIteration:
            pass
        if v < 4:
            pass  # ...
        else:
            pass  # ...

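To see that this forwarding boilerplate actually behaves as described, here is a runnable Python 3 sketch; the subgenerator here is a hypothetical one that yields a running total of the values sent in:

```python
def subgenerator():
    # hypothetical sub-generator: yields a running total of sent values
    total = 0
    for _ in range(2):
        v = yield total
        total += v or 0

def driver():
    # the verbose forwarding boilerplate from the question
    g = subgenerator()
    v = None
    try:
        while True:
            v = yield g.send(v)
    except StopIteration:
        pass
    yield ('done', v)  # v holds the last value sent in

d = driver()
print(next(d))       # 0
print(d.send(5))     # 5
print(d.send(7))     # ('done', 7)
```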
And that is without even considering propagating exceptions. It is the same boilerplate every time! Yet you cannot apply DRY and move this identical code into a function, because... you would need the boilerplate to call it! I want something like:

    def sillyGenerator():
        yield from quadraticRange(10)
        yield from quadraticRange(12)
        yield from quadraticRange(8)

    def sillyGeneratorRevisited():
        v = yield from subgenerator()
        if v < 4:
            pass  # ...
        else:
            pass  # ...
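As a side note, this wished-for syntax did eventually land with PEP 380 in Python 3.3. On a modern interpreter, a minimal sketch (with a hypothetical subgenerator) behaves exactly as hoped, including the return value of the `yield from` expression:

```python
def subgenerator():
    # hypothetical sub-generator: yields twice, then returns a value,
    # which becomes the value of the yield from expression
    x = yield 1
    y = yield 2
    return (x or 0) + (y or 0)

def sillyGeneratorRevisited():
    v = yield from subgenerator()
    yield v  # expose the returned value for demonstration

g = sillyGeneratorRevisited()
print(next(g))      # 1
print(g.send(10))   # 2
print(g.send(20))   # 30
```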

Does anyone have a solution to this problem? I have a first attempt, but I would like to see what others come up with. Ultimately, any solution will have to handle cases where the main generator executes complex logic based on the data sent into it, and potentially makes a very large number of calls to sub-generators: my use case is generators used to implement long-running, complex state machines.

+6
Tags: python, generator
8 answers

However, I would like to make my reuse criterion one step harder: what if I need a control structure around my repeated generation?

itertools often helps even there; you need to give specific examples where you think it does not.

For example, I could call a subgenerator forever with different parameters.

itertools.chain.from_iterable.

Or my sub-generators might be very expensive, and I only want to start them as and when they are reached.

Both chain and chain.from_iterable do this: no sub-iterator is started until the moment the first element from it is required.
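This laziness is easy to demonstrate with a hypothetical "noisy" generator that announces when its body starts running; nothing is printed until the first element is pulled:

```python
from itertools import chain

def noisy(n):
    print("starting", n)   # runs only when the first element is requested
    for i in range(n):
        yield i*i

it = chain.from_iterable(noisy(n) for n in [2, 3])
# nothing has printed yet
first = next(it)           # prints "starting 2" and yields 0
print(first)
```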

Or (and this is the real desire) I may need to change what I do next depending on what my controller passes me via send().

A specific example would be particularly appreciated. In any case, at worst you will be coding for x in blargh: yield x, where the suspended PEP 380 would let you code yield from blargh: about four extra characters (not a tragedy).

And if a coroutine version of some itertools functionality (itertools mostly supports iterators; there is no equivalent coroutools module yet) becomes warranted, because a certain pattern of coroutine composition keeps repeating in your code, then it is not too hard to code it yourself.

For example, suppose we often do something like: first yield a certain value; then, repeatedly, if we are sent "foo", yield the next element from fooiter; if "bla", from blaiter; if "zop", from zopiter; anything else, from the default iterator. As soon as we notice the second occurrence of this compositional pattern, we can code:

    def corou_chaiters(initsend, defiter, val2itermap):
        currentiter = iter([initsend])
        while True:
            val = yield next(currentiter)
            currentiter = val2itermap(val, defiter)

and call this simple composition function as needed. If we need to compose other coroutines, rather than plain iterators, we would have a slightly different composer using the send method instead of the next built-in function; and so on.
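A usage sketch for corou_chaiters (my own example; fooiter and the pick function are hypothetical names, and val2itermap is treated as a callable, which is how the code above invokes it):

```python
def corou_chaiters(initsend, defiter, val2itermap):
    currentiter = iter([initsend])
    while True:
        val = yield next(currentiter)
        currentiter = val2itermap(val, defiter)

fooiter = iter(['f1', 'f2'])
defiter = iter(['d1', 'd2'])

def pick(val, default):
    # map a sent value to the iterator to draw from next
    return fooiter if val == 'foo' else default

g = corou_chaiters('init', defiter, pick)
print(next(g))          # 'init'
print(g.send('foo'))    # 'f1'
print(g.send('other'))  # 'd1'
print(g.send('foo'))    # 'f2'
```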

If you can offer an example that is not easily tamed by such techniques, I suggest you ask it in a separate question (specifically about coroutine-ish generators), since there is already a lot of material on this one that will have little to do with your other, much more complicated example.

+11

You want to combine several iterators together:

    from itertools import chain

    def sillyGenerator(a, b, c):
        return chain(quadraticRange(a), quadraticRange(b), quadraticRange(c))
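A quick runnable check of this answer (my own small-n example, with quadraticRange defined as in the question, using range so it runs on Python 3):

```python
from itertools import chain

def quadraticRange(n):
    for i in range(n):
        yield i*i

def sillyGenerator(a, b, c):
    return chain(quadraticRange(a), quadraticRange(b), quadraticRange(c))

print(list(sillyGenerator(2, 3, 2)))  # [0, 1, 0, 1, 4, 0, 1]
```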
+6

An impractical (unfortunately) answer:

    from __future__ import PEP0380

    def sillyGenerator():
        yield from quadraticRange(10)
        yield from quadraticRange(12)
        yield from quadraticRange(8)

Potentially practical reference: Syntax for delegating to a subgenerator

Unfortunately, what makes this impractical: the Python language moratorium

UPDATE February 2011:

The moratorium has been lifted, and PEP 380 is on the TODO list for Python 3.3. Hopefully this answer will be practical soon.

Read Guido's Note on comp.python.devel

+6
    import itertools

    def quadraticRange(n):
        for i in xrange(n):
            yield i*i

    def sillyGenerator():
        return itertools.chain(
            quadraticRange(10),
            quadraticRange(12),
            quadraticRange(8),
        )

    def sillyGenerator2():
        return itertools.chain.from_iterable(
            quadraticRange(n) for n in [10, 12, 8])

The latter form is useful if you want one iterator to be exhausted before the next one is even created (including running its setup code).
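The difference matters when the factory itself does expensive work before returning an iterator. A sketch (the make factory here is my hypothetical example, printing at call time to show when setup runs):

```python
import itertools

def make(n):
    print("factory called:", n)  # expensive setup would happen here
    return iter(i*i for i in range(n))

eager = itertools.chain(make(1), make(2))
# both "factory called" lines have printed already, before any iteration

lazy = itertools.chain.from_iterable(make(n) for n in [1, 2])
# nothing printed yet; make(1) runs only when the first element is pulled
print(list(lazy))  # [0, 0, 1]
```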

+3

There is a Python Enhancement Proposal for providing a yield from statement for "delegating to a subgenerator". Your example could be written as:

    def sillyGenerator():
        sq = lambda i: i * i
        yield from map(sq, xrange(10))
        yield from map(sq, xrange(12))
        yield from map(sq, xrange(8))

Or better, in the spirit of DRY:

    def sillyGenerator():
        for i in [10, 12, 8]:
            yield from quadraticRange(i)
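On Python 3.3+ this DRY version runs as-is (shown here with range in place of xrange, and smaller arguments for brevity):

```python
def quadraticRange(n):
    for i in range(n):
        yield i*i

def sillyGenerator():
    for i in [2, 3]:
        yield from quadraticRange(i)

print(list(sillyGenerator()))  # [0, 1, 0, 1, 4]
```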

The proposal is in draft status and its eventual inclusion is not certain, but it shows that other developers share your thoughts about generators.

+3

For an arbitrary number of quadraticRange calls:

    from itertools import chain

    def sillyGenerator(*args):
        return chain(*map(quadraticRange, args))

This code uses map and itertools.chain. It takes an arbitrary number of arguments and passes each of them to quadraticRange; the resulting iterators are then chained together.
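A runnable check of this answer (my small-n example; note that chain.from_iterable(map(quadraticRange, args)) would be a lazier variant, since chain(*...) consumes the whole map up front):

```python
from itertools import chain

def quadraticRange(n):
    return (i*i for i in range(n))

def sillyGenerator(*args):
    return chain(*map(quadraticRange, args))

print(list(sillyGenerator(2, 2)))  # [0, 1, 0, 1]
```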

+2

There is a pattern I call the "generator kernel", where generators do not yield directly to the user but to some "kernel" loop, which processes (some of) their outputs as "system calls" with special meaning.

You can apply it here with an intermediate function that accepts yielded generators and automatically unrolls them. To make it easy to use, we will create this intermediate function as a decorator:

    import functools, types

    def flatten(values_or_generators):
        for x in values_or_generators:
            if isinstance(x, types.GeneratorType):
                for y in x:
                    yield y
            else:
                yield x

    # Better name, anyone?
    def subgenerator(g):
        """Decorator making ``yield <gen>`` mean ``yield from <gen>``."""
        @functools.wraps(g)
        def flat_g(*args, **kw):
            return flatten(g(*args, **kw))
        return flat_g

and then you can simply write:

    def quadraticRange(n):
        for i in xrange(n):
            yield i*i

    @subgenerator
    def sillyGenerator():
        yield quadraticRange(10)
        yield quadraticRange(12)
        yield quadraticRange(8)

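Put together in Python 3 form (range for xrange, smaller arguments), the decorator behaves as advertised; note that a plain value yielded alongside generators passes through untouched:

```python
import functools, types

def flatten(values_or_generators):
    # re-yield elements of any yielded generator; pass other values through
    for x in values_or_generators:
        if isinstance(x, types.GeneratorType):
            for y in x:
                yield y
        else:
            yield x

def subgenerator(g):
    """Decorator making ``yield <gen>`` mean ``yield from <gen>``."""
    @functools.wraps(g)
    def flat_g(*args, **kw):
        return flatten(g(*args, **kw))
    return flat_g

def quadraticRange(n):
    for i in range(n):
        yield i*i

@subgenerator
def sillyGenerator():
    yield quadraticRange(2)
    yield 99               # plain value: passes through unchanged
    yield quadraticRange(2)

print(list(sillyGenerator()))  # [0, 1, 99, 0, 1]
```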
Note that subgenerator() flattens exactly one level of the hierarchy. You could easily make it multi-level (by maintaining a manual stack, or simply by replacing the inner loop with for y in flatten(x):), but I think it is better as it is, so that every generator wanting to use this non-standard syntax has to be explicitly wrapped with @subgenerator.

Note also that the detection of generators is imperfect! It will detect things written as generators, but that is an implementation detail. As the caller of a generator, all you care about is that it returns an iterator. It could be a function returning some itertools object, and then this decorator would fail.

Checking whether the object has a .next() method is too broad: you would no longer be able to yield iterator objects as plain values without them being expanded. So the most reliable way would be to check for some explicit marker, meaning you would have to write, for example:

    @subgenerator
    def sillyGenerator():
        yield 'from', quadraticRange(10)
        yield 'from', quadraticRange(12)
        yield 'from', quadraticRange(8)

Hey, it's almost like a PEP!
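A marker-aware variant of flatten for that explicit 'from' convention could look like this (my sketch; flatten_marked and the small-n example are hypothetical names, not from the answer):

```python
def quadraticRange(n):
    for i in range(n):
        yield i*i

def flatten_marked(values):
    # expand only values explicitly tagged with the 'from' marker
    for x in values:
        if isinstance(x, tuple) and len(x) == 2 and x[0] == 'from':
            for y in x[1]:
                yield y
        else:
            yield x

def sillyGenerator():
    yield 'from', quadraticRange(2)
    yield 42                        # untagged values pass through
    yield 'from', quadraticRange(2)

print(list(flatten_marked(sillyGenerator())))  # [0, 1, 42, 0, 1]
```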

[Credits: this answer gives a similar function, but it is deep (which I think is wrong) and not framed as a decorator.]

+1
    class Communicator:
        def __init__(self, inflow):
            self.outflow = None
            self.inflow = inflow

Then, inside your generator, do the following:

    c = Communicator(something)
    yield c
    response = c.outflow

And instead of the boilerplate code in the driver, you can simply write:

    for i in run():
        something = i.inflow
        # ...
        i.outflow = value_to_return_back

This is fairly simple code that works without requiring much thought.
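A complete runnable sketch of this pattern (the worker generator and the doubling driver are my hypothetical examples; the answer's run() corresponds to worker() here):

```python
class Communicator:
    """Mutable envelope passed between a generator and its driver."""
    def __init__(self, inflow):
        self.outflow = None
        self.inflow = inflow

def worker():
    # hypothetical generator using the Communicator pattern
    c = Communicator(3)
    yield c
    doubled = c.outflow        # value filled in by the driver loop
    c = Communicator(doubled + 1)
    yield c

seen = []
for i in worker():
    seen.append(i.inflow)
    i.outflow = i.inflow * 2   # the driver's "response"

print(seen)  # [3, 7]
```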

0
