I have several Python generators that I want to combine into a new generator. I can easily do this with a hand-written generator function and a bunch of yield statements.
On the other hand, the itertools module is designed for exactly this kind of thing, and it seems to me that the Pythonic way to create the generator I need would be to combine the various iterators from itertools.
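For the simplest case, just concatenating generators, the two styles look roughly like this (a minimal sketch; gen_a and gen_b stand for arbitrary generators of mine):

from itertools import chain

def combined(gen_a, gen_b):
    # hand-written combination: yield everything from the first
    # generator, then everything from the second
    for x in gen_a:
        yield x
    for x in gen_b:
        yield x

def combined2(gen_a, gen_b):
    # the itertools way of doing the same thing
    return chain(gen_a, gen_b)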
However, in my problem this quickly becomes quite complicated: the generator has to maintain some state (for example, whether the first or a later element is being processed), the i-th output additionally depends on conditions on the i-th input element, and the different input lists have to be processed differently before they are joined into the output.
Since the composition of standard iterators that solves my problem is almost incomprehensible (partly due to the one-dimensional nature of source code), I wonder whether there are any advantages to using the standard itertools generators over hand-written generator functions, mainly in more complicated cases. In fact, I think that in 90% of cases the hand-written versions are much easier to read, probably because of their more imperative style compared to the functional style of iterator chains.
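For instance, just the "first element vs. later elements" distinction already needs a flag in the hand-written version, while in the itertools style I end up expressing it with something like chain and islice (a minimal sketch; the labels are made up):

from itertools import chain, islice

def by_hand(items):
    first = True
    for x in items:
        if first:
            yield "header: %s" % x
            first = False
        else:
            yield "item: %s" % x

def with_itertools(items):
    it = iter(items)
    # take the first element separately, then treat the rest uniformly
    return chain(("header: %s" % x for x in islice(it, 1)),
                 ("item: %s" % x for x in it))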
EDIT
To illustrate my problem, here is a (toy) example: let a and b be two iterables of the same length (the input). The elements of a are integers, and the elements of b are themselves iterables whose individual elements are strings. The output should correspond to the output of the following generator function:
from itertools import *

def generator(a, b):
    first = True
    for i, s in izip(a, b):
        if first:
            yield "First line"
            first = False
        else:
            yield "Some later line"
        if i == 0:
            yield "The parameter vanishes."
        else:
            yield "The parameter is:"
            yield i
        yield "The strings are:"
        comma = False
        for t in s:
            if comma:
                yield ','
            else:
                comma = True
            yield t
If I write the same program in a functional style using generator expressions and itertools, I get something like:
from itertools import *

def generator2(a, b):
    return (z for i, s, c in izip(a, b, count())
              for y in (("First line" if c == 0 else "Some later line",),
                        ("The parameter vanishes.",) if i == 0
                            else ("The parameter is:", i),
                        ("The strings are:",),
                        islice((x for t in s for x in (',', t)), 1, None))
              for z in y)
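The least readable part for me is the comma-joining trick at the end: the inner generator expression puts a ',' in front of every string, and the islice then drops the leading comma. In isolation (taking s = "ab" from the example below), it behaves like this:

>>> from itertools import islice
>>> list(islice((x for t in "ab" for x in (',', t)), 1, None))
['a', ',', 'b']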
Example
>>> a, b = (1, 0, 2), ("ab", "cd", "ef")
>>> print([x for x in generator(a, b)])
['First line', 'The parameter is:', 1, 'The strings are:', 'a', ',', 'b', 'Some later line', 'The parameter vanishes.', 'The strings are:', 'c', ',', 'd', 'Some later line', 'The parameter is:', 2, 'The strings are:', 'e', ',', 'f']
>>> print([x for x in generator2(a, b)])
['First line', 'The parameter is:', 1, 'The strings are:', 'a', ',', 'b', 'Some later line', 'The parameter vanishes.', 'The strings are:', 'c', ',', 'd', 'Some later line', 'The parameter is:', 2, 'The strings are:', 'e', ',', 'f']
This may be more elegant than my first solution, but it looks like write-once, never-understand-later code. I am wondering whether this way of writing my generator has enough advantages to be worth doing.
PS: I think part of my problem with the functional solution is that, in order to minimize the number of keywords in Python, keywords such as "for", "if", and "else" were reused in expressions, so that their placement inside an expression takes some getting used to (the ordering of the clauses in the generator expression z for x in a for y in x for z in y looks, at least to me, less natural than the ordering in the classic nested loop for x in a: for y in x: for z in y: yield z).
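To make that last point concrete, here is the same flattening written both ways (a minimal sketch with made-up nested data):

data = [[[1, 2], [3]], [[4, 5]]]

def flatten(a):
    # classic nested loops: outermost loop first, yield at the innermost level
    for x in a:
        for y in x:
            for z in y:
                yield z

# generator expression: the for clauses appear in the same order,
# but the yielded value moves to the front
flat_expr = (z for x in data for y in x for z in y)

print(list(flatten(data)))   # [1, 2, 3, 4, 5]
print(list(flat_expr))       # [1, 2, 3, 4, 5]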