My understanding of `yield from` is that it is similar to yielding every item from an iterable. Yet, I observe different behavior in the following example.
I have `Class1`:

```python
class Class1:
    def __init__(self, gen):
        self.gen = gen

    def __iter__(self):
        for el in self.gen:
            yield el
```
and `Class2`, which differs only in replacing the `yield` in the `for` loop with `yield from`:
```python
class Class2:
    def __init__(self, gen):
        self.gen = gen

    def __iter__(self):
        yield from self.gen
```
The code below reads the first element from an instance of a given class and then reads the rest in a for loop:
```python
a = Class1((i for i in range(3)))
print(next(iter(a)))
for el in iter(a):
    print(el)
```
This produces different outputs for `Class1` and `Class2`. For `Class1` the output is

```
0
1
2
```

and for `Class2` the output is

```
0
```

What is the mechanism behind `yield from` that produces different behavior?
## What Happened?
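The difference can be observed directly by instrumenting the inner generator with a `try`/`finally`; a minimal sketch (the `make_gen` helper is mine, and CPython's immediate reference-counting collection of the anonymous iterator is assumed):

```python
class Class1:
    def __init__(self, gen):
        self.gen = gen
    def __iter__(self):
        for el in self.gen:
            yield el

class Class2:
    def __init__(self, gen):
        self.gen = gen
    def __iter__(self):
        yield from self.gen

def make_gen(log):
    # the finally clause runs when the generator is exhausted *or* .close()d
    try:
        for i in range(3):
            yield i
    finally:
        log.append("closed")

log1, log2 = [], []
a, b = Class1(make_gen(log1)), Class2(make_gen(log2))

print(next(iter(a)), next(iter(b)))  # 0 0
# Both anonymous outer iterators are now collected and close()d, but only
# Class2's yield from propagates that close to the inner generator.
print(log2)           # ['closed']
print(list(iter(a)))  # [1, 2]
print(list(iter(b)))  # []
```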
When you use `next(iter(instance_of_Class2))`, `iter()` calls `.close()` on the inner generator when it (the iterator, not the generator!) goes out of scope and is deleted, while with `Class1`, `iter()` only closes its own instance.

This behavior is described in PEP 342, in two parts: the `.close()` method itself (well, new to Python 2.5), and the guarantee that `close()` is called when a generator is garbage-collected.

What happens is a little clearer (if perhaps surprising) when multiple generator delegations occur; only the generator being delegated to is closed when its wrapping `iter` is deleted.

## Fixing Class2

What options are there to make `Class2` behave more the way you expect? Notably, other strategies work, though they don't have the visually pleasing sugar of `yield from` or some of its potential benefits:

- yielding each value in a `for` loop gives you a way to interact with the values, which seems like a primary benefit
- if you don't interact with the generator and don't intend to keep a reference to the iterator, why bother wrapping it at all? (see above comment about interacting)
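For instance (a sketch of that second option, with a hypothetical `Class3` name): if the wrapper does nothing with the values, `__iter__` can simply return the underlying generator instead of delegating to it, so no extra generator ever exists to be closed:

```python
class Class3:
    def __init__(self, gen):
        self.gen = gen

    def __iter__(self):
        # A plain method returning the iterator itself, not a generator
        # function, so iter(instance) creates nothing new to be close()d.
        return self.gen

a = Class3(i for i in range(3))
print(next(iter(a)))  # 0
print(list(iter(a)))  # [1, 2]
```

Alternatively, simply binding the iterator to a name (`it = iter(a)`) keeps the wrapping generator alive, and `Class2` then behaves as expected.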
Two further notes: while testing this, it plausibly highlights some `iter()` inconsistency - see comments below (i.e. why isn't `e` closed?), and `yield from` is also an opportunity to pass multiple generators with `itertools.chain.from_iterable`.

## Hunting the Mystery
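The symptom is easy to reproduce (a sketch, using `Class2` as defined in the question):

```python
class Class2:
    def __init__(self, gen):
        self.gen = gen
    def __iter__(self):
        yield from self.gen

a = Class2(i for i in range(3))
print(next(iter(a)))  # 0
try:
    # a brand-new outer generator, but the shared inner generator
    # was already close()d when the first one was collected
    next(iter(a))
except StopIteration:
    print("StopIteration: the inner generator is closed")
```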
A better clue is that if you directly try again, `next(iter(instance))` raises `StopIteration`, indicating the generator is permanently closed (either through exhaustion or `.close()`), which is why iterating over it with a `for` loop yields no more values.

However, if we name the iterator, it works as expected.

To me, this suggests that when the iterator doesn't have a name, it calls `.close()` when it goes out of scope.

Disassembling the result, we find the internals are a little different.
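The two `__iter__` implementations can be compared with the `dis` module; a sketch (exact opcode names vary across CPython versions, so the check at the end only relies on the instruction sets differing):

```python
import dis

def iter_with_for(gen):
    for el in gen:
        yield el

def iter_with_yield_from(gen):
    yield from gen

dis.dis(iter_with_for)         # uses GET_ITER / FOR_ITER
dis.dis(iter_with_yield_from)  # uses GET_YIELD_FROM_ITER on recent CPython

ops_for = {i.opname for i in dis.get_instructions(iter_with_for)}
ops_yf = {i.opname for i in dis.get_instructions(iter_with_yield_from)}
print(ops_yf - ops_for)  # the delegation-specific opcodes
```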
Notably, the `yield from` version has `GET_YIELD_FROM_ITER` (subtly, the `YIELD_FROM` opcode appears to have been removed in 3.11). So if the given iterable (to the class) is a generator iterator, it'll be handed off directly, giving the result we (might) expect.
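One visible consequence of that direct hand-off is that `send()` (and `throw()`/`close()`) pass straight through to the delegated generator; a sketch with hypothetical `inner`/`outer` names:

```python
def inner(sink):
    # records every value sent into it
    while True:
        sink.append((yield))

def outer(sink):
    yield from inner(sink)

sent = []
g = outer(sent)
next(g)          # advance to the first yield inside inner
g.send("hello")  # passes straight through the yield from to inner
g.send("world")
print(sent)      # ['hello', 'world']
```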
## Extras

- Passing an iterator which isn't a generator (`iter()` creates a new iterator each time in both cases)
- Expressly closing `Class1`'s internal generator
- The generator is only closed by `iter` when deleted if the instance is popped
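The first extra can be checked directly; a sketch (list iterators have no `close()` method, so there is nothing for the delegation to close when the outer generator is collected):

```python
class Class2:
    def __init__(self, gen):
        self.gen = gen
    def __iter__(self):
        yield from self.gen

b = Class2(iter([0, 1, 2]))  # a list_iterator, not a generator
print(next(iter(b)))  # 0
print(list(iter(b)))  # [1, 2] -- the remaining values survive this time
```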