Published June 18, 2025
I've known about and loved using Generator functions for a long time. At PyCon this year someone (and I don't remember who) explained to me what makes them tick.
Essentially, what enables generators is that Python's stack frames actually live on the heap. That means that when a generator
yields, it moves its stack frame off the call stack as an O(1) operation. When it needs to resume, it's once again an
O(1) operation to put it back on the stack. Calling a function, however, takes a fair amount more time, because a new stack
frame has to be created.
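A rough way to feel this difference is to time repeated plain calls against repeated `.send()` calls on an already-primed generator. This is just a sketch (the `add`/`add_gen` names and the loop counts are mine, and the actual numbers will vary by Python version and machine):

```python
import timeit

def add(a, b):
    return a + b

def add_gen():
    # Generator version: the frame is created once, then suspended/resumed.
    a, b = yield
    while True:
        a, b = yield a + b

def call_function(n):
    # A fresh stack frame is created for every call to add().
    return [add(1, 2) for _ in range(n)]

def call_generator(n):
    g = add_gen()
    next(g)  # prime: run to the first bare yield
    # Each send() just puts the existing frame back on the stack.
    return [g.send((1, 2)) for _ in range(n)]

t_func = timeit.timeit(lambda: call_function(10_000), number=10)
t_gen = timeit.timeit(lambda: call_generator(10_000), number=10)
print(f'plain calls: {t_func:.4f}s, generator sends: {t_gen:.4f}s')
```

Which one wins, and by how much, depends on the interpreter; the point is only that both loops compute the same results by very different frame-management paths.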
But wait a second - I need to pass parameters into my function, so how can I do that with a "running" generator? That's actually pretty
simple. The yield expression can both send a value out (what it's yielding) and receive a result. When we call the
generator's .send method, the argument to .send becomes the result of the yield.
This allows for something like this:
```python
from collections.abc import Generator
from random import randint

def my_generator() -> Generator:
    """Generator function that adds the two values sent to it."""
    param1, param2 = yield
    while True:
        result = param1 + param2
        print(f'{param1} + {param2} = {result}')
        param1, param2 = yield result

my_gen = my_generator()  # Initialize the generator. It hasn't actually done anything yet.
next(my_gen)             # Prime the generator. It runs to the first `yield` (before the `while` loop) and then waits.
y = my_gen.send((1, 2))  # This actually runs the code in the `while` loop. `y` will be 3 and it will print:
                         # 1 + 2 = 3
for n in range(10):
    y = my_gen.send((randint(1, 500), y))
```
The above code will give output like:
```
1 + 2 = 3
347 + 3 = 350
206 + 350 = 556
149 + 556 = 705
474 + 705 = 1179
153 + 1179 = 1332
205 + 1332 = 1537
14 + 1537 = 1551
58 + 1551 = 1609
428 + 1609 = 2037
280 + 2037 = 2317
```
Note that my_gen.send is passed a tuple. This is because send takes exactly one argument. That's not a big deal, since
we can easily send a tuple (or some other container) to simulate multiple parameters.
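If tuple unpacking feels too positional, the same trick works with a dict to simulate keyword arguments. This `formatter` example is my own hypothetical illustration, not from the original post:

```python
def formatter():
    # send() delivers exactly one object, so pack "keyword arguments" into a dict
    kwargs = yield
    while True:
        kwargs = yield '{greeting}, {name}!'.format(**kwargs)

fmt = formatter()
next(fmt)  # prime: run to the first bare yield
print(fmt.send({'greeting': 'Hello', 'name': 'Ada'}))    # Hello, Ada!
print(fmt.send({'greeting': 'Hi', 'name': 'Guido'}))     # Hi, Guido!
```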
This is the key point (and the one I actually learned today): since the initial call to my_generator already invoked the function
and created the stack frame, the subsequent send calls avoid that overhead. So if we have a function that we need to call 100 times,
we can significantly improve the time performance with this method, at the cost of memory. Depending on the footprint of the
function it might not be a big deal, but it is something to be aware of: when a generator yields, it does not release the
references to variables in its scope - that's the whole point. So if your function holds large collections in its scope, this
is probably not the best solution.
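You can see that retention directly with a weak reference. This sketch (with a made-up `BigBlob` stand-in for a large object) assumes CPython's reference counting, where closing the generator tears down its frame and releases what it held:

```python
import weakref

class BigBlob:
    """Stand-in for a large object held in the generator's frame."""

def worker():
    blob = BigBlob()   # lives in the suspended frame between yields
    while True:
        yield blob

g = worker()
ref = weakref.ref(next(g))   # weak reference to the blob inside the frame
print(ref() is not None)     # True: the suspended frame keeps blob alive
g.close()                    # GeneratorExit tears the frame down
print(ref() is None)         # True (on CPython): blob freed with the frame
```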
While this is a pretty silly example, it could be useful if you need to call an expensive function many times with arguments that don't repeat, so memoizing it with functools.cache wouldn't help. I'm sure that if I try hard enough I can find places to leverage this in my code, but I would need to be actively looking for those places.
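For contrast, when the arguments do repeat, `functools.cache` is the simpler tool - a minimal sketch (the `add` function here is just an illustration):

```python
from functools import cache

@cache
def add(a, b):
    # With repeating arguments, the body runs once per distinct (a, b)
    return a + b

add(1, 2)   # computes and stores the result
add(1, 2)   # cache hit: the body (and its stack frame) is skipped entirely
print(add.cache_info())
```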
Source: Python Discourse Forums
More information on Generators: