The itertools library

The asyncstdlib.itertools library implements Python’s itertools for (async) functions and (async) iterables.

Infinite iterators

async for :T in cycle(iterable: (async) iter T)[source]

An asynchronous iterator indefinitely iterating over iterable

This lazily exhausts iterable on its first pass, and recalls items from an internal buffer on subsequent passes. If iterable is empty, the resulting iterator terminates immediately.

This means items from iterable are provided immediately as they become available, even if later items are not ready yet. Subsequent passes directly provide items, without replicating any delays of the original iterable. All items produced by iterable are stored internally, which may consume significant memory.
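
Since the iterator never terminates on its own, any loop over it must be bounded explicitly. A minimal usage sketch, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    # bound the infinite cycle with islice to take just five items
    colors = [color async for color in a.islice(a.cycle(["red", "green"]), 5)]
    assert colors == ["red", "green", "red", "green", "red"]

asyncio.run(main())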

Iterator merging

async for :T in chain(*iterables: (async) iter T)[source]

An asynchronous iterator flattening values from all iterables

The resulting iterator consecutively iterates over and yields all values from each of the iterables. This is similar to converting all iterables to sequences and concatenating them, but lazily exhausts each iterable.

async for :T in chain.from_iterable(iterable: (async) iter (async) iter T)

Alternate constructor for chain() that lazily exhausts iterables as well
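
A short sketch of both constructors, assuming asyncstdlib is imported as a; arange is a trivial async generator defined only for this example:

import asyncio
import asyncstdlib as a

async def arange(n):
    for i in range(n):
        yield i

async def main():
    # chain accepts a mix of regular and async iterables
    flat = [item async for item in a.chain([0, 1], arange(2))]
    assert flat == [0, 1, 0, 1]
    # from_iterable takes a single iterable of iterables instead
    flat = [item async for item in a.chain.from_iterable([[0, 1], [2]])]
    assert flat == [0, 1, 2]

asyncio.run(main())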

async for :(T or S, ...) in zip_longest(*iterables: (async) iter T, fillvalue: S = None)[source]

Create an async iterator that aggregates elements from each of the (async) iterables

The next element of zip_longest is a tuple of the next element of each of its iterables. Shorter iterables are padded with fillvalue; zip_longest is only exhausted once all of its iterables are exhausted. This means that if zip_longest receives n iterables, with the longest having m elements, it produces an n-tuple m times.

If iterables is empty, the zip_longest iterator is empty as well. The iterables may be a mix of regular and async iterables.
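
A minimal sketch, assuming asyncstdlib is imported as a; arange is a trivial async generator defined only for this example:

import asyncio
import asyncstdlib as a

async def arange(n):
    for i in range(n):
        yield i

async def main():
    # the string is a regular iterable, arange an async one;
    # the shorter iterable is padded with fillvalue
    pairs = [pair async for pair in a.zip_longest(arange(3), "ab", fillvalue="-")]
    assert pairs == [(0, "a"), (1, "b"), (2, "-")]

asyncio.run(main())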

Iterator filtering

async for :T in compress(data: (async) iter T, selectors: (async) iter bool)[source]

An asynchronous iterator for items of data with true selectors

Lazily iterates both data and selectors pairwise, yielding only those items of data whose paired selector evaluates as true. Roughly equivalent to:

async def compress(data, selectors):
    async for item, select in zip(data, selectors):
        if select:
            yield item
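
For example, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    data = [1, 2, 3, 4, 5]
    selectors = [True, False, True, False, True]
    kept = [item async for item in a.compress(data, selectors)]
    assert kept == [1, 3, 5]

asyncio.run(main())
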
async for :T in dropwhile(predicate: (T) (await) bool, iterable: (async) iter T)[source]

Yield items from iterable after predicate(item) is no longer true

This lazily iterates over iterable, discarding items as long as evaluating predicate for the current item is true. As soon as predicate evaluates as false for the current item, this item is yielded. All subsequent items are yielded immediately as they become available, without evaluating predicate for them.
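
For example, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    # items are dropped until one fails the predicate; afterwards
    # everything passes through, including small values again
    items = [x async for x in a.dropwhile(lambda x: x < 3, [1, 2, 3, 4, 1])]
    assert items == [3, 4, 1]

asyncio.run(main())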

async for :T in takewhile(predicate: (T) (await) bool, iterable: (async) iter T)[source]

Yield items from iterable as long as predicate(item) is true

This lazily iterates over iterable, yielding items as long as evaluating predicate for the current item is true. As soon as predicate evaluates as false for the current item, no more items are yielded. Note that the first item for which predicate evaluates as false is consumed in the process; if iterable is a single-use iterator, this item is available from neither iterable nor takewhile and is effectively discarded.
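
For example, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    # iteration stops at the first item failing the predicate;
    # the trailing 1 is never reached
    items = [x async for x in a.takewhile(lambda x: x < 3, [1, 2, 3, 4, 1])]
    assert items == [1, 2]

asyncio.run(main())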

async for :T in islice(iterable: (async) iter T, stop: int)[source]
async for :T in islice(iterable: (async) iter T, start: int, stop: int, step: int = 1)[source]

An asynchronous iterator over items from iterable in a slice

Aside from the iterable, this function accepts one to three parameters as understood by slice: a single parameter stop, or up to three parameters start, stop [, step]. The first start items of iterable are discarded. Afterwards, every step'th item is yielded until a total of stop items have been fetched. This effectively is a lazy, asynchronous version of iterable[start:stop:step].
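
A minimal sketch, assuming asyncstdlib is imported as a; arange is a trivial async generator defined only for this example:

import asyncio
import asyncstdlib as a

async def arange(n):
    for i in range(n):
        yield i

async def main():
    # lazy, asynchronous version of list(range(8))[1:6:2]
    items = [x async for x in a.islice(arange(8), 1, 6, 2)]
    assert items == [1, 3, 5]

asyncio.run(main())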

Iterator transforming

async for :T in accumulate(iterable: (async) iter T, function: (T, T) (await) T = add[, initial: T])[source]

An asynchronous iterator on the running reduction of iterable

Raises

TypeError – if iterable is empty and initial is not given

This is conceptually equivalent to reduce() in that it applies a reduction function iteratively on the iterable. However, the iterator yields the running reduction value as each value is fetched from iterable.

The function defaults to operator.add, providing a running sum. If an initial value is provided, it is the first value processed and yielded. Provided that all parameters are given and valid, this is roughly equivalent to:

async def accumulate(iterable, function, *, initial):
    current = initial
    yield current
    async for value in iterable:
        current = await function(current, value)
        yield current
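
For example, a running sum with the default operator.add, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    sums = [s async for s in a.accumulate([1, 2, 3, 4])]
    assert sums == [1, 3, 6, 10]
    # with an initial value, it is processed and yielded first
    sums = [s async for s in a.accumulate([1, 2, 3, 4], initial=10)]
    assert sums == [10, 11, 13, 16, 20]

asyncio.run(main())
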
async for :T in starmap(function: (*A) (await) T, iterable: (async) iter (A, ...))[source]

An asynchronous iterator applying a function to arguments from an iterable

This is conceptually similar to map(), but applies a single iterable of multiple arguments instead of multiple iterables of a single argument each. The same way that function(a, b) can be generalized to map(function, iter_a, iter_b), function(*c) can be generalized to starmap(function, iter_c).
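
For example, assuming asyncstdlib is imported as a:

import asyncio
import operator
import asyncstdlib as a

async def main():
    # each tuple is unpacked into the arguments of operator.mul
    products = [p async for p in a.starmap(operator.mul, [(2, 3), (4, 5)])]
    assert products == [6, 20]

asyncio.run(main())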

Iterator splitting

for :(async iter T, ...) in tee(iterable: (async) iter T, n: int = 2)

Create n separate asynchronous iterators over iterable

This splits a single iterable into multiple iterators, each providing the same items in the same order. All child iterators may advance separately but share the same items from iterable – when the most advanced iterator retrieves an item, it is buffered until the least advanced iterator has yielded it as well. A tee works lazily and can handle an infinite iterable, provided that all iterators advance.

import operator
import asyncstdlib as a

async def derivative(sensor_data):
    previous, current = a.tee(sensor_data, n=2)
    await a.anext(current)  # let one copy run one item ahead
    return a.map(operator.sub, current, previous)

Unlike itertools.tee(), tee() returns a custom type instead of a tuple. Like a tuple, it can be indexed, iterated and unpacked to get the child iterators. In addition, its aclose() method immediately closes all children, and it can be used in an async with context for the same effect.
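
For example, a sketch that unpacks the children directly in an async with block, assuming asyncstdlib is imported as a and arange as a trivial async generator defined only for this example:

import asyncio
import asyncstdlib as a

async def arange(n):
    for i in range(n):
        yield i

async def main():
    # both children are closed automatically when the block exits
    async with a.tee(arange(3), n=2) as (first, second):
        assert [x async for x in first] == [0, 1, 2]
        assert [x async for x in second] == [0, 1, 2]

asyncio.run(main())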

If iterable is an iterator and read elsewhere, tee will not provide these items. Also, tee must internally buffer each item until the last iterator has yielded it; if the most and least advanced iterators differ by most of the data, it is faster (though not lazy) to materialize iterable into a list instead.

If the underlying iterable is concurrency safe (anext may be awaited concurrently) the resulting iterators are concurrency safe as well. Otherwise, the iterators are safe if there is only ever one single “most advanced” iterator.

async for :(T, T) in pairwise(iterable: (async) iter T)[source]

Yield successive, overlapping pairs of items from iterable

Pairs are yielded as soon as the newest item is available from iterable. No pair is emitted if iterable has one or zero items; however, if there is exactly one item, pairwise will wait for and consume it before finishing.
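
For example, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    pairs = [p async for p in a.pairwise([1, 2, 3, 4])]
    assert pairs == [(1, 2), (2, 3), (3, 4)]
    # one item or less means no pairs at all
    assert [p async for p in a.pairwise([1])] == []

asyncio.run(main())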

New in version 3.10.0.

async for :(T, async iter T) in groupby(iterable: (async) iter T)[source]
async for :(R, async iter T) in groupby(iterable: (async) iter T, key: (T) (await) R)[source]

Create an async iterator over consecutive keys and groups from the (async) iterable

The groups generated by groupby are consecutive with respect to the original (async) iterable. That is, multiple groups may have the same key if there is any intermediate group with a different key. For example, the iterable 1, 1, 1, 2, 2, 1 is split into the groups 1, 2, 1.
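
For example, assuming asyncstdlib is imported as a:

import asyncio
import asyncstdlib as a

async def main():
    groups = []
    async for key, group in a.groupby([1, 1, 1, 2, 2, 1]):
        # each group must be consumed before advancing to the next key
        groups.append((key, [item async for item in group]))
    assert groups == [(1, [1, 1, 1]), (2, [2, 2]), (1, [1])]

asyncio.run(main())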

The async iterator returned by groupby as well as the async iterators of each group share the same underlying iterator. This means that previous groups are no longer accessible once the groupby iterator advances to the next group. In particular, it is not safe to concurrently advance both the groupby iterator itself and any of its group iterators.

In contrast to the original itertools.groupby(), it is generally not useful to sort iterable by key beforehand. Since both values and keys are required up-front for sorting, this loses the advantage of asynchronous, lazy iteration and evaluation.

New in version 1.1.0.