Chris Umbel

Easy Concurrency with Stackless Python

Since messing around with Erlang over the last couple of months I've been very impressed with its simplicity in managing concurrency. Erlang's message-passing infrastructure lets the locking just work without forcing developers to do anything unnatural in code for synchronization. You almost let the problem define the locking behavior for you.

Such elegance isn't limited to Erlang, oh no. Python has also been shown some concurrency love by way of a specialized implementation called Stackless Python.

Stackless Python avoids using C's call stack and relies on lightweight microthreads rather than operating system threads or processes for concurrency. Channels (similar to Erlang's message passing) are used for synchronization and communication between tasklets (the implementation of microthreads).
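
Here's a minimal sketch of a channel in action before the bigger demonstration (a toy example of my own, not anything from the Stackless distribution): one tasklet sends a string, another receives it, and each side blocks until the other is ready.

import stackless

def sender(channel):
    # blocks until a receiver is waiting on the channel
    channel.send('hello from a tasklet')

def receiver(channel):
    # blocks until a sender delivers a message
    print(channel.receive())

channel = stackless.channel()
stackless.tasklet(sender)(channel)
stackless.tasklet(receiver)(channel)
stackless.run()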

Note that Stackless Python is a cooperative multitasking framework, so you have the responsibility of explicitly relinquishing control back to the scheduler with stackless.schedule().
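
To see what that yielding looks like, here's another tiny sketch of my own: two tasklets take turns by calling stackless.schedule(). Leave those calls out and the first tasklet would run to completion before the second got a turn.

import stackless

def chatty(name):
    for i in range(3):
        print('%s: %d' % (name, i))
        # hand control back to the scheduler so the other tasklet runs
        stackless.schedule()

stackless.tasklet(chatty)('first')
stackless.tasklet(chatty)('second')
stackless.run()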

See the small demonstration below.

#!/usr/bin/env python
import stackless

log_file = open('log.txt', 'w+')

# function to perform some CPU-bound work
def worker(log_channel):
    while True:
        # stand-in for real CPU-bound logic
        results = sum(i * i for i in range(100000))

        # yield control so other tasklets get a turn
        stackless.schedule()

        # stand-in for a bit more CPU-bound logic
        results += sum(i * i for i in range(100000))

        # queue up a message containing the results
        # of the computation
        log_channel.send('results: %d\n' % results)
        
# function to log results received from the workers
def logger(log_channel):
    while True:
        # block until a message is received
        message = log_channel.receive()
        # write message to disk
        log_file.write(message)

# create a channel for inter-tasklet messaging
log_channel = stackless.channel()

# fire up two worker tasklets
worker1 = stackless.tasklet(worker)(log_channel)
worker2 = stackless.tasklet(worker)(log_channel)

# fire up a tasklet to receive messages from the workers
listener = stackless.tasklet(logger)(log_channel)

# main loop: run the scheduler (the tasklets above loop forever,
# so in this demo run() won't return on its own)
stackless.run()

log_file.close()

A lot like Erlang's message passing, no? This is clearly simpler than Python's native threading and carries far less overhead than operating system threads.
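
For comparison, a rough sketch of the same worker/logger arrangement using the standard library's threading and Queue modules might look something like this (my own approximation, not a line-for-line port):

import threading
import Queue  # renamed to "queue" in Python 3

log_queue = Queue.Queue()
log_file = open('log.txt', 'w+')

# stand-in for real CPU-bound logic, pushing results onto a queue
def worker():
    while True:
        results = sum(i * i for i in range(100000))
        log_queue.put('results: %d\n' % results)

# block until a worker puts something on the queue, then log it
def logger():
    while True:
        log_file.write(log_queue.get())

# one full operating system thread per worker/logger
for target in (worker, worker, logger):
    threading.Thread(target=target).start()

The shape is similar, but each of those is a real operating system thread contending for the GIL, while the tasklets above are cheap, cooperatively scheduled microthreads.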

Note that this stuff is used in real life by video game projects such as EVE Online and the Sylphis3D game engine. You know, the kind of people who really know concurrency.

Sat Oct 03 2009 21:11:00 GMT+0000 (UTC)
