Easy Concurrency with Stackless Python

Writing concurrent programs is hard. The concept is relatively straightforward; after all, we live concurrently. However, thinking concurrently in a program requires the programmer to anticipate lots of events that are never an issue in a single-threaded, single-process procedural program.

Python 2.5 provides two ways to write concurrent code: threads and forked processes. Both standard libraries have been available for years and are stable and usable. That said, working with them demands a good deal of attention to detail: grabbing and releasing locks and coordinating communication between threads can be a daunting task.
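
For comparison, here is a small sketch (not from the article) of the kind of bookkeeping the standard threading module asks of you: a shared counter that has to be protected with an explicit lock.

import threading

counter = 0
lock = threading.Lock()

def work():
   """increment the shared counter, locking around each update"""
   global counter
   for i in range(1000):
      lock.acquire()      # forget this, and updates from other threads can be lost
      try:
         counter += 1
      finally:
         lock.release()

threads = [threading.Thread(target=work) for i in range(5)]
for t in threads:
   t.start()
for t in threads:
   t.join()

print "counter is", counter   # 5000, but only because of the lock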

Stackless Python is an alternate distribution of Python that does away with the C call stack, and adds new “micro threading” libraries that provide a simpler interface to write threaded Python code.

The most vocal users of Stackless Python are the developers at CCP Games, makers of Eve Online, a massively multiplayer online spaceship combat game. In presentations at Pycon, they’ve attributed much of their scalability and developer agility to the speed and ease of use of Stackless. Their game cluster, written in Stackless Python, regularly handles 30,000+ users.

The Stackless concurrency model uses three concepts to provide its services: tasklets, a scheduler, and channels. You’ll cover each concept one at a time, and then tie them together in a small example of bees looking for flowers.

Tasklets, the Building Blocks of Micro-Threads

Tasklets are the fundamental unit of work in the Stackless world. A tasklet is an object that knows how to pause and resume itself (using the scheduler), and can send messages to and receive messages from other tasklets (using channels).

For your demo code, you’ll start by writing a basic bee object that creates a tasklet for itself:

import stackless
import random

class Bee(object):
   """a bee, bees love nectar"""
   def __init__(self, id, flowers):
      self.nectar = 0
      self.id = id
      self.flowers = flowers
      # hand my find_nectar method to the scheduler as a new tasklet
      stackless.tasklet(self.find_nectar)()

   def find_nectar(self):
      """my brain: keep visiting flowers until I'm full of nectar"""
      while self.nectar < 5:
         flower = random.choice(self.flowers)   # pick one of my flowers at random
         flower.channel.send(self)              # blocks until the flower receives me
         print "Bee %s got nectar from Flower %s, I now have %s nectar" % \
            (self.id, flower.id, self.nectar)

The bee object takes an ID (a number) and a list of flowers as arguments. It creates a tasklet, passing its own find_nectar method as the tasklet runner. The tasklet is added to the Stackless scheduler, which starts and stops it as it works through its queue.
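
If the one-liner in __init__ looks dense, here is a minimal standalone sketch of tasklet creation (the greeter function is a made-up example, not part of the bee code): you wrap a callable with stackless.tasklet, then call the result with the arguments the callable should receive.

import stackless

def greeter(name):
   """a trivial tasklet body"""
   print "hello from", name

# wrap the callable, then bind its arguments; the tasklet joins the scheduler queue
stackless.tasklet(greeter)("worker 1")
stackless.tasklet(greeter)("worker 2")

stackless.run()   # run the queue until it's empty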

The Scheduler, an Ever-Changing Work Queue

The standard Stackless scheduler is a round-robin queue. When a tasklet is added, it’s put at the bottom of the queue. When you call stackless.run() (which you’ll do later on in the example), the queue is executed in order. As each tasklet executes, it can add itself back to the end of the queue, as shown in the sketch below. Other actions the tasklet takes, such as listening on a communication channel, “block” the tasklet and allow the next tasklet to begin working.
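
A short sketch of that round-robin behavior, separate from the bee example (the counter function is made up for illustration): each tasklet calls stackless.schedule() to pause itself and move to the back of the queue, so the two loops interleave.

import stackless

def counter(name):
   """count to three, yielding to the other tasklets after each step"""
   for i in range(3):
      print name, i
      stackless.schedule()   # pause me and move me to the end of the queue

stackless.tasklet(counter)("first")
stackless.tasklet(counter)("second")
stackless.run()

# prints: first 0, second 0, first 1, second 1, first 2, second 2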

In this example, Bees are buzzing around, picking flowers at random, and drawing nectar from them. In the find_nectar method, the bee chooses a random flower, and then sends itself to the flower via a channel call.

Channels, Easy to Use Communication Between Tasklets

Channels are like walkie-talkies between tasklets. They allow any object to be sent to a receiving tasklet. Create your Flower object, and give it a channel to receive communications from your bee.

class Flower(object):
   """a flower, has nectar"""
   def __init__(self, id):
      self.id = id
      self.channel = stackless.channel()        # bees send themselves here
      stackless.tasklet(self.bee_receiver)()    # schedule my listener as a tasklet

   def bee_receiver(self):
      """receives a bee"""
      while True:
         print "Flower %s is listening" % self.id
         bee = self.channel.receive()           # blocks until a bee sends itself
         bee.nectar = bee.nectar + 1
         print "Flower %s is giving Bee %s nectar" % \
            (self.id, bee.id)

The flower class is quite similar to the bee class. In the __init__ block, you create a channel, and then add the ‘bee_receiver’ method to the scheduler as a tasklet. ‘bee_receiver’ immediately blocks, waiting for a channel message from a bee. Because the bee sends itself as the message, you can access the bee’s methods and properties directly. The flower adds one unit of nectar to the bee, and then prints out some diagnostic information. The loop then comes back around and blocks again, waiting for another message from a bee.
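
The key property of channels is that send and receive are rendezvous points: a sender blocks until some tasklet is ready to receive, and a receiver blocks until some tasklet sends. A minimal sketch, separate from the bee example (sender and receiver are made-up names):

import stackless

channel = stackless.channel()

def sender():
   print "sending"
   channel.send("a message")   # blocks until someone is ready to receive
   print "sent"

def receiver():
   print "waiting"
   msg = channel.receive()     # blocks until someone sends
   print "received", msg

stackless.tasklet(sender)()
stackless.tasklet(receiver)()
stackless.run()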

Finally, you need to initialize your objects, and get them to start talking to one another:

# let's start the world
flowers = []
# add 5 flowers
for id in range(5):
   flowers.append(Flower(id))

# add 5 bees
for id in range(5):
   bee = Bee(id, flowers)

stackless.run()

First, you initialize five flowers, and then five bees. As each bee or flower is initialized, it is added to the Stackless scheduler through the call to stackless.tasklet in its __init__ method. So, at the end of the initialization, the Stackless scheduler’s queue looks like this:

flower 0
flower 1
flower 2
flower 3
flower 4
bee 0
bee 1
bee 2
bee 3
bee 4

Then, you start running through the queue with the call to stackless.run(). First, the five flowers start listening on their channels. When the queue reaches bee 0, bee 0 picks a random flower from its list of flower objects and sends itself as a message. The receiving flower updates the bee’s nectar count and then goes back to listening. The bee, once its message has been delivered, goes back to the end of the queue, loops around, and sends itself on another random flower’s channel. It continues to do this until it has filled up on nectar.

Each channel keeps its own queue of senders waiting to deliver their messages. As the receiving tasklet (in this case, the flower) works through that queue, new senders simply line up behind the ones already waiting. In that way, no messages are lost, and no single message blocks the entire flow of the program.

Conclusion

Although this is obviously a trivial example, it covers all the essentials of what Stackless provides. With three simple tools (tasklets, channels, and a scheduler), you can build a wide array of concurrent programs.

About the Author

Chris McAvoy is a developer for PSC Group LLC in Chicago, Illinois. He specializes in Python and Ruby web applications. He also keeps a blog at http://weblog.lonelylion.com.
