
Coroutines In Python


You might wonder how coroutines differ from threads; both seem to do the same job. With threads, it is the operating system (or the runtime environment) that switches between threads according to its scheduler. With coroutines, it is the programmer and the programming language that decide when to switch. Coroutines multitask cooperatively, suspending and resuming at points chosen by the programmer.
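To make the difference concrete, here is a minimal sketch (my own illustration, not taken from the sources quoted below) in which two plain generators are switched explicitly by the calling code rather than by an OS scheduler:

def ping():
    while True:
        print('ping')
        yield            # suspend here; the caller decides when to resume

def pong():
    while True:
        print('pong')
        yield

# The "scheduler" is ordinary code: it resumes each coroutine in turn.
tasks = [ping(), pong()]
for _ in range(3):
    for t in tasks:
        next(t)          # prints ping, pong, ping, pong, ping, pong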

Exception handling can also be viewed as a case of "frozen coroutines": if an exception occurs, the coroutine is released.

The new syntax is yield from (PEP 380), which enables true coroutines in Python 3.3 and later.
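As a quick illustration (a minimal sketch, requiring Python 3.3+), yield from delegates to a subgenerator and captures the value it returns:

>>> def inner():
...     yield 1
...     yield 2
...     return 'inner done'           # the return value travels inside StopIteration
...
>>> def outer():
...     result = yield from inner()   # delegate to inner(); collect its return value
...     yield result
...
>>> list(outer())
[1, 2, 'inner done']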

David Beazley has published several presentations devoted to Python generators and coroutines; see the links collected at the end of this page.

There are concurrency examples in Chapter 18 of Fluent Python. Here is a quote from Fluent Python, Chapter 16:

The infrastructure for coroutines appeared in PEP 342 — Coroutines via Enhanced Generators, implemented in Python 2.5 (2006): since then, the yield keyword can be used in an expression, and the .send(value) method was added to the generator API. Using .send(…), the caller of the generator can post data that then becomes the value of the yield expression inside the generator function. This allows a generator to be used as a coroutine: a procedure that collaborates with the caller, yielding and receiving values from the caller.

In addition to .send(…), PEP 342 also added .throw(…) and .close() methods that respectively allow the caller to throw an exception to be handled inside the generator, and to terminate it. These features are covered in the next section and in “Coroutine Termination and Exception Handling”.
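A minimal sketch (my own, not from the book) of how .throw(…) and .close() look from the caller's side:

>>> def echo():
...     try:
...         while True:
...             received = yield
...             print('got:', received)
...     except ValueError:
...         print('handled ValueError inside the coroutine')
...     finally:
...         print('cleanup on close or exhaustion')
...
>>> co = echo()
>>> next(co)                # prime the coroutine
>>> co.send('hello')
got: hello
>>> co.throw(ValueError)    # the exception is raised at the suspended yield
handled ValueError inside the coroutine
cleanup on close or exhaustion
Traceback (most recent call last):
  ...
StopIteration
>>> co2 = echo()
>>> next(co2)
>>> co2.close()             # raises GeneratorExit at the yield; the caller sees no error
cleanup on close or exhaustion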

The latest evolutionary step for coroutines came with PEP 380 - Syntax for Delegating to a Subgenerator, implemented in Python 3.3 (2012). PEP 380 made two syntax changes to generator functions, to make them more useful as coroutines:

- A generator can now return a value; previously, providing a value to the return statement inside a generator raised a SyntaxError.
- The new yield from syntax lets complex generators be refactored into smaller, nested generators, delegating work to a subgenerator and receiving its return value.

These latest changes will be addressed in “Returning a Value from a Coroutine” and “Using yield from”.

Let’s follow the established tradition of Fluent Python and start with some very basic facts and examples, then move into increasingly mind-bending features.

Basic Behavior of a Generator Used as a Coroutine

Example 16-1  illustrates the behavior of a coroutine.

Example 16-1. Simplest possible demonstration of coroutine in action
>>> def simple_coroutine():  
...     print('-> coroutine started')
...     x = yield  
...     print('-> coroutine received:', x)
...
>>> my_coro = simple_coroutine()
>>> my_coro  
<generator object simple_coroutine at 0x100c2be10>
>>> next(my_coro)  
-> coroutine started
>>> my_coro.send(42)  
-> coroutine received: 42
Traceback (most recent call last):  
  ...
StopIteration
A coroutine is defined as a generator function: one with yield in its body.
 
yield is used in an expression; when the coroutine is designed just to receive data from the client it yields None—this is implicit because there is no expression to the right of the yield keyword.
As usual with generators, you call the function to get a generator object back.
The first call is next(…) because the generator hasn’t started so it’s not waiting in a yield and we can’t send it any data initially.
This call makes the yield in the coroutine body evaluate to 42; now the coroutine resumes and runs until the next yield or termination.
In this case, control flows off the end of the coroutine body, which prompts the generator machinery to raise StopIteration, as usual.

A coroutine can be in one of four states. You can determine the current state using the inspect.getgeneratorstate() function, which returns one of these strings:

'GEN_CREATED'
Waiting to start execution.
'GEN_RUNNING'
Currently being executed by the interpreter.
'GEN_SUSPENDED'
Currently suspended at a yield expression.
'GEN_CLOSED'
Execution has completed.
Note that if you try to send a value other than None to a coroutine that has not yet started, you get a quite clear error message: TypeError: can't send non-None value to a just-started generator.

The initial call next(my_coro) is often described as “priming” the coroutine (i.e., advancing it to the first yield to make it ready for use as a live coroutine).
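Because forgetting to prime a coroutine is a common mistake, a small priming decorator is often used. Here is a sketch (under the assumption that every decorated coroutine should be advanced to its first yield automatically):

>>> from functools import wraps
>>> def coroutine(func):
...     """Decorator: prime func by advancing it to the first yield."""
...     @wraps(func)
...     def primer(*args, **kwargs):
...         gen = func(*args, **kwargs)
...         next(gen)
...         return gen
...     return primer
...
>>> @coroutine
... def echo():
...     while True:
...         received = yield
...         print('->', received)
...
>>> e = echo()          # already primed; no explicit next() needed
>>> e.send('hi')
-> hi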

To get a better feel for the behavior of a coroutine, an example that yields more than once is useful. See Example 16-2.

Example 16-2. A coroutine that yields twice
>>> def simple_coro2(a):
...     print('-> Started: a =', a)
...     b = yield a
...     print('-> Received: b =', b)
...     c = yield a + b
...     print('-> Received: c =', c)
...
>>> my_coro2 = simple_coro2(14)
>>> from inspect import getgeneratorstate
>>> getgeneratorstate(my_coro2)  
'GEN_CREATED'
>>> next(my_coro2)  
-> Started: a = 14
14
>>> getgeneratorstate(my_coro2)  
'GEN_SUSPENDED'
>>> my_coro2.send(28)  
-> Received: b = 28
42
>>> my_coro2.send(99)  
-> Received: c = 99
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration
>>> getgeneratorstate(my_coro2)  
'GEN_CLOSED'
inspect.getgeneratorstate reports GEN_CREATED (i.e., the coroutine has not started).
Advance the coroutine to the first yield, printing the -> Started: a = 14 message, then yielding the value of a and suspending until a value is sent for b.
getgeneratorstate reports GEN_SUSPENDED (i.e., the coroutine is paused at a yield expression).
Send number 28 to suspended coroutine; the yield expression evaluates to 28 and that number is bound to b. The -> Received: b = 28 message is displayed, the value of a + b is yielded (42), and the coroutine is suspended waiting for the value to be assigned to c.
Send number 99 to the suspended coroutine; the yield expression evaluates to 99 and that number is bound to c. The -> Received: c = 99 message is displayed, then the coroutine terminates, causing the generator object to raise StopIteration.
getgeneratorstate reports GEN_CLOSED (i.e., the coroutine execution has completed).
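As a slightly more practical sketch (in the spirit of the running-average coroutine used elsewhere in Fluent Python; this particular transcript is mine), a coroutine can keep state between send() calls:

>>> def averager():
...     total = 0.0
...     count = 0
...     average = None
...     while True:
...         value = yield average    # yield the current average, receive the next value
...         total += value
...         count += 1
...         average = total / count
...
>>> avg = averager()
>>> next(avg)           # prime it; yields None since there is no data yet
>>> avg.send(10)
10.0
>>> avg.send(30)
20.0
>>> avg.send(5)
15.0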

 



Old News ;-)

[Oct 14, 2019] Coroutines and Tasks -- Python 3.7.5rc1 documentation

Oct 14, 2019 | docs.python.org

Coroutines declared with async/await syntax are the preferred way of writing asyncio applications. For example, the following snippet of code (requires Python 3.7+) prints "hello", waits 1 second, and then prints "world":

>>>

>>> import asyncio

>>> async def main():
...     print('hello')
...     await asyncio.sleep(1)
...     print('world')

>>> asyncio.run(main())
hello
world

Note that simply calling a coroutine will not schedule it to be executed:

>>>
>>> main()
<coroutine object main at 0x1053bb7c8>

To actually run a coroutine, asyncio provides three main mechanisms:

- the asyncio.run() function to run the top-level entry point (such as the main() function above);
- awaiting on a coroutine from another coroutine;
- the asyncio.create_task() function to run coroutines concurrently as asyncio Tasks.
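For example, here is a short sketch (assuming Python 3.7+, like the rest of this excerpt) of the third mechanism, asyncio.create_task(), which schedules coroutines to run concurrently:

import asyncio

async def say_after(delay, what):
    await asyncio.sleep(delay)
    print(what)

async def main():
    # Both tasks are scheduled immediately and run concurrently.
    task1 = asyncio.create_task(say_after(1, 'hello'))
    task2 = asyncio.create_task(say_after(1, 'world'))
    await task1
    await task2

asyncio.run(main())   # prints "hello" and "world" after about one second total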

Awaitables ¶

We say that an object is an awaitable object if it can be used in an await expression. Many asyncio APIs are designed to accept awaitables.

There are three main types of awaitable objects: coroutines , Tasks , and Futures .

Coroutines

Python coroutines are awaitables and therefore can be awaited from other coroutines:

import asyncio

async def nested():
    return 42

async def main():
    # Nothing happens if we just call "nested()".
    # A coroutine object is created but not awaited,
    # so it *won't run at all*.
    nested()

    # Let's do it differently now and await it:
    print(await nested())  # will print "42".

asyncio.run(main())

Important

In this documentation the term "coroutine" can be used for two closely related concepts:

- a coroutine function: an async def function;
- a coroutine object: the object returned by calling a coroutine function.

asyncio also supports legacy generator-based coroutines.
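A minimal sketch of such a legacy generator-based coroutine (my own example; this decorator style predates async/await, works on the Python versions contemporary with this documentation, and has since been deprecated):

import asyncio

@asyncio.coroutine
def legacy_hello():
    # A generator-based coroutine uses yield from instead of await.
    yield from asyncio.sleep(1)
    print('hello from a legacy generator-based coroutine')

loop = asyncio.get_event_loop()
loop.run_until_complete(legacy_hello())
loop.close()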

[Oct 14, 2019] Coroutine in Python - GeeksforGeeks

Oct 14, 2019 | www.geeksforgeeks.org

In Python 2.5, a slight modification to the yield statement was introduced: yield can now also be used as an expression, for example on the right side of an assignment

line = (yield)

Whatever value we send to the coroutine is captured and returned by the (yield) expression. A value can be sent to the coroutine with the send() method. For example, consider this coroutine, which prints out names having the prefix "Dear" in them. We will send names to the coroutine using the send() method.

# Python3 program for demonstrating
# coroutine execution

def print_name(prefix):
    print("Searching prefix:{}".format(prefix))
    while True:
        name = (yield)
        if prefix in name:
            print(name)

# calling the coroutine; nothing happens yet
corou = print_name("Dear")

# This starts execution of the coroutine:
# it prints the first line "Searching prefix..."
# and advances execution to the first yield expression
corou.__next__()

# sending inputs
corou.send("Atul")
corou.send("Dear Atul")

Output:

Searching prefix:Dear
Dear Atul

Execution of Coroutine

Execution of a coroutine is similar to that of a generator. When we call a coroutine, nothing happens; it runs only in response to the next() and send() methods. This can be seen clearly in the above example: only after calling the __next__() method does our coroutine start executing. After this call, execution advances to the first yield expression; execution then pauses and waits for a value to be sent to the corou object. When the first value is sent to it, it checks the prefix and prints the name if the prefix is present. After printing the name it goes through the loop until it encounters the name = (yield) expression again.

Closing a Coroutine

A coroutine might run indefinitely; the close() method is used to close it. When a coroutine is closed, a GeneratorExit exception is raised inside it, which can be caught in the usual way. After closing a coroutine, if we try to send values to it, it raises StopIteration. Following is a simple example:

# Python3 program for demonstrating
# closing a coroutine

def print_name(prefix):
    print("Searching prefix:{}".format(prefix))
    try:
        while True:
            name = (yield)
            if prefix in name:
                print(name)
    except GeneratorExit:
        print("Closing coroutine!!")

corou = print_name("Dear")
corou.__next__()
corou.send("Atul")
corou.send("Dear Atul")
corou.close()

Output:

Searching prefix:Dear
Dear Atul
Closing coroutine!!

Chaining coroutines for creating pipeline

Coroutines can be used to set up pipelines. We can chain coroutines together and push data through the pipe using the send() method. A pipe needs:

- an initial source (producer) that drives the whole pipeline, and
- a sink, which is the end point of the pipe, where data is collected or displayed.

Following is a simple example of chaining:

# Python3 program for demonstrating
# coroutine chaining

def producer(sentence, next_coroutine):
    '''
    Producer which just split strings and
    feed it to pattern_filter coroutine
    '''
    tokens = sentence.split(" ")
    for token in tokens:
        next_coroutine.send(token)
    next_coroutine.close()

def pattern_filter(pattern="ing", next_coroutine=None):
    '''
    Search for pattern in received token
    and if pattern got matched, send it to
    print_token() coroutine for printing
    '''
    print("Searching for {}".format(pattern))
    try:
        while True:
            token = (yield)
            if pattern in token:
                next_coroutine.send(token)
    except GeneratorExit:
        print("Done with filtering!!")

def print_token():
    '''
    Act as a sink, simply print the
    received tokens
    '''
    print("I'm sink, i'll print tokens")
    try:
        while True:
            token = (yield)
            print(token)
    except GeneratorExit:
        print("Done with printing!")

pt = print_token()
pt.__next__()
pf = pattern_filter(next_coroutine=pt)
pf.__next__()

sentence = "Bob is running behind a fast moving car"
producer(sentence, pf)

Output:

I'm sink, i'll print tokens
Searching for ing
running
moving
Done with filtering!!
Done with printing!


[Oct 13, 2019] Python generators and coroutines

The new syntax is 'yield from' ( PEP 380 ) and it allows true coroutines in Python >3.3
Nov 16, 2017 | stackoverflow.com


Giuseppe Maggiore ,May 10, 2011 at 10:25

I am studying coroutines and generators in various programming languages.

I was wondering if there is a cleaner way to combine together two coroutines implemented via generators than yielding back at the caller whatever the callee yields?

Let's say that we are using the following convention: all yields apart from the last one return null, while the last one returns the result of the coroutine. So, for example, we could have a coroutine that invokes another:

def A():
  # yield until a certain condition is met
  yield result

def B():
  # do something that may or may not yield
  x = bind(A())
  # ...
  return result

in this case I wish that through bind (which may or may not be implementable, that's the question) the coroutine B yields whenever A yields until A returns its final result, which is then assigned to x allowing B to continue.

I suspect that the actual code should explicitly iterate A so:

def B():
  # do something that may or may not yield
  for x in A(): ()
  # ...
  return result

which is a tad ugly and error prone...

PS: it's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better!

S.Lott ,May 10, 2011 at 10:38

This is all covered on the Python web site. python.org/dev/peps/pep-0342 , python.org/dev/peps/pep-0334 and numerous blogs cover this. eecho.info/Echo/python/coroutines-python . Please Google, read, and then ask specific questions based on what you've read. – S.Lott May 10 '11 at 10:38

S.Lott ,May 10, 2011 at 13:04

I thought the examples clearly demonstrated idiomatic. Since I'm unable to understand what's wrong with the examples, could you state which examples you found to be unclear? Which examples were confusing? Can you be more specific on how all those examples where not able to show idiomatic Python? – S.Lott May 10 '11 at 13:04

Giuseppe Maggiore ,May 10, 2011 at 13:09

I've read precisely those articles, and the PEP-342 leaves me somewhat confused: is it some actual extension that is currently working in Python? Is the Trampoline class shown there part of the standard libraries of the language? BTW, my question was very precise, and it was about the IDIOMATIC way to pass control around coroutines. The fact that I can read about a ton of ways to do so really does not help. Neither does your snarkiness... – Giuseppe Maggiore May 10 '11 at 13:09

Giuseppe Maggiore ,May 10, 2011 at 13:11

Idiomatic is about the "standard" way to perform some function; there is absolutely nothing wrong with iterating the results of a nested coroutine, but there are examples in the literature of programming languages where yielding automatically climbs down the call stack and so you do not need to re-yield at each caller, hence my curiosity if this pattern is covered by sintactic sugar in Python or not! – Giuseppe Maggiore May 10 '11 at 13:11

S.Lott ,May 10, 2011 at 13:19

@Giuseppe Maggiore: "programming languages where yielding automatically climbs down the call stack" That doesn't sound like the same question. Are you asking for idiomatic Python -- as shown by numerous examples -- or are you asking for some other feature that's not shown in the Python examples but is shown in other languages? I'm afraid that I can't understand your question at all. Can you please clarify what you're really looking for? – S.Lott May 10 '11 at 13:19

blubb ,May 10, 2011 at 10:37

Are you looking for something like this?
def B():
   for x in A():
     if x is None:
       yield
     else:
       break

   # continue, x contains value A yielded

Giuseppe Maggiore ,May 10, 2011 at 12:59

Yes, that is what I am doing. My question is if this is the idiomatic way or if there is some syntactic construct that is capable of hiding this pattern which recurs very often in my application. – Giuseppe Maggiore May 10 '11 at 12:59

blubb ,May 10, 2011 at 13:31

@Guiseppe Maggiore: I'm not aware of any such constructs. However, it seems strange that you need this pattern often... I can't think of many valid used cases off the top of my head. If you give more context information, maybe we can propose an alternative solution which is more elegant over all? – blubb May 10 '11 at 13:31

Giuseppe Maggiore ,May 10, 2011 at 15:17

It's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better! – Giuseppe Maggiore May 10 '11 at 15:17

blubb ,May 10, 2011 at 15:57

@Guiseppe Maggiore: I'd propose you add that last comment to the question so that other get a chance of answering it, too... – blubb May 10 '11 at 15:57

Simon Radford ,Nov 11, 2011 at 0:24

Edit: I recommend using Greenlet . But if you're interested in a pure Python approach, read on.

This is addressed in PEP 342 , but it's somewhat tough to understand at first. I'll try to explain simply how it works.

First, let me sum up what I think is the problem you're really trying to solve.

Problem

You have a callstack of generator functions calling other generator functions. What you really want is to be able to yield from the generator at the top, and have the yield propagate all the way down the stack.

The problem is that Python does not ( at a language level ) support real coroutines, only generators. (But, they can be implemented.) Real coroutines allow you to halt an entire stack of function calls and switch to a different stack. Generators only allow you to halt a single function. If a generator f() wants to yield, the yield statement has to be in f(), not in another function that f() calls.

The solution that I think you're using now, is to do something like in Simon Stelling's answer (i.e. have f() call g() by yielding all of g()'s results). This is very verbose and ugly, and you're looking for syntax sugar to wrap up that pattern. Note that this essentially unwinds the stack every time you yield, and then winds it back up again afterwards.

Solution

There is a better way to solve this problem. You basically implement coroutines by running your generators on top of a "trampoline" system.

To make this work, you need to follow a couple patterns: 1. When you want to call another coroutine, yield it. 2. Instead of returning a value, yield it.

so

def f():
    result = g()
    #  
    return return_value

becomes

def f():
    result = yield g()
    #  
    yield return_value

Say you're in f(). The trampoline system called f(). When you yield a generator (say g()), the trampoline system calls g() on your behalf. Then when g() has finished yielding values, the trampoline system restarts f(). This means that you're not actually using the Python stack; the trampoline system manages a callstack instead.

When you yield something other than a generator, the trampoline system treats it as a return value. It passes that value back to the caller generator through the yield statement (using .send() method of generators).
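A minimal sketch of such a trampoline (my own illustration of the yield-to-call / yield-to-return convention described above; it is not the Trampoline class from PEP 342):

def trampoline(start):
    # Run generator-based coroutines: a yielded generator is a "call",
    # a yielded plain value is a "return" to the caller on the stack.
    stack = [start]
    retval = None
    while stack:
        try:
            result = stack[-1].send(retval)
        except StopIteration:
            stack.pop()
            retval = None
            continue
        if hasattr(result, 'send'):   # a generator was yielded: push it as a sub-call
            stack.append(result)
            retval = None
        else:                         # a plain value was yielded: pop and pass it up
            stack.pop()
            retval = result
    return retval

def add_one(x):
    yield x + 1                       # "return" x + 1

def main():
    value = yield add_one(41)         # "call" add_one and wait for its result
    yield value                       # "return" the final result

print(trampoline(main()))             # prints 42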

Comments

This kind of system is extremely important and useful in asynchronous applications, like those using Tornado or Twisted. You can halt an entire callstack when it's blocked, go do something else, and then come back and continue execution of the first callstack where it left off.

The drawback of the above solution is that it requires you to write essentially all your functions as generators. It may be better to use an implementation of true coroutines for Python - see below.

Alternatives

There are several implementations of coroutines for Python, see: http://en.wikipedia.org/wiki/Coroutine#Implementations_for_Python

Greenlet is an excellent choice. It is a Python module that modifies the CPython interpreter to allow true coroutines by swapping out the callstack.

Python 3.3 should provide syntax for delegating to a subgenerator, see PEP 380 .

gaborous ,Nov 9, 2012 at 10:04

Very useful and clear answer, thank's! However, when you say that standard Python coroutines essentially require to write all functions as generators, did you mean only first level functions or really all functions? As you said above, when yielding something other than a generator, the trampoline system still works, so theoretically we can just yield at the first-layer functions any other functions that may or may not be generators themselves. Am I right? – gaborous Nov 9 '12 at 10:04

Simon Radford ,Nov 21, 2012 at 21:37

All "functions" between the trampoline system and a yield must be written as generators. You can call regular functions normally, but then you can't effectively "yield" from that function or any functions it calls. Does that make sense / answer your question? – Simon Radford Nov 21 '12 at 21:37

Simon Radford ,Nov 21, 2012 at 21:39

I highly recommend using Greenlet - it's a true implementation of coroutines for Python, and you don't have to use any of these patterns I've described. The trampoline stuff is for people who are interested in how you can do it in pure Python. – Simon Radford Nov 21 '12 at 21:39

Nick Sweeting ,Jun 7, 2015 at 22:12

To anyone reading this in 2015 or later, the new syntax is 'yield from' ( PEP 380 ) and it allows true coroutines in Python >3.3 . – Nick Sweeting Jun 7 '15 at 22:12

[Oct 13, 2019] Effective Python Item 40 Consider Coroutines to Run Many Functions Concurrently

Nov 16, 2017 | www.informit.com

Threads give Python programmers a way to run multiple functions seemingly at the same time (see Item 37: "Use Threads for Blocking I/O, Avoid for Parallelism"). But there are three big problems with threads:

- They require special tools (locks, queues, etc.) to coordinate safely, which makes threaded code harder to reason about and to extend.
- Each thread needs a lot of memory, on the order of 8 MB of stack per executing thread, which doesn't scale to thousands of concurrent activities.
- Threads are costly to start, so constantly creating new ones slows a program down.

Python can work around all these issues with coroutines . Coroutines let you have many seemingly simultaneous functions in your Python programs. They're implemented as an extension to generators. The cost of starting a generator coroutine is a function call. Once active, they each use less than 1 KB of memory until they're exhausted.

Coroutines work by enabling the code consuming a generator to send a value back into the generator function after each yield expression. The generator function receives the value passed to the send function as the result of the corresponding yield expression.

def my_coroutine():
    while True:
        received = yield
        print('Received:', received)

it = my_coroutine()
next(it)             # Prime the coroutine
it.send('First')
it.send('Second')

>>>
Received: First
Received: Second

The initial call to next is required to prepare the generator for receiving the first send by advancing it to the first yield expression. Together, yield and send provide generators with a standard way to vary their next yielded value in response to external input.

For example, say you want to implement a generator coroutine that yields the minimum value it's been sent so far. Here, the bare yield prepares the coroutine with the initial minimum value sent in from the outside. Then the generator repeatedly yields the new minimum in exchange for the next value to consider.

def minimize():
    current = yield
    while True:
        value = yield current
        current = min(value, current)

The code consuming the generator can run one step at a time and will output the minimum value seen after each input.

it = minimize()
next(it)            # Prime the generator
print(it.send(10))
print(it.send(4))
print(it.send(22))
print(it.send(-1))

>>>
10
4
4
-1

The generator function will seemingly run forever, making forward progress with each new call to send . Like threads, coroutines are independent functions that can consume inputs from their environment and produce resulting outputs. The difference is that coroutines pause at each yield expression in the generator function and resume after each call to send from the outside. This is the magical mechanism of coroutines.

This behavior allows the code consuming the generator to take action after each yield expression in the coroutine. The consuming code can use the generator's output values to call other functions and update data structures. Most importantly, it can advance other generator functions until their next yield expressions. By advancing many separate generators in lockstep, they will all seem to be running simultaneously, mimicking the concurrent behavior of Python threads.

The Game of Life

Let me demonstrate the simultaneous behavior of coroutines with an example. Say you want to use coroutines to implement Conway's Game of Life. The rules of the game are simple. You have a two-dimensional grid of an arbitrary size. Each cell in the grid can either be alive or empty.

ALIVE = '*'
EMPTY = '-'

The game progresses one tick of the clock at a time. At each tick, each cell counts how many of its neighboring eight cells are still alive. Based on its neighbor count, each cell decides if it will keep living, die, or regenerate. Here's an example of a 5×5 Game of Life grid after four generations with time going to the right. I'll explain the specific rules further below.

  0   |   1   |   2   |   3   |   4
----- | ----- | ----- | ----- | -----
-*--- | --*-- | --**- | --*-- | -----
--**- | --**- | -*--- | -*--- | -**--
---*- | --**- | --**- | --*-- | -----
----- | ----- | ----- | ----- | -----

I can model this game by representing each cell as a generator coroutine running in lockstep with all the others.

To implement this, first I need a way to retrieve the status of neighboring cells. I can do this with a coroutine named count_neighbors that works by yielding Query objects. The Query class I define myself. Its purpose is to provide the generator coroutine with a way to ask its surrounding environment for information.

from collections import namedtuple

Query = namedtuple('Query', ('y', 'x'))

The coroutine yields a Query for each neighbor. The result of each yield expression will be the value ALIVE or EMPTY . That's the interface contract I've defined between the coroutine and its consuming code. The count_neighbors generator sees the neighbors' states and returns the count of living neighbors.

def count_neighbors(y, x):
    n_ = yield Query(y + 1, x + 0)  # North
    ne = yield Query(y + 1, x + 1)  # Northeast
    # Define e_, se, s_, sw, w_, nw ...
    # ...
    neighbor_states = [n_, ne, e_, se, s_, sw, w_, nw]
    count = 0
    for state in neighbor_states:
        if state == ALIVE:
            count += 1
    return count

I can drive the count_neighbors coroutine with fake data to test it. Here, I show how Query objects will be yielded for each neighbor. count_neighbors expects to receive cell states corresponding to each Query through the coroutine's send method. The final count is returned in the StopIteration exception that is raised when the generator is exhausted by the return statement.

it = count_neighbors(10, 5)
q1 = next(it)                  # Get the first query
print('First yield: ', q1)
q2 = it.send(ALIVE)            # Send q1 state, get q2
print('Second yield:', q2)
q3 = it.send(ALIVE)            # Send q2 state, get q3
# ...
try:
    count = it.send(EMPTY)     # Send q8 state, retrieve count
except StopIteration as e:
    print('Count: ', e.value)  # Value from return statement
>>>
First yield:  Query(y=11, x=5)
Second yield: Query(y=11, x=6)
...
Count:  2

Now I need the ability to indicate that a cell will transition to a new state in response to the neighbor count that it found from count_neighbors . To do this, I define another coroutine called step_cell . This generator will indicate transitions in a cell's state by yielding Transition objects. This is another class that I define, just like the Query class.

Transition = namedtuple('Transition', ('y', 'x', 'state'))

The step_cell coroutine receives its coordinates in the grid as arguments. It yields a Query to get the initial state of those coordinates. It runs count_neighbors to inspect the cells around it. It runs the game logic to determine what state the cell should have for the next clock tick. Finally, it yields a Transition object to tell the environment the cell's next state.

def game_logic(state, neighbors):
    # ...

def step_cell(y, x):
    state = yield Query(y, x)
    neighbors = yield from count_neighbors(y, x)
    next_state = game_logic(state, neighbors)
    yield Transition(y, x, next_state)

Importantly, the call to count_neighbors uses the yield from expression. This expression allows Python to compose generator coroutines together, making it easy to reuse smaller pieces of functionality and build complex coroutines from simpler ones. When count_neighbors is exhausted, the final value it returns (with the return statement) will be passed to step_cell as the result of the yield from expression.

Now, I can finally define the simple game logic for Conway's Game of Life. There are only three rules.

def game_logic(state, neighbors):
    if state == ALIVE:
        if neighbors < 2:
            return EMPTY     # Die: Too few
        elif neighbors > 3:
            return EMPTY     # Die: Too many
    else:
        if neighbors == 3:
            return ALIVE     # Regenerate
    return state

I can drive the step_cell coroutine with fake data to test it.

it = step_cell(10, 5)
q0 = next(it)           # Initial location query
print('Me:      ', q0)
q1 = it.send(ALIVE)     # Send my status, get neighbor query
print('Q1:      ', q1)
# ...
t1 = it.send(EMPTY)     # Send for q8, get game decision
print('Outcome: ', t1)

>>>
Me:       Query(y=10, x=5)
Q1:       Query(y=11, x=5)
...
Outcome:  Transition(y=10, x=5, state='-')

The goal of the game is to run this logic for a whole grid of cells in lockstep. To do this, I can further compose the step_cell coroutine into a simulate coroutine. This coroutine progresses the grid of cells forward by yielding from step_cell many times. After progressing every coordinate, it yields a TICK object to indicate that the current generation of cells have all transitioned.

TICK = object()

def simulate(height, width):
    while True:
        for y in range(height):
            for x in range(width):
                yield from step_cell(y, x)
        yield TICK

What's impressive about simulate is that it's completely disconnected from the surrounding environment. I still haven't defined how the grid is represented in Python objects, how Query , Transition , and TICK values are handled on the outside, nor how the game gets its initial state. But the logic is clear. Each cell will transition by running step_cell . Then the game clock will tick. This will continue forever, as long as the simulate coroutine is advanced.

This is the beauty of coroutines. They help you focus on the logic of what you're trying to accomplish. They decouple your code's instructions for the environment from the implementation that carries out your wishes. This enables you to run coroutines seemingly in parallel. This also allows you to improve the implementation of following those instructions over time without changing the coroutines.

Now, I want to run simulate in a real environment. To do that, I need to represent the state of each cell in the grid. Here, I define a class to contain the grid:

class Grid(object):
    def __init__(self, height, width):
        self.height = height
        self.width = width
        self.rows = []
        for _ in range(self.height):
            self.rows.append([EMPTY] * self.width)

    def __str__(self):
        # ...

The grid allows you to get and set the value of any coordinate. Coordinates that are out of bounds will wrap around, making the grid act like infinite looping space.

    def query(self, y, x):
        return self.rows[y % self.height][x % self.width]

    def assign(self, y, x, state):
        self.rows[y % self.height][x % self.width] = state

At last, I can define the function that interprets the values yielded from simulate and all of its interior coroutines. This function turns the instructions from the coroutines into interactions with the surrounding environment. It progresses the whole grid of cells forward a single step and then returns a new grid containing the next state.

def live_a_generation(grid, sim):
    progeny = Grid(grid.height, grid.width)
    item = next(sim)
    while item is not TICK:
        if isinstance(item, Query):
            state = grid.query(item.y, item.x)
            item = sim.send(state)
        else:  # Must be a Transition
            progeny.assign(item.y, item.x, item.state)
            item = next(sim)
    return progeny

To see this function in action, I need to create a grid and set its initial state. Here, I make a classic shape called a glider.

grid = Grid(5, 9)
grid.assign(0, 3, ALIVE)
# ...
print(grid)

>>>
---*-----
----*----
--***----
---------
---------

Now I can progress this grid forward one generation at a time. You can see how the glider moves down and to the right on the grid based on the simple rules from the game_logic function.

class ColumnPrinter(object):
    # ...

columns = ColumnPrinter()
sim = simulate(grid.height, grid.width)
for i in range(5):
    columns.append(str(grid))
    grid = live_a_generation(grid, sim)

print(columns)

>>>
    0     |     1     |     2     |     3     |     4
---*----- | --------- | --------- | --------- | ---------
----*---- | --*-*---- | ----*---- | ---*----- | ----*----
--***---- | ---**---- | --*-*---- | ----**--- | -----*---
--------- | ---*----- | ---**---- | ---**---- | ---***---
--------- | --------- | --------- | --------- | ---------

The best part about this approach is that I can change the game_logic function without having to update the code that surrounds it. I can change the rules or add larger spheres of influence with the existing mechanics of Query , Transition , and TICK . This demonstrates how coroutines enable the separation of concerns, which is an important design principle.

Coroutines in Python 2

Unfortunately, Python 2 is missing some of the syntactical sugar that makes coroutines so elegant in Python 3. There are two limitations. First, there is no yield from expression. That means that when you want to compose generator coroutines in Python 2, you need to include an additional loop at the delegation point.

# Python 2
def delegated():
    yield 1
    yield 2

def composed():
    yield 'A'
    for value in delegated():  # yield from in Python 3
        yield value
    yield 'B'

print list(composed())

>>>
['A', 1, 2, 'B']

The second limitation is that there is no support for the return statement in Python 2 generators. To get the same behavior that interacts correctly with try / except / finally blocks, you need to define your own exception type and raise it when you want to return a value.

# Python 2
class MyReturn(Exception):
    def __init__(self, value):
        self.value = value

def delegated():
    yield 1
    raise MyReturn(2)  # return 2 in Python 3
    yield 'Not reached'

def composed():
    try:
        for value in delegated():
            yield value
    except MyReturn as e:
        output = e.value
    yield output * 4

print list(composed())

>>>
[1, 8]
Things to Remember

- Coroutines provide an efficient way to run tens of thousands of functions seemingly at the same time.
- Within a generator, the value of the yield expression will be whatever value was passed to the generator's send method from the exterior code.
- Coroutines give you a powerful tool for separating your program's core logic from its interaction with the surrounding environment.
- Python 2 doesn't support yield from or returning values from generators, so the same effect requires extra boilerplate (a delegation loop and a custom exception).

[Nov 16, 2017] Python coroutines

Nov 16, 2017 | docs.python.org

Coroutines used with asyncio may be implemented using the async def statement, or by using generators . The async def type of coroutine was added in Python 3.5, and is recommended if there is no need to support older Python versions.

Generator-based coroutines should be decorated with @asyncio.coroutine , although this is not strictly enforced. The decorator enables compatibility with async def coroutines, and also serves as documentation. Generator-based coroutines use the yield from syntax introduced in PEP 380 , instead of the original yield syntax.

The word "coroutine", like the word "generator", is used for two different (though related) concepts:

Things a coroutine can do:

- result = await future or result = yield from future – suspends the coroutine until the future is done, then returns the future's result (or raises its exception).
- result = await coroutine or result = yield from coroutine – waits for another coroutine to produce a result (or raise an exception).
- return expression – produces a result to the coroutine that is waiting for this one using await or yield from.
- raise exception – raises an exception in the coroutine that is waiting for this one using await or yield from.

Calling a coroutine does not start its code running – the coroutine object returned by the call doesn't do anything until you schedule its execution. There are two basic ways to start it running: call await coroutine or yield from coroutine from another coroutine (assuming the other coroutine is already running!), or schedule its execution using the ensure_future() function or the AbstractEventLoop.create_task() method.

Coroutines (and tasks) can only run when the event loop is running.

@asyncio.coroutine
Decorator to mark generator-based coroutines. This enables the generator to use yield from to call async def coroutines, and also enables the generator to be called by async def coroutines, for instance using an await expression.

There is no need to decorate async def coroutines themselves.

If the generator is not yielded from before it is destroyed, an error message is logged. See Detect coroutines never scheduled .

Note

In this documentation, some methods are documented as coroutines, even if they are plain Python functions returning a Future. This is intentional, to preserve the freedom to tweak the implementation of these functions in the future. If such a function needs to be used in callback-style code, wrap its result with ensure_future().

18.5.3.1.1. Example: Hello World coroutine

Example of coroutine displaying "Hello World" :

import asyncio

async def hello_world():
    print("Hello World!")

loop = asyncio.get_event_loop()
# Blocking call which returns when the hello_world() coroutine is done
loop.run_until_complete(hello_world())
loop.close()

See also

The Hello World with call_soon() example uses the AbstractEventLoop.call_soon() method to schedule a callback.

18.5.3.1.2. Example: Coroutine displaying the current date

Example of coroutine displaying the current date every second during 5 seconds using the sleep() function:

import asyncio
import datetime

async def display_date(loop):
    end_time = loop.time() + 5.0
    while True:
        print(datetime.datetime.now())
        if (loop.time() + 1.0) >= end_time:
            break
        await asyncio.sleep(1)

loop = asyncio.get_event_loop()
# Blocking call which returns when the display_date() coroutine is done
loop.run_until_complete(display_date(loop))
loop.close()

See also

The display the current date with call_later() example uses a callback with the AbstractEventLoop.call_later() method.

18.5.3.1.3. Example: Chain coroutines

Example chaining coroutines:

import asyncio

async def compute(x, y):
    print("Compute %s + %s ..." % (x, y))
    await asyncio.sleep(1.0)
    return x + y

async def print_sum(x, y):
    result = await compute(x, y)
    print("%s + %s = %s" % (x, y, result))

loop = asyncio.get_event_loop()
loop.run_until_complete(print_sum(1, 2))
loop.close()

compute() is chained to print_sum() : print_sum() coroutine waits until compute() is completed before returning its result.

Sequence diagram of the example:

../_images/tulip_coro.png

The "Task" is created by the AbstractEventLoop.run_until_complete() method when it gets a coroutine object instead of a task.

The diagram shows the control flow; it does not describe exactly how things work internally. For example, the sleep coroutine creates an internal future which uses AbstractEventLoop.call_later() to wake up the task in 1 second.

18.5.3.2. InvalidStateError

exception asyncio.InvalidStateError
The operation is not allowed in this state.

18.5.3.3. TimeoutError

exception asyncio.TimeoutError
The operation exceeded the given deadline.

Note

This exception is different from the builtin TimeoutError exception!

18.5.3.4. Future

class asyncio.Future(*, loop=None)
This class is almost compatible with concurrent.futures.Future .

Differences:

This class is not thread safe .

cancel ()
Cancel the future and schedule callbacks.

If the future is already done or cancelled, return False . Otherwise, change the future's state to cancelled, schedule the callbacks and return True .

cancelled ()
Return True if the future was cancelled.
done ()
Return True if the future is done.

Done means either that a result / exception are available, or that the future was cancelled.

result ()
Return the result this future represents.

If the future has been cancelled, raises CancelledError . If the future's result isn't yet available, raises InvalidStateError . If the future is done and has an exception set, this exception is raised.

exception ()
Return the exception that was set on this future.

The exception (or None if no exception was set) is returned only if the future is done. If the future has been cancelled, raises CancelledError . If the future isn't done yet, raises InvalidStateError .

add_done_callback(fn)
Add a callback to be run when the future becomes done.

The callback is called with a single argument - the future object. If the future is already done when this is called, the callback is scheduled with call_soon() .

Use functools.partial to pass parameters to the callback . For example, fut.add_done_callback(functools.partial(print, "Future:", flush=True)) will call print("Future:", fut, flush=True) .

remove_done_callback(fn)
Remove all instances of a callback from the "call when done" list.

Returns the number of callbacks removed.

set_result(result)
Mark the future done and set its result.

If the future is already done when this method is called, raises InvalidStateError .

set_exception(exception)
Mark the future done and set an exception.

If the future is already done when this method is called, raises InvalidStateError .

18.5.3.4.1. Example: Future with run_until_complete()

Example combining a Future and a coroutine function :

import asyncio

async def slow_operation(future):
    await asyncio.sleep(1)
    future.set_result('Future is done!')

loop = asyncio.get_event_loop()
future = asyncio.Future()
asyncio.ensure_future(slow_operation(future))
loop.run_until_complete(future)
print(future.result())
loop.close()

The coroutine function is responsible for the computation (which takes 1 second) and it stores the result into the future. The run_until_complete() method waits for the completion of the future.

Note

The run_until_complete() method internally uses the add_done_callback() method to be notified when the future is done.

18.5.3.4.2. Example: Future with run_forever()

The previous example can be written differently using the Future.add_done_callback() method to describe explicitly the control flow:

import asyncio

async def slow_operation(future):
    await asyncio.sleep(1)
    future.set_result('Future is done!')

def got_result(future):
    print(future.result())
    loop.stop()

loop = asyncio.get_event_loop()
future = asyncio.Future()
asyncio.ensure_future(slow_operation(future))
future.add_done_callback(got_result)
try:
    loop.run_forever()
finally:
    loop.close()

In this example, the future is used to link slow_operation() to got_result(): when slow_operation() is done, got_result() is called with the result.

18.5.3.5. Task

class asyncio.Task(coro, *, loop=None)
Schedule the execution of a coroutine : wrap it in a future. A task is a subclass of Future .

A task is responsible for executing a coroutine object in an event loop. If the wrapped coroutine yields from a future, the task suspends the execution of the wrapped coroutine and waits for the completion of the future. When the future is done, the execution of the wrapped coroutine restarts with the result or the exception of the future.

Event loops use cooperative scheduling: an event loop only runs one task at a time. Other tasks may run in parallel if other event loops are running in different threads. While a task waits for the completion of a future, the event loop executes a new task.

The cancellation of a task is different from the cancellation of a future. Calling cancel() will throw a CancelledError to the wrapped coroutine. cancelled() only returns True if the wrapped coroutine did not catch the CancelledError exception, or raised a CancelledError exception.
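A minimal sketch (my own, using the loop API of this documentation's asyncio version) showing this in practice:

import asyncio

async def long_running():
    try:
        await asyncio.sleep(3600)
    except asyncio.CancelledError:
        print('CancelledError reached the wrapped coroutine')
        raise                      # re-raise, so the task really ends up cancelled

async def main(loop):
    task = loop.create_task(long_running())
    await asyncio.sleep(0.1)       # let the task start and suspend in sleep()
    task.cancel()                  # request cancellation
    try:
        await task
    except asyncio.CancelledError:
        pass
    print('task.cancelled() ->', task.cancelled())   # True: CancelledError propagated

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))
loop.close()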

If a pending task is destroyed, the execution of its wrapped coroutine did not complete. It is probably a bug and a warning is logged: see Pending task destroyed .

Don't directly create Task instances: use the ensure_future() function or the AbstractEventLoop.create_task() method.

This class is not thread safe .

classmethod all_tasks(loop=None)
Return a set of all tasks for an event loop.

By default all tasks for the current event loop are returned.

classmethod current_task(loop=None)
Return the currently running task in an event loop or None .

By default the current task for the current event loop is returned.

None is returned when called not in the context of a Task .

cancel ()
Request that this task cancel itself.

This arranges for a CancelledError to be thrown into the wrapped coroutine on the next cycle through the event loop. The coroutine then has a chance to clean up or even deny the request using try/except/finally.

Unlike Future.cancel() , this does not guarantee that the task will be cancelled: the exception might be caught and acted upon, delaying cancellation of the task or preventing cancellation completely. The task may also return a value or raise a different exception.

Immediately after this method is called, cancelled() will not return True (unless the task was already cancelled). A task will be marked as cancelled when the wrapped coroutine terminates with a CancelledError exception (even if cancel() was not called).

get_stack(*, limit=None)
Return the list of stack frames for this task's coroutine.

If the coroutine is not done, this returns the stack where it is suspended. If the coroutine has completed successfully or was cancelled, this returns an empty list. If the coroutine was terminated by an exception, this returns the list of traceback frames.

The frames are always ordered from oldest to newest.

The optional limit gives the maximum number of frames to return; by default all available frames are returned. Its meaning differs depending on whether a stack or a traceback is returned: the newest frames of a stack are returned, but the oldest frames of a traceback are returned. (This matches the behavior of the traceback module.)

For reasons beyond our control, only one stack frame is returned for a suspended coroutine.

print_stack(*, limit=None, file=None)
Print the stack or traceback for this task's coroutine.

This produces output similar to that of the traceback module, for the frames retrieved by get_stack(). The limit argument is passed to get_stack(). The file argument is an I/O stream to which the output is written; by default output is written to sys.stderr.

18.5.3.5.1. Example: Parallel execution of tasks

Example executing 3 tasks (A, B, C) in parallel:

import asyncio

async def factorial(name, number):
    f = 1
    for i in range(2, number+1):
        print("Task %s: Compute factorial(%s)..." % (name, i))
        await asyncio.sleep(1)
        f *= i
    print("Task %s: factorial(%s) = %s" % (name, number, f))

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.gather(
    factorial("A", 2),
    factorial("B", 3),
    factorial("C", 4),
))
loop.close()

Output:

Task A: Compute factorial(2)...
Task B: Compute factorial(2)...
Task C: Compute factorial(2)...
Task A: factorial(2) = 2
Task B: Compute factorial(3)...
Task C: Compute factorial(3)...
Task B: factorial(3) = 6
Task C: Compute factorial(4)...
Task C: factorial(4) = 24

A task is automatically scheduled for execution when it is created. The event loop stops when all tasks are done.

18.5.3.6. Task functions

Note

In the functions below, the optional loop argument allows explicitly setting the event loop object used by the underlying task or coroutine. If it's not provided, the default event loop is used.

asyncio.as_completed(fs, *, loop=None, timeout=None)
Return an iterator whose values, when waited for, are Future instances.

Raises asyncio.TimeoutError if the timeout occurs before all Futures are done.

Example:

for f in as_completed(fs):
    result = yield from f  # The 'yield from' may raise
    # Use result

Note

The futures are not necessarily members of fs.

asyncio.ensure_future(coro_or_future, *, loop=None)
Schedule the execution of a coroutine object : wrap it in a future. Return a Task object.

If the argument is a Future , it is returned directly.

New in version 3.4.4. Changed in version 3.5.1: The function accepts any awaitable object.

See also

The AbstractEventLoop.create_task() method.

asyncio.async(coro_or_future, *, loop=None)
A deprecated alias to ensure_future() . Deprecated since version 3.4.4.
asyncio.wrap_future(future, *, loop=None)
Wrap a concurrent.futures.Future object in a Future object.
asyncio.gather(*coros_or_futures, loop=None, return_exceptions=False)
Return a future aggregating results from the given coroutine objects or futures.

All futures must share the same event loop. If all the tasks are done successfully, the returned future's result is the list of results (in the order of the original sequence, not necessarily the order of results arrival). If return_exceptions is true, exceptions in the tasks are treated the same as successful results, and gathered in the result list; otherwise, the first raised exception will be immediately propagated to the returned future.
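A short sketch (my own, using the pre-3.7 loop API shown elsewhere in this excerpt) of the return_exceptions behaviour:

import asyncio

async def ok():
    return 'ok'

async def boom():
    raise RuntimeError('boom')

loop = asyncio.get_event_loop()
results = loop.run_until_complete(
    asyncio.gather(ok(), boom(), return_exceptions=True))
print(results)   # the exception object is returned as a result instead of being raised
loop.close()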

Cancellation: if the outer Future is cancelled, all children (that have not completed yet) are also cancelled. If any child is cancelled, this is treated as if it raised CancelledError – the outer Future is not cancelled in this case. (This is to prevent the cancellation of one child to cause other children to be cancelled.)

asyncio.iscoroutine(obj)
Return True if obj is a coroutine object , which may be based on a generator or an async def coroutine.
asyncio.iscoroutinefunction(func)
Return True if func is determined to be a coroutine function , which may be a decorated generator function or an async def function.
asyncio.run_coroutine_threadsafe(coro, loop)
Submit a coroutine object to a given event loop.

Return a concurrent.futures.Future to access the result.

This function is meant to be called from a different thread than the one where the event loop is running. Usage:

# Create a coroutine
coro = asyncio.sleep(1, result=3)
# Submit the coroutine to a given loop
future = asyncio.run_coroutine_threadsafe(coro, loop)
# Wait for the result with an optional timeout argument
assert future.result(timeout) == 3

If an exception is raised in the coroutine, the returned future will be notified. It can also be used to cancel the task in the event loop:

try:
    result = future.result(timeout)
except asyncio.TimeoutError:
    print('The coroutine took too long, cancelling the task...')
    future.cancel()
except Exception as exc:
    print('The coroutine raised an exception: {!r}'.format(exc))
else:
    print('The coroutine returned: {!r}'.format(result))

See the concurrency and multithreading section of the documentation.

Note

Unlike other functions from the module, run_coroutine_threadsafe() requires the loop argument to be passed explicitly. New in version 3.5.1.

coroutine asyncio.sleep(delay, result=None, *, loop=None)
Create a coroutine that completes after a given time (in seconds). If result is provided, it is produced to the caller when the coroutine completes.

The resolution of the sleep depends on the granularity of the event loop .

This function is a coroutine .

asyncio.shield(arg, *, loop=None)
Wait for a future, shielding it from cancellation.

The statement:

res = yield from shield(something())

is exactly equivalent to the statement:

res = yield from something()

except that if the coroutine containing it is cancelled, the task running in something() is not cancelled. From the point of view of something() , the cancellation did not happen. But its caller is still cancelled, so the yield-from expression still raises CancelledError . Note: If something() is cancelled by other means this will still cancel shield() .

If you want to completely ignore cancellation (not recommended) you can combine shield() with a try/except clause, as follows:

try:
    res = yield from shield(something())
except CancelledError:
    res = None
coroutine asyncio.wait(futures, *, loop=None, timeout=None, return_when=ALL_COMPLETED)
Wait for the Futures and coroutine objects given by the sequence futures to complete. Coroutines will be wrapped in Tasks. Returns two sets of Future : (done, pending).

The sequence futures must not be empty.

timeout can be used to control the maximum number of seconds to wait before returning. timeout can be an int or float. If timeout is not specified or None , there is no limit to the wait time.

return_when indicates when this function should return. It must be one of the following constants of the concurrent.futures module:

Constant          Description
FIRST_COMPLETED   The function will return when any future finishes or is cancelled.
FIRST_EXCEPTION   The function will return when any future finishes by raising an exception. If no future raises an exception then it is equivalent to ALL_COMPLETED.
ALL_COMPLETED     The function will return when all futures finish or are cancelled.

This function is a coroutine .

Usage:

done, pending = yield from asyncio.wait(fs)

Note

This does not raise asyncio.TimeoutError ! Futures that aren't done when the timeout occurs are returned in the second set.

coroutine asyncio.wait_for(fut, timeout, *, loop=None)
Wait for the single Future or coroutine object to complete with timeout. If timeout is None , block until the future completes.

Coroutine will be wrapped in Task .

Returns result of the Future or coroutine. When a timeout occurs, it cancels the task and raises asyncio.TimeoutError . To avoid the task cancellation, wrap it in shield() .

If the wait is cancelled, the future fut is also cancelled.

This function is a coroutine , usage:

result = yield from asyncio.wait_for(fut, 60.0)
Changed in version 3.4.3: If the wait is cancelled, the future fut is now also cancelled.

Recommended Links


Sites

Coroutines and Tasks - Python 3.7.5rc1 documentation

Coroutines In Python - Nightmare Software

A Curious Course on Coroutines and Concurrency - Dabeaz LLC

Coroutines in Python « Python recipes « ActiveState Code

PEP 342 -- Coroutines via Enhanced Generators

decorator - Coroutines in python - Stack Overflow

Coroutine - Wikipedia, the free encyclopedia

Python Coroutines - Through the Fog

Talk-Coroutine - Wikipedia, the free encyclopedia

Boduch's Blog- Python Coroutines and Counters

This Tutorial - Dabeaz LLC

CC Blog- Forking and Joining Python Coroutines to Collect

Charming Python- Implementing "weightless threads" with Python

CERT-CC Blog Forking and Joining Python Coroutines to Collect Coverage Data

Charming Python- Implementing "weightless threads" with Python

Generating Infinite Sequences with Python Generators and

CHARMING PYTHON #B5- -- Generator-based State Machines …

Generators Are Not Coroutines - Cunningham & Cunningham, Inc

Portable Stackless Coroutines in One Header « C++Next

Chapter 5- Sequences and Coroutines - Weiner Lecture Archives

might as well hack it | JSON parsing with Python coroutines

Concurrency - Python Programming Language – Official Website

lua-users wiki- Lua Coroutines Versus Python Generators


