
Nikolai Bezroukov. Portraits of Open Source Pioneers



Introduction to the Unix shell history

The UNIX system was one of the first operating systems that did not make the command interpreter a part of the operating system or a privileged task. Instead, the shell became an ordinary application program, somewhat similar to a compiler. This idea, like many other Unix ideas, was taken from Multics, which in turn was influenced by CTSS, developed under Fernando J. Corbató.

MIT Computation Center staff member Louis Pouzin created for CTSS a command called RUNCOM, which executed a list of commands contained in a file and allowed parameter substitution. He later produced a design for the Multics shell, which in turn inspired Unix shell scripts.

This approach to the shell as a separate program, pioneered by CTSS and Multics, made Unix a fertile ground for the development of scripting, and Unix essentially pioneered scripting as we know it: Unix was the environment in which AWK, C-shell, ksh and later Perl emerged and gained initial popularity, before they were ported to other OSes.

Another influence was JCL, the job control language developed in the early 1960s for the world-famous IBM System/360 series, which in the early 1970s was the dominant computer family.

I would like to stress again that the shell in Unix was a stand-alone program, a command interpreter with few special permissions and kernel-level calls. This rather novel concept proved to be fruitful and led to a succession of better and better shells.

Unix shells have a long history and we can talk about four distinct generations of Unix shells:

  1. First generation shells (Thompson shell and Mashey shell)
  2. Second generation shells (C-shell and Bourne shell, which were developed in parallel, and in which Bill Joy's team delivered a devastating blow to the developers from AT&T and wiped the floor with the Bourne shell ;-). In many ways C-shell represented a real breakthrough, as it was more programmable (even the name implies closeness to C) and more flexible than any of the first generation shells, as well as than Bourne shell.
  3. Third generation shells (tcsh and ksh88)
  4. Fourth generation shells (ksh93, bash, zsh and, especially, Microsoft PowerShell)

First generation shells

The first generation shells were descendants of the Multics shell. The first of them was the Thompson shell. It was the first Unix shell and as such was very primitive, with only basic control structures and no variables. The shell's design was intentionally minimalistic; even the if and goto statements, essential for control of program flow, were implemented as separate external commands [Thompson shell - Wikipedia, the free encyclopedia]. Despite its shortcomings, the Thompson shell was a definite improvement over the command language that most people used at the time (IBM JCL).

An important early feature of the Thompson shell, new in comparison with Multics, was a more compact syntax for input/output redirection. The Multics shell used separate commands for redirection of the input or output of a command: one command was needed to start redirection and one to stop it. In Unix, one could simply add to the command line an argument consisting of the "<" symbol followed by a filename for input, or the ">" symbol for output, and the shell would redirect I/O for the duration of the command. This syntax was already present by the release of the first version of Unix in 1971. A later addition to the Thompson shell was the concept of pipes. At the suggestion of Douglas McIlroy, the redirection syntax was expanded so that the output of one command could be passed to the input of another command; by Version 4, the "|" symbol was adopted for pipes. Both the redirection symbols and the pipe symbol, along with their semantics, survive to this day and were adopted by other operating systems, including DOS and Windows.
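
This syntax works unchanged in any modern Bourne-compatible shell; a minimal illustration (the file name is arbitrary):

ls /etc > /tmp/listing.txt     # '>' redirects the output of ls into a file
wc -l < /tmp/listing.txt       # '<' feeds that file to the input of wc
ls /etc | wc -l                # '|' connects the two directly, no temporary file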

The Mashey shell (also known as the PWB shell) was the second representative of the first generation of Unix shells. It was written and maintained by John Mashey (now a chief scientist at Silicon Graphics, who since 1998 has owned the California license plate UNIX, previously owned by Ted Dolotta ;-). It was distributed with Programmer's Workbench UNIX (PWB/UNIX), one of the early versions of Unix, which existed from 1975 to 1977.

As Mashey recollected in Languages, Levels, Libraries and Longevity:

In 1970, “real computers” were still mainframes, although minicomputers were seeing increasing use. The DEC (Digital Equipment Corporation) 16-bit PDP-11 was introduced in 1970, and of particular importance, the PDP-11/45 appeared in 1972, with up to 248 KB of MOS (metal-oxide semiconductor) memory. By 1975, the PDP-11/70 allowed a huge increase to 4 MB, although each program was still restricted to 64 KB instructions and 64 KB data. Some sites supported 16 simultaneous users on an 11/45, and with heroic effort, 48 on an 11/70. The VAX-11/780 was introduced in 1977 and spread lower-cost 32-bit computing more widely. By the end of the decade, minicomputers were “real computers,” and 32-bit microcomputers were beginning to appear.

In 1970, there was widespread use of applications languages such as Fortran, Cobol, and PL/I, but many applications’ and most systems’ codes were still written in assembly language, and the idea that an operating system would be portable among machines was laughable. In the end, Unix was ported to many systems, C was widely used, and applications were being written with various combinations of higher-level tools.

In 1973, the PWB (Programmer’s Workbench) began in a Bell Labs software tools department.4 It supported a 1,000-person division that produced database and communications application software products that ran on various mainframes and minicomputers. It wished to move many programming activities off expensive mainframes onto a common Unix-based development environment to avoid the creation of unique support software for each target system. Programming departments needed to be convinced to change their ways, that Unix was a good thing, that minicomputers were not toys, and that they should transfer budget to the tools department for more PDP-11s.

In 1973, most of the several dozen existing Unix systems were the property of individual departments, used by small numbers of people for their own projects and administered informally, sometimes with minimal security. The PWB site was the first in Bell Labs to run a “Unix computer center” for shared general use among departments, including typing pools. For years it was the largest single Unix site, and it often endured early encounters with problems of scalability, system administration, charging, security, automation, and usability for nontechnical users.

In 1973, Ken Thompson’s Unix shell was primarily used as an interactive interpreter, but had some rudimentary scripting ability, including separate IF and GOTO commands. In 1974, I used shell scripts to build a small document management package for a potential client department and found this to be a great way to build such software quickly, but with awkward restrictions. In 1975 and 1976 the PWB’s shell got simple variables, better control structures (IF-THEN-ELSE-ENDIF, SWITCH, WHILE), and interrupt-catching. The variables that later became $HOME (home directory) and $PATH (variable search path for commands) date from this effort.

Shell programming rapidly became a widespread mechanism for PWB users to help automate their work.5 Substantial CPU time was consumed by shell procedures, to the point where previously separate commands, such as IF, GOTO, and SWITCH, were moved into the shell itself with substantial performance improvements. Steve Bourne was then working on a brand-new shell in Computing Research and, after much discussion, evolved a fresh design whose performance and features were interesting enough to eventually replace the PWB shell. The variables got generalized into the “environment variables” designed for 7th Edition Unix.

Al Aho, Peter Weinberger, and Brian Kernighan had written awk, the philosophical ancestor of some popular current scripting languages. In the 1970s, Bell Labs was busily constructing computer systems to improve Bell System operations, and many were built on Unix and even used scripting languages in delivered software. CRAS (Cable Repair Administrative System) was a data-mining software package that integrated data from several other systems, was distributed between IBM mainframes and Unix minicomputers, had to be deployed quickly, and was sensitive to organization-dependent requirements in a time of major reorganization.6 The first version included 10 KLOC (thousands of lines of code) of C plus 15 KLOC of shell+awk scripts and was modified quickly in the field to adapt to newly revealed customer requirements. A large listing of these scripts appeared on Kernighan’s desk—to his great surprise, as the awk writers had never expected such extensive use in production.

Shell scripting used late-binding, high-level interpretation to combine higher-performance, compiled components. Awk gave us a more flexible language above C, although we sometimes later converted heavily used awk to C for performance, after requirements had settled. We sometimes wished for an awk compiler. Raising the language level enabled vast improvements in productivity in that decade, as C replaced assembly language, and script-level languages greatly augmented C.

See also David Korn's note on the subject (Article by David Korn).

While it was a derivative of the Thompson shell, the Mashey shell introduced several features that made the shell more suitable for programming. Among the important innovations introduced by the Mashey shell, as described in the recollection above, were simple shell variables (including the predecessors of $HOME and $PATH), better control structures (if-then-else-endif, switch, while), and interrupt catching.

The second generation of Unix shells

The next, second generation of shells was defined by C-shell and Bourne shell, which were developed in parallel. That parallelism actually shows the tremendous difference in talents between Bill Joy and Steve Bourne. While Bourne shell was hampered by compatibility requirements, Steve Bourne did a remarkably poor job in the area of lexical structure and syntax. It is fair to say that Bourne shell bears no relation whatsoever to his previous experience with Algol 68 compilers.

Any notion that a person who participated in Algol 68 compiler development would make the architectural decisions that were made in Bourne shell is a joke. In other words, as a product Bourne shell is a shame for any decent compiler writer; it is a completely amateurish product. That is very strange, as Steve Bourne ran a project in the Cambridge Computer Laboratory to implement a version of Algol 68 called ALGOL68C, and Algol 68 compiler designers were the elite of the compiler writing community. The adoption of Algol 68 keywords (and "symmetrical" ending words, as in if-then-fi) for if statements and loops is actually the only sign that Steve Bourne participated in the development of an Algol 68 compiler. Clearly he did not understand classic compiler technology well enough to use it in Bourne shell development (and David Gries' seminal book Compiler Construction for Digital Computers, which provides detailed coverage of the state of the art of compiler technology up to the late 1960s, was published in 1971)...

In a way we are still suffering from Steve Bourne's mistakes and blunders. It is actually amazing that work on AWK, a definitely superior design and architecture, was by and large a parallel development which never seriously influenced the design of Bourne shell. Later he tried to promote a revisionist version of history, and in his interview with Computerworld (Bourne shell, or sh - Computerworld) forgot to mention that the prototype for his shell was the Mashey shell, not the original Thompson shell, and that the former did have switch and while constructs (see above; see also the Mashey shell man page):

My own interest, before I went to Bell Labs, was in programming language design and compilers. At Cambridge I had worked on the language ALGOL68 with Mike Guy. A small group of us wrote a compiler for ALGOL68 that we called ALGOL68C. We also made some additions to the language to make it more usable. As an aside we bootstrapped the compiler so that it was also written in ALGOL68C.

When I arrived at Bell Labs a number of people were looking at ways to add programming capabilities such as variables and control flow primitives to the original shell. One day [mid 1975?] Dennis [Ritchie] and I came out of a meeting where somebody was proposing yet another variation by patching over some of the existing design decisions that were made in the original shell that Ken wrote. And so I looked at Dennis and he looked at me and I said “you know we have to re-do this and re-think some of the original design decisions that were made because you can’t go from here to there without changing some fundamental things”. So that is how I got started on the new shell.

Was there a particular problem that the language aimed to solve?

The primary problem was to design the shell to be a fully programmable scripting language that could also serve as the interface to users typing commands interactively at a terminal.

First of all, it needed to be compatible with the existing usage that people were familiar with. There were two usage modes. One was scripting and even though it was very limited there were already many scripts people had written. Also, the shell or command interpreter reads and executes the commands you type at the terminal. And so it is constrained to be both a command line interpreter and a scripting language. As the Unix command line interpreter, for example, you wouldn’t want to be typing commands and have all the strings quoted like you would in C, because most things you type are simply uninterpreted strings. You don’t want to type ls directory and have the directory name in string quotes because that would be such a royal pain. Also, spaces are used to separate arguments to commands. The basic design is driven from there and that determines how you represent strings in the language, which is as un-interpreted text. Everything that isn’t a string has to have something in front of it so you know it is not a string. For example, there is a $ sign in front of variables. This is in contrast to a typical programming language, where variables are names and strings are in some kind of quote marks. There are also reserved words for built-in commands like for loops but this is common with many programming languages.

So that is one way of saying what the problem was that the Bourne Shell was designed to solve. I would also say that the shell is the interface to the Unix system environment and so that’s its primary function: to provide a fully functional interface to the Unix system environment so that you could do anything that the Unix command set and the Unix system call set will provide you. This is the primary purpose of the shell.

One of the other things we did, in talking about the problems we were trying to solve, was to add environment variables to the Unix system. When you execute a command script you want to have a context for that script to operate in. So in the old days, positional parameters for commands were the primary way of passing information into a command. If you wanted context that was not explicit then the command could resort to reading a file. This is very cumbersome and in practice was only rarely used. We added environment variables to Unix. These were named variables that you didn’t have to explicitly pass down from the parent to the child process. They were inherited by the child process. As an example you could have a search path set up that specifies the list of directories to use when executing commands. This search path would then be available to all processes spawned by the parent where the search path was set. It made a big difference to the way that shell programming was done because you could now see and use information that is in the environment and the guy in the middle didn’t have to pass it to you. That was one of the major additions we made to the operating system to support scripting.

How did it improve on the Thompson shell?

I did change the shell so that command scripts could be used as filters. In the original shell this was not really feasible because the standard input for the executing script was the script itself. This change caused quite a disruption to the way people were used to working. I added variables, control flow and command substitution. The case statement allowed strings to be easily matched so that commands could decode their arguments and make decisions based on that. The for loop allowed iteration over a set of strings that were either explicit or by default the arguments that the command was given.

I also added an additional quoting mechanism so that you could do variable substitutions within quotes. It was a significant redesign with some of the original flavor of the Thompson shell still there. Also I eliminated goto in favour of flow control primitives like if and for. This was also considered a rather radical departure from the existing practice.

Command substitution was something else I added because that gives you a very general mechanism to do string processing; it allows you to get strings back from commands and use them as the text of the script as if you had typed it directly. I think this was a new idea that I, at least, had not seen in scripting languages, except perhaps LISP.


How long did this process take?

It didn’t take very long; it’s surprising. The direct answer to the question is about maybe 3-6 months at the most to make the basic design choices and to get it working. After that I iterated the design and fixed bugs based on user feedback and requests.

I honestly don’t remember exactly but there were a number of design things I added at the time. One thing that I thought was important was to have no limits imposed by the shell on the sizes of strings or the sizes of anything else for that matter. So the memory allocation in the implementation that I wrote was quite sophisticated. It allowed you to have strings that were any length while also maintaining a very efficient string processing capability because in those days you couldn’t use up lots of instructions copying strings around. It was the implementation of the memory management that took the most time. Bugs in that part of any program are usually the hardest to find. This part of the code was worked on after I got the initial design up and running.

The memory management is an interesting part of the story. To avoid having to check at run time for running out of memory for string construction I used a less well known property of the sbrk system call. If you get a memory fault you can, in Unix, allocate more memory and then resume the program from where it left off. This was an infrequent event but made a significant difference to the performance of the shell. I was assured at the time by Dennis that this was part of the sbrk interface definition. However, everyone who ported Unix to another computer found this out when trying to port the shell itself. Also at that time at Bell Labs, there were other scripting languages that had come into existence in different parts of the lab. These were efforts to solve the same set of problems I already described. The most widely used “new” shell was in the programmer’s workbench -- John Mashey wrote that. And so there was quite an investment in these shell scripts in other parts of the lab that would require significant cost to convert to the new shell.

The hard part was convincing people who had these scripts to convert them. While the shell I wrote had significant features that made scripting easier, the way I convinced the other groups was with a performance bake off. I spent time improving the performance, so that probably took another, I don’t know, 6 months or a year to convince other groups at the lab to adopt it. Also, some changes were made to the language to make the conversion of these scripts less painful.

How come it fell on you to do this?

The way it worked in the Unix group [at Bell Labs] was that if you were interested in something and nobody else owned the code then you could work on it. At the time Ken Thompson owned the original shell but he was visiting Berkeley for the year and he wasn’t considering working on a new shell so I took it on. As I said I was interested in language design and had some ideas about making a programmable command language.
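
Taken together, the features Bourne describes above (variables, for loops, the case statement, command substitution, and variable substitution within double quotes) can be illustrated with a minimal sketch in portable Bourne syntax; the file names are arbitrary:

today=`date`                        # command substitution: capture command output
echo "Report generated on: $today"  # variable substitution inside double quotes
for f in /etc/passwd /etc/group     # for loop iterating over explicit strings
do
    case $f in                      # case statement decodes its argument by pattern
        *passwd) echo "$f holds user accounts" ;;
        *)       echo "$f is something else" ;;
    esac
done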

A very influential development, AWK, by a far more talented team that enjoyed a more advanced design, a better understanding of scripting, and clean lexical and syntax levels, happened on the sidelines. As in any large organization, the left hand at Bell Labs did not know what the right hand was doing. Due to this unfortunate fact and the complete lack of architectural vision of Unix project managers, AWK did not get enough traction to be extended into a shell replacement or to be more tightly integrated with the shell. BTW, the fact that Steve Bourne was unable or unwilling to borrow ideas from AWK development is also pretty telling.

Another parallel development was REXX, which was later used as the shell in the VM/CMS environment.

Bourne shell vs. C-shell: how Bill Joy essentially wiped the floor with the AT&T shell design team

Bourne shell was released in 1978, and csh was written in the same year and released with 2BSD. So they can be considered two parallel, competing developments. This is an interesting and not very typical case in which a single programmer wiped the floor with a whole AT&T shell design team.

REXX: proof that in the late 1970s shells became an idea whose time had come

The first important non-Unix scripting language and shell that was more or less widely used was probably REXX, which was designed and first implemented between 1979 and mid-1982 by Mike Cowlishaw of IBM. It trails AWK by several years, but by and large it can be considered a parallel, independent reinvention of the concept.

The pioneering feature of REXX was that from the beginning it was designed as a macro language that could be used as an internal language for applications. One such application (XEDIT) survives from that dinosaur epoch. REXX was used as the shell in VM/CMS and later in Amiga OS and OS/2.

Later the attempt to create a scripting language that could serve as a standard macro language for applications, and spare developers of complex applications from reinventing the wheel, was repeated twice: TCL and later Lua were both designed with exactly this purpose. Neither became a Unix standard.

The third generation of shells

The third generation of shells was by and large a reaction to C-shell. It includes tcsh, an extension of C-shell, and the Korn shell (the version which later became known as ksh88). The POSIX shell belongs to the same category.

Fourth generation of shells

Perl, the first "post-modern" scripting language, was first released in 1987. It represented a breakthrough that lifted scripting to a qualitatively new level. It was immediately clear that for Unix scripts Perl is a much better language than shells, although in some respects it was a lower-level language (limited support of pipes).

All subsequent popular scripting languages, such as Python, PHP and Ruby, to name a few, are direct derivatives of Perl. The fourth generation of shells was also influenced by Perl's popularity, and one of them (ksh93) was, in part, a reaction to it.

Another important development was the creation of Tcl. Tcl was the first scripting language designed specifically to be a macro language for other applications. While Perl was an attempt to integrate the functionality of the shell and AWK into a single product, TCL extended shell functionality into a new area. TCL never managed to replace the shell or become the standard Unix macro language for applications. It was never used as a shell, although it is probably not so difficult to create a shell based on Tcl (one was created on the basis of the Korn shell by David Korn's son). There were also some attempts to use Perl as a shell (Perl shell), but they failed.

Two new "fourth-generation" shells, ksh93 and zsh, were written after Perl was created; they incorporated some innovations introduced by Perl and TCL, but within the old shell-style framework.

Back into the future -- Retro shells

There was also a distinct line of development that can be called retro-shells. One of the most prominent members of this family was pdksh, an attempt to re-implement ksh88. It did not get much traction, and only bash, another retro-shell, remains viable these days, as it is used as the standard shell in Linux.

Possible future developments

Still, one needs to understand that shells are currently a pretty archaic family of scripting languages, and outside of interactive usage they have generally outlived their usefulness. That is why for more or less complex tasks Perl, Python or Ruby are now usually used instead of shells.

While shells have continued to improve since the original C-shell and Korn shell (Bourne shell, from the language designer's standpoint, is just a joke), the shell syntax is frozen in space and time and now looks completely archaic. There are a large number of problems with this syntax, the root of which is that it does not cleanly separate lexical analysis from syntax analysis.

Some syntax features of the shell are idiosyncratic, as they still preserve Steve Bourne's flirt with Algol 68 syntax. He proved to be a bad language designer: Bourne shell never had a distinct lexical level, and there is very little logic in how different types of blocks are terminated. Conditional statements end with a broken version of the classic Algol 68 reversed-keyword syntax: 'if condition; then echo yes; else echo no; fi', but loops are structured like a perverted version of PL/1 (do ... done), individual case branches end with ';;', and functions use C-style bracketing "{", "}". M. D. McIlroy, as his manager, should be ashamed of the result ;-).
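
The inconsistency is easy to see when the different block types are put side by side (a minimal sketch; note that in Algol 68 a loop would end with od, a name already taken in Unix by the octal dump utility):

if [ -f /etc/passwd ]; then echo yes; else echo no; fi    # Algol 68 style reversed keyword: fi
while false; do echo never; done                          # loops end with done, not od
case x in x) echo matched ;; esac                         # each case branch ends with ';;'
f() { echo "in a function"; }                             # functions use C-style braces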

Also, the original Bourne shell was an almost pure macro language. It performed variable substitution, tokenization and other operations one line at a time, without understanding the underlying syntax. This results in many unexpected side effects. Consider a simple command
rm $file
If the variable $file accidentally contains a space, it is treated as two separate arguments to the rm command, with possibly nasty side effects. To avoid this, the user has to make sure that every use of a variable is enclosed in double quotes, as in rm "$file".
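
A minimal demonstration of the effect (the file name is arbitrary):

touch "my file"     # create a file whose name contains a space
file="my file"
rm $file            # word splitting: rm gets the two arguments 'my' and 'file',
                    # complains that neither exists, and "my file" is untouched
rm "$file"          # quoted: rm gets a single argument and removes "my file"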

Variable assignments in Bourne shell are whitespace sensitive: 'foo=bar' is an assignment, but 'foo = bar' is not. This is another strange idiosyncrasy that reflects the absence of a distinct lexical level in Bourne shell.
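
A short sketch of the difference:

foo=bar             # assignment: the variable foo now holds 'bar'
echo $foo           # prints: bar
foo = bar           # NOT an assignment: the shell runs a command named 'foo'
                    # with the two arguments '=' and 'bar'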

There is also an overlap between aliases and functions. Aliases are positional macros that are recognized only as the first word of a command, as in the classic alias ll='ls -l'. Because of this, aliases have several limitations compared to functions.

Functions are not positional and can in most cases emulate alias functionality:
ll() { ls -l "$@"; }
The curly brackets are treated as a sort of pseudo-command, so unfortunately skipping the semicolon before the closing bracket in the example above results in a syntax error. As there is no clean separation between lexical analysis and syntax analysis, removing the whitespace between the opening bracket and 'ls' also results in a syntax error.
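
Both pitfalls are easy to reproduce (the broken forms are shown commented out):

ll() { ls -l "$@"; }      # correct: whitespace after '{' and ';' before '}'
# ll() { ls -l "$@" }     # error: '}' is taken as an argument to ls, so the body never ends
# ll() {ls -l "$@"; }     # error: '{ls' is read as a single word, not as '{' plus 'ls'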

Since the use of variables as commands is allowed, it is impossible to reliably check the syntax of a script, because substitution can accidentally produce a keyword, as in this example that I found in a paper about fish (not that I like or recommend fish):

if true; then if [ $RANDOM -lt 1024 ]; then END=fi; else END=true; fi; $END

Here $END expands either to the keyword fi, which closes the outer if statement, or to the command true, which leaves it unclosed, so no static check can decide whether the script is well formed. Both bash and zsh try to determine whether the command in the current buffer is finished when the user presses the return key, but because of issues like this, they sometimes fail.

One way to alleviate those problems is the introduction of a deprecation mechanism. It could be done via the POSIX framework (which is currently stagnant).

A clean definition of the lexical level in shells could help too. As the number of legacy scripts is substantial, any improvements can be made only with the simultaneous introduction of automatic converters.

Paradoxically, the only sizable entity currently working on the improvement of shells is Microsoft. According to Wikipedia:

Microsoft is working on the next version of PowerShell and has made a CTP release of the same publicly available. It includes changes to the scripting language and hosting API, in addition to including a number of cmdlets. A non-exhaustive list of the new features is:[27]

  1. PowerShell Remoting: Using WS-Management, PowerShell 2.0 allows scripts and cmdlets to be invoked on a remote machine or a large set of remote machines.
  2. Background Jobs: Also called a PSJob, it allows a command sequence (script) or pipeline to be invoked asynchronously. Jobs can be run on the local machine or on multiple remote machines. A PSJob cannot include interactive cmdlets.
  3. ScriptCmdlets: These are cmdlets written using the PowerShell scripting language.
  4. SteppablePipelines: This allows the user to control when the BeginProcessing(), ProcessRecord() and EndProcessing() functions of a cmdlet are called.
  5. Data Language: A domain-specific subset of the PowerShell scripting language, that allows data definitions to be decoupled from the scripts and allows localized string resources to be imported into the script at runtime.
  6. Script Debugging: It allows breakpoints to be set in a PowerShell script or function. Breakpoints can be set on lines, line & columns, commands and read or write access of variables. It includes a set of cmdlets to control the breakpoints via script.
  7. New Cmdlets: Including Out-GridView, which displays tabular data in the WPF GridView object.
  8. New Operators: -Split -Join and @ operators.
  9. New APIs: The new APIs range from handing more control over the PowerShell parser and runtime to the host, to creating and managing collection of Runspaces (Runspace Pools) as well as the ability to create restricted Runspaces which only allow a configured subset of PowerShell to be invoked.
  10. Graphical PowerShell: PowerShell 2.0 includes a GUI-based PowerShell host that provides an integrated debugger, syntax highlighting and up to 8 PowerShell consoles (Runspaces) in a tabbed UI, as well as the ability to run only selected parts of a script. However, it is a very early release which suffers from performance problems with large scripts and does not include tab completion.


