Softpanorama

May the source be with you, but remember the KISS principle ;-)
(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and  bastardization of classic Unix

The History of the Development of Programming Languages

Ordinarily technology changes fast. But programming languages are different: programming languages are not just technology, but what programmers think in.

They're half technology and half religion. And so the median language, meaning whatever language the median programmer uses, moves as slow as an iceberg.

Paul Graham: Beating the Averages

Libraries are more important than the language.

Donald Knuth


Introduction

A fruitful way to think about language development is to consider it to be a special type of theory building. Peter Naur suggested that programming in general is a theory-building activity in his 1985 paper "Programming as Theory Building". But the idea is especially applicable to compilers and interpreters. What Peter Naur failed to note was that the design of programming languages has religious overtones and sometimes represents an activity that is pretty close to the process of creating a new, obscure cult ;-). Clueless academics publishing junk papers at obscure conferences are the high priests of the church of programming languages. Some, like Niklaus Wirth and Edsger W. Dijkstra, (temporarily) reached a status close to that of (false) prophets :-).

On a deep conceptual level, building a new language is a fundamentally human way of solving complex problems. That means that compiler construction is probably the most underappreciated paradigm for programming large systems, much more so than the greatly oversold object-oriented programming, whose benefits are greatly overstated. For users, programming languages distinctly have religious aspects, so decisions about what language to use are often far from rational and are mainly cultural. Indoctrination at the university plays a very important role; recently universities were instrumental in making Java the new Cobol.

The second important observation about programming languages is that the language per se is just a tiny part of what can be called the language programming environment. The latter includes libraries, IDEs, books, the level of adoption at universities, popular and important applications written in the language, the level of support, and the key players that support the language on major platforms such as Windows and Linux. A mediocre language with a good programming environment can give a run for the money to a language of superior design that arrives naked. This is the story behind the success of Java. A killer application is also very important, and this is the story behind the success of PHP, which is nothing but a bastardized derivative of Perl (with most of the interesting Perl features removed ;-) adapted to the creation of dynamic web sites using the so-called LAMP stack.

Progress in programming languages has been very uneven and has contained several setbacks. Currently this progress is mainly limited to the development of so-called scripting languages. The field of traditional high-level languages has been stagnant for decades.

At the same time there are some mysterious, unanswered questions about the factors that help a language to succeed or fail. Among them:

Those are difficult questions to answer without some way of classifying languages into different categories. Several such classifications exist. First of all, as with natural languages, the number of people who speak a given language is a tremendous force that can overcome any real or perceived deficiencies of the language. In programming languages, as in natural languages, nothing succeeds like success.

Complexity Curse

The history of programming languages raises interesting general questions about the limits of complexity of programming languages. There is strong historical evidence that a language with a simpler, or even simplistic, core (Basic, Pascal) has a better chance of acquiring a high level of popularity. The underlying fact here is probably that most programmers are at best mediocre, and such programmers tend, on an intuitive level, to avoid more complex, richer languages and prefer, say, Pascal to PL/1 and PHP to Perl. Or at least avoid them during a particular phase of language development (C++ is not a simpler language than PL/1, but it was widely adopted because of the progress of hardware, the availability of compilers and, not least, because it was associated with OO exactly at the time OO became a mainstream fashion). Complex non-orthogonal languages can succeed only as the result of a long period of development from a smaller core (development which usually adds complexity -- just compare Fortran IV with Fortran 90, or PHP 3 with PHP 5). Carrying the banner of some fashionable new trend by extending an existing popular language to the new "paradigm" is also a possibility (OO programming in the case of C++, which is a superset of C).

Historically, few complex languages were successful (PL/1, Ada, Perl, C++), and even when they were, their success was typically temporary rather than permanent (PL/1, Ada, Perl). As Professor Wilkes noted (iee90):

Things move slowly in the computer language field but, over a sufficiently long period of time, it is possible to discern trends. In the 1970s, there was a vogue among system programmers for BCPL, a typeless language. This has now run its course, and system programmers appreciate some typing support. At the same time, they like a language with low level features that enable them to do things their way, rather than the compiler’s way, when they want to.

They continue to have a strong preference for a lean language. At present they tend to favor C in its various versions. For applications in which flexibility is important, Lisp may be said to have gained strength as a popular programming language.

Further progress is necessary in the direction of achieving modularity. No language has so far emerged which exploits objects in a fully satisfactory manner, although C++ goes a long way. ADA was progressive in this respect, but unfortunately it is in the process of collapsing under its own great weight.

ADA is an example of what can happen when an official attempt is made to orchestrate technical advances. After the experience with PL/1 and ALGOL 68, it should have been clear that the future did not lie with massively large languages.

I would direct the reader’s attention to Modula-3, a modest attempt to build on the appeal and success of Pascal and Modula-2 [12].

The complexity of the compiler or interpreter also matters, as it affects portability: this is one thing that probably doomed PL/1 (and later Ada), although these days a new language typically comes with an open source compiler (or, in the case of scripting languages, an interpreter), so this is less of a problem.

Here is an interesting take on language design from the preface to The D programming language book:

Programming language design seeks power in simplicity and, when successful, begets beauty.

Choosing the trade-offs among contradictory requirements is a difficult task that requires good taste from the language designer as much as mastery of theoretical principles and of practical implementation matters. Programming language design is software-engineering-complete.

D is a language that attempts to consistently do the right thing within the constraints it chose: system-level access to computing resources, high performance, and syntactic similarity with C-derived languages. In trying to do the right thing, D sometimes stays with tradition and does what other languages do, and other times it breaks tradition with a fresh, innovative solution. On occasion that meant revisiting the very constraints that D ostensibly embraced. For example, large program fragments or indeed entire programs can be written in a well-defined memory-safe subset of D, which entails giving away a small amount of system-level access for a large gain in program debuggability.

You may be interested in D if the following values are important to you:

The role of fashion

At the initial, most difficult stage of language development, a language should solve an important problem that is inadequately solved by currently popular languages. But at the same time the language has few chances to succeed unless it fits perfectly into the current software fashion. This "fashion factor" is probably as important as several other factors combined, with the exception of the "language sponsor" factor.

As in women's dress, fashion rules in language design, and with time this trend has become more and more pronounced. A new language should represent the current fashionable trend. For example, OO programming has been a visiting card into the world of "big, successful languages" since probably the early 1990s (C++, Java, Python). Before that, "structured programming" and "verification" (Pascal, Modula) played a similar role.

Programming environment and the role of "powerful sponsor" in language success

PL/1, Java, C#, Ada are languages that had powerful sponsors. Pascal, Basic, Forth are examples of the languages that had no such sponsor during the initial period of development.  C and C++ are somewhere in between.

But any language now needs a "programming environment" consisting of a set of libraries, a debugger, and other tools (a make tool, linker, pretty-printer, etc.). The set of "standard" libraries and the debugger are probably the two most important elements. They cost a lot of time (or money) to develop, and here the role of a powerful sponsor is difficult to overestimate.

While this is not a necessary condition for becoming popular, it really helps: other things being equal, the weight of the language's sponsor does matter. For example Java, a weak, inconsistent language (C-- with garbage collection and OO), was forced down the throat on the strength of marketing and the huge amount of money spent on creating the Java programming environment. The same was partially true for C# and Python. That's why Python, despite its "non-Unix" origin, is a more viable scripting language now than, say, Perl (which is better integrated with Unix and has pretty innovative, for scripting languages, support for pointers and regular expressions), or Ruby (which has had support for coroutines from day one, not as a "bolted on" feature as in Python). As in political campaigns, negative advertising also matters. For example Perl suffered greatly from the smear comparing programs in it to "white noise", and then from the withdrawal of O'Reilly from the role of sponsor of the language (although it continues to milk the Perl book publishing franchise ;-).

People have proved to be pretty gullible, and in this sense language marketing is not that different from the marketing of women's clothing :-)

Language level and success

One very important classification of programming languages is based on the so-called level of the language. Essentially, once there is at least one successful language at a given level, the success of other languages at the same level becomes more problematic. The best chances belong to languages at a slightly, but still noticeably, higher level than their successful predecessors.

The level of a language can informally be described as the number of statements (or, more correctly, the number of lexical units (tokens)) needed to write a solution to a particular problem in one language versus another. On this basis we can distinguish several levels of programming languages:
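
As an informal illustration (the task and the estimate are mine, not taken from any formal study): counting word frequencies takes only a handful of tokens in a high-level scripting language such as Python, while a comparable C program would need a hand-written hash table and manual memory management -- easily an order of magnitude more lexical units:

```python
# Word-frequency count in a high-level scripting language.
# The whole solution is a few dozen tokens; a C version would need
# a hand-written hash table and manual memory management.
from collections import Counter

def word_frequencies(text):
    """Return a mapping of word -> occurrence count."""
    return Counter(text.lower().split())

freq = word_frequencies("the quick brown fox jumps over the lazy dog the end")
print(freq["the"])  # the word "the" occurs 3 times
```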

"Nanny languages" vs "sharp razor" languages

Some people distinguish between "nanny languages" and "sharp razor" languages. The latter do not attempt to protect the user from his errors, while the former usually go too far... The right compromise is extremely difficult to find.

For example, I consider the explicit availability of pointers to be an important feature that greatly increases a language's expressive power and far outweighs the risk of errors in the hands of unskilled practitioners. In other words, attempts to make a language "safer" often misfire.

Expressive style of the languages

Another useful typology is based on the expressive style of the language:

Those categories are not pure and overlap somewhat. For example, it is possible to program in an object-oriented style in C, or even in assembler. Some scripting languages such as Perl have built-in regular expression engines that are part of the language, so they have a functional component despite being procedural. Some relatively low-level languages (Algol-style languages) implement garbage collection; a good example is Java. There are also scripting languages that compile to a common language runtime designed for high-level languages; for example, IronPython compiles to .NET.
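
The overlap is easy to demonstrate. Here is the same toy task (summing the squares of the even numbers below 10) written three ways in one language -- a sketch only, since any mainstream multi-paradigm language would do:

```python
# The same computation in three expressive styles.

# Procedural: explicit loop and mutable accumulator.
def sum_even_squares_proc(n):
    total = 0
    for i in range(n):
        if i % 2 == 0:
            total += i * i
    return total

# Functional: no mutation, just expression composition.
def sum_even_squares_func(n):
    return sum(i * i for i in range(n) if i % 2 == 0)

# Object-oriented: state and behavior bundled in a class.
class EvenSquareSummer:
    def __init__(self, n):
        self.n = n
    def total(self):
        return sum(i * i for i in range(self.n) if i % 2 == 0)

# All three styles compute the same value: 0 + 4 + 16 + 36 + 64 = 120.
print(sum_even_squares_proc(10), sum_even_squares_func(10), EvenSquareSummer(10).total())
```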

Weak correlation between quality of design and popularity

The popularity of programming languages is not strongly connected to their quality. Some languages that look like a collection of language designers' blunders (PHP, Java) became quite popular. Java essentially became the new Cobol, and PHP dominates the construction of dynamic Web sites. The dominant technology for such sites is often called LAMP, which stands for Linux, Apache, MySQL, PHP. Being a highly simplified but badly constructed subset of Perl -- a kind of new Basic for dynamic Web site construction -- PHP provides a most depressing experience. I was unpleasantly surprised when I learned that the Wikipedia engine had been rewritten from Perl to PHP some time ago, but this illustrates the trend quite well.

So language design quality has little to do with success in the marketplace. Simpler languages have wider appeal, as the success of PHP (which at the beginning came at the expense of Perl) suggests. In addition, much depends on whether the language has a powerful sponsor, as was the case with Java (Sun and IBM) as well as Python (Google).

Progress in programming languages has been very uneven and has contained several setbacks, such as Java. Currently this progress is usually associated with scripting languages. The history of programming languages raises interesting general questions about the "laws" of programming language design. First, let's reproduce several notable quotes:

  1. Knuth law of optimization: "Premature optimization is the root of all evil (or at least most of it) in programming." - Donald Knuth
  2. "Greenspun's Tenth Rule of Programming: any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp." - Phil Greenspun
  3. "The key to performance is elegance, not battalions of special cases." - Jon Bentley and Doug McIlroy
  4. "Some may say Ruby is a bad rip-off of Lisp or Smalltalk, and I admit that. But it is nicer to ordinary people." - Matz, LL2
  5. "Most papers in computer science describe how their author learned what someone else already knew." - Peter Landin
  6. "The only way to learn a new programming language is by writing programs in it." - Kernighan and Ritchie
  7. "If I had a nickel for every time I've written "for (i = 0; i < N; i++)" in C, I'd be a millionaire." - Mike Vanier
  8. "Language designers are not intellectuals. They're not as interested in thinking as you might hope. They just want to get a language done and start using it." - Dave Moon
  9. "Don't worry about what anybody else is going to do. The best way to predict the future is to invent it." - Alan Kay
  10. "Programs must be written for people to read, and only incidentally for machines to execute." - Abelson & Sussman, SICP, preface to the first edition

Please note that it is one thing to read a language manual and appreciate how good the concepts are, and quite another to bet your project on a new, unproven language without good debuggers, manuals and, most importantly, libraries. The debugger is very important, but standard libraries are crucial: they are the factor that makes or breaks a new language.

In this sense languages are much like cars. For many people a car is the thing they use to get to work and to the shopping mall; they are not very interested in whether the engine is inline or V-type, or whether the transmission uses fuzzy logic. What they care about is safety, reliability, mileage, insurance, and the size of the trunk. In this sense "worse is better" is very true. I already mentioned the importance of the debugger. The other important criterion is the quality and availability of libraries. Actually, libraries account for perhaps 80% of the usability of a language; in a sense, libraries are more important than the language itself...

The popular belief that scripting is an "unsafe", "second rate", or "prototype only" solution is completely wrong. If a project dies, it does not matter what the implementation language was; so for any successful project on a tough schedule, a scripting language (especially in a dual scripting-language-plus-C combination, for example Tcl+C) is an optimal blend for a large class of tasks. Such an approach helps to separate architectural decisions from implementation details much better than any OO model does.
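
A minimal sketch of the dual-language idea, using Python's ctypes to call into the C standard library (library lookup is platform dependent; this assumes a Unix-like system):

```python
# Glue-language sketch: the scripting language drives, C does the
# low-level work.  Here we call the C library's strlen() via ctypes.
import ctypes
import ctypes.util

# find_library may fail on minimal systems; CDLL(None) falls back to
# the symbols already loaded into the current process on Unix.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello, world"))  # 12
```

In a real project the C side would be a hand-written extension holding the performance-critical code, with the scripting layer handling configuration, I/O, and control flow.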

Moreover, even for tasks that involve a fair amount of computation and data (computationally intensive tasks), languages such as Python and Perl are often (but not always!) competitive with C++, C#, and especially Java.


Programming Language Development Timeline

Here is the timeline of programming languages, adapted from BYTE (for the original, see BYTE.com, September 1995 / 20th Anniversary):

Forties

ca. 1946

Konrad Zuse develops Plankalkül, arguably the first high-level programming language design; it was not implemented at the time.

1949

Short Code, one of the first languages actually used on an electronic computer, appears.

Fifties

1951

Grace Hopper begins work on A-0, an early compiler.

1952

Alick Glennie's Autocode, an early compiled language, runs on the Manchester Mark I.

1957

FORTRAN, designed by John Backus's team at IBM, is delivered; it becomes the first widely used high-level language.

1958

FORTRAN II and ALGOL 58 appear; John McCarthy begins work on LISP.

1959

COBOL is defined by the CODASYL committee.

Sixties

1960

ALGOL 60, the first block-structured language and the first described with BNF, is designed.

1962

APL (Kenneth Iverson), SNOBOL, and FORTRAN IV appear.

1963

ALGOL 60 is revised; work begins on PL/1.

1964

BASIC (Kemeny and Kurtz at Dartmouth) appears; the first PL/1 specification is published.

1965

SNOBOL3 appears.

1966

FORTRAN 66, the first programming language standard, is published.

1967

Simula 67, the first object-oriented language, and SNOBOL4 appear.

1969

Work on B, the predecessor of C, begins at Bell Labs.

Seventies

1970

Pascal (Niklaus Wirth) and Forth appear.

1972

Dennis Ritchie produces C; Prolog and Smalltalk-72 appear.

1974

Another COBOL standard (COBOL-74) is published.

1975

Scheme (Sussman and Steele) appears; Bill Gates and Paul Allen write Altair BASIC.

1976

Design System Language, a forerunner of PostScript, appears.

1977

AWK (Aho, Weinberger, Kernighan) appears; the US Department of Defense begins the design competition that leads to Ada.

1978

The FORTRAN 77 standard is published; Kernighan and Ritchie's "The C Programming Language" appears.

1979

Bjarne Stroustrup begins work on "C with Classes", the predecessor of C++.

Eighties

1980

Smalltalk-80 and Modula-2 appear.

1981

The design of Common Lisp begins.

1982

ISO Pascal appears.

1983

The Ada 83 standard is published; "C with Classes" is renamed C++.

1984

Steele's "Common Lisp the Language" is published.

1985

The first commercial release of C++ (cfront) appears.

1986

Eiffel and Objective-C appear.

1987

Larry Wall releases Perl.

1988

John Ousterhout creates Tcl.

1989

The ANSI C standard is published.

Nineties

1990

The Haskell 1.0 report is published.

1991

Guido van Rossum releases Python; Microsoft ships Visual Basic.

1992

Dylan is described by Apple.

1993

Lua is created at PUC-Rio.

1994

Perl 5 is released.

1995

Java, JavaScript, Ruby, and PHP all appear; the Ada 95 standard is published.

1996

The first ANSI C++ committee draft appears; OCaml is released.

1997

The first ECMAScript (JavaScript) standard is published.

2006

2007

D 1.0 is released; Clojure appears.

2011

The C++11 standard is ratified.



Old News ;-)

[Oct 14, 2011] Dennis Ritchie, 70, Dies, Programming Trailblazer - by Steve Lohr

October 13, 2011 | NYTimes.com
Dennis M. Ritchie, who helped shape the modern digital era by creating software tools that power things as diverse as search engines like Google and smartphones, was found dead on Wednesday at his home in Berkeley Heights, N.J. He was 70.

Mr. Ritchie, who lived alone, was in frail health in recent years after treatment for prostate cancer and heart disease, said his brother Bill.

In the late 1960s and early '70s, working at Bell Labs, Mr. Ritchie made a pair of lasting contributions to computer science. He was the principal designer of the C programming language and co-developer of the Unix operating system, working closely with Ken Thompson, his longtime Bell Labs collaborator.

The C programming language, a shorthand of words, numbers and punctuation, is still widely used today, and successors like C++ and Java build on the ideas, rules and grammar that Mr. Ritchie designed. The Unix operating system has similarly had a rich and enduring impact. Its free, open-source variant, Linux, powers many of the world's data centers, like those at Google and Amazon, and its technology serves as the foundation of operating systems, like Apple's iOS, in consumer computing devices.

"The tools that Dennis built - and their direct descendants - run pretty much everything today," said Brian Kernighan, a computer scientist at Princeton University who worked with Mr. Ritchie at Bell Labs.

Those tools were more than inventive bundles of computer code. The C language and Unix reflected a point of view, a different philosophy of computing than what had come before. In the late '60s and early '70s, minicomputers were moving into companies and universities - smaller and at a fraction of the price of hulking mainframes.

Minicomputers represented a step in the democratization of computing, and Unix and C were designed to open up computing to more people and collaborative working styles. Mr. Ritchie, Mr. Thompson and their Bell Labs colleagues were making not merely software but, as Mr. Ritchie once put it, "a system around which fellowship can form."

C was designed for systems programmers who wanted to get the fastest performance from operating systems, compilers and other programs. "C is not a big language - it's clean, simple, elegant," Mr. Kernighan said. "It lets you get close to the machine, without getting tied up in the machine."

Such higher-level languages had earlier been intended mainly to let people without a lot of programming skill write programs that could run on mainframes. Fortran was for scientists and engineers, while Cobol was for business managers.

C, like Unix, was designed mainly to let the growing ranks of professional programmers work more productively. And it steadily gained popularity. With Mr. Kernighan, Mr. Ritchie wrote a classic text, "The C Programming Language," also known as "K. & R." after the authors' initials, whose two editions, in 1978 and 1988, have sold millions of copies and been translated into 25 languages.

Dennis MacAlistair Ritchie was born on Sept. 9, 1941, in Bronxville, N.Y. His father, Alistair, was an engineer at Bell Labs, and his mother, Jean McGee Ritchie, was a homemaker. When he was a child, the family moved to Summit, N.J., where Mr. Ritchie grew up and attended high school. He then went to Harvard, where he majored in applied mathematics.

While a graduate student at Harvard, Mr. Ritchie worked at the computer center at the Massachusetts Institute of Technology, and became more interested in computing than math. He was recruited by the Sandia National Laboratories, which conducted weapons research and testing. "But it was nearly 1968," Mr. Ritchie recalled in an interview in 2001, "and somehow making A-bombs for the government didn't seem in tune with the times."

Mr. Ritchie joined Bell Labs in 1967, and soon began his fruitful collaboration with Mr. Thompson on both Unix and the C programming language. The pair represented the two different strands of the nascent discipline of computer science. Mr. Ritchie came to computing from math, while Mr. Thompson came from electrical engineering.

"We were very complementary," said Mr. Thompson, who is now an engineer at Google. "Sometimes personalities clash, and sometimes they meld. It was just good with Dennis."

Besides his brother Bill, of Alexandria, Va., Mr. Ritchie is survived by another brother, John, of Newton, Mass., and a sister, Lynn Ritchie of Hexham, England.

Mr. Ritchie traveled widely and read voraciously, but friends and family members say his main passion was his work. He remained at Bell Labs, working on various research projects, until he retired in 2007.

Colleagues who worked with Mr. Ritchie were struck by his code - meticulous, clean and concise. His writing, according to Mr. Kernighan, was similar. "There was a remarkable precision to his writing," Mr. Kernighan said, "no extra words, elegant and spare, much like his code."

[Mar 20, 2007] Fortran creator John Backus dies by Brian Bergstein

The first FORTRAN compilers had pretty sophisticated optimization algorithms, and generally much of early compiler optimization research was done for Fortran compilers.
March 20, 2007 | MSNBC.com

John Backus, whose development of the Fortran programming language in the 1950s changed how people interacted with computers and paved the way for modern software, has died. He was 82.

Backus died Saturday in Ashland, Ore., according to IBM Corp., where he spent his career.

Prior to Fortran, computers had to be meticulously "hand-coded" - programmed in the raw strings of digits that triggered actions inside the machine. Fortran was a "high-level" programming language because it abstracted that work - it let programmers enter commands in a more intuitive system, which the computer would translate into machine code on its own.

The breakthrough earned Backus the 1977 Turing Award from the Association for Computing Machinery, one of the industry's highest accolades. The citation praised Backus' "profound, influential, and lasting contributions."

Backus also won a National Medal of Science in 1975 and got the 1993 Charles Stark Draper Prize, the top honor from the National Academy of Engineering.

"Much of my work has come from being lazy," Backus told Think, the IBM employee magazine, in 1979. "I didn't like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."

John Warner Backus was born in Wilmington, Del., in 1924. His father was a chemist who became a stockbroker. Backus had what he would later describe as a "checkered educational career" in prep school and the University of Virginia, which he left after six months. After being drafted into the Army, Backus studied medicine but dropped it when he found radio engineering more compelling.

Backus finally found his calling in math, and he pursued a master's degree at Columbia University in New York. Shortly before graduating, Backus toured the IBM offices in midtown Manhattan and came across the company's Selective Sequence Electronic Calculator, an early computer stuffed with 13,000 vacuum tubes. Backus met one of the machine's inventors, Rex Seeber - who "gave me a little homemade test and hired me on the spot," Backus recalled in 1979.

Backus' early work at IBM included computing lunar positions on the balky, bulky computers that were state of the art in the 1950s. But he tired of hand-coding the hardware, and in 1954 he got his bosses to let him assemble a team that could design an easier system.

The result, Fortran, short for Formula Translation, reduced the number of programming statements necessary to operate a machine by a factor of 20.

It showed skeptics that machines could run just as efficiently without hand-coding. A wide range of programming languages and software approaches proliferated, although Fortran also evolved over the years and remains in use.

Backus remained with IBM until his retirement in 1991. Among his other important contributions was a method for describing the particular grammar of computer languages. The system is known as Backus-Naur Form.

© 2007 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

[Mar 20, 2007] Slashdot/John W. Backus Dies at 82; Developed FORTRAN

A number of readers let us know of the passing of John W. Backus, who assembled a team to develop FORTRAN at IBM in the 1950s. It was the first widely used high-level language. Backus later worked on a "function-level" programming language, FP, which was described in his Turing Award lecture "Can Programming be Liberated from the von Neumann Style?" and is viewed as Backus's apology for creating FORTRAN. He received the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages."

Be afraid, be very afraid(Score:5, Insightful)

by vivaoporto (1064484) on Tuesday March 20, @05:48AM (#18411881)
(http://wsu.edu/~brians/errors/errors.html)

With both the lack of interest and the distortion of the original goal, Computer Science as we know it may be dying with the elders. Computer Science originally had nothing to do with computers (as in the personal computer) per se, but with the science of computation: optimal algorithms for pure math problems, etc. Actually, it was nothing but a branch of Math. The way computer science is treated nowadays -- with disdain and lack of interest, and with people thinking of it as a tool to put another "screw tightener" professional on the market -- means we may soon run out of real breakthroughs like the ones those geniuses created to pave the yellow brick road we run over nowadays.

rest in peace

by dario_moreno (263767) on Tuesday March 20, @06:15AM (#18411991)
(http://www.mastermodelisation.com/ | Last Journal: Sunday April 03, @04:23PM)

Maybe it's because I was breastfed with BASIC from a very young age, but when I was forced to learn FORTRAN to work on legacy code I discovered, after some initial computer-science-taught disgust, that it was really the best way to express myself in code, better than anything else, and I owe my present university position to FORTRAN because it made me so productive. I guess it was because the language was conceived by engineer- and scientist-oriented types, and not by formal logic adepts or grammar nazis.

I still teach FORTRAN to this day, using F90/F95 in all its power, and MATLAB-exposed students tend to enjoy it because they can develop simple and efficient numerical codes much faster than with anything else; some of them found positions thanks to it. The trick is to use FORTRAN for what it's for (numerical arrays, heavy linear algebra, easily parallelizable scientific computing) and not for string or file manipulation, linked lists (LISP), graphics, or system work: for that there is C(++), and tons of libraries. If the code grows beyond 10,000 lines, very strong discipline is necessary, and that's where true OO can be pertinent.

In scientific code FORTRAN tends to be 20% faster than the best possible C++ implementation, because the grammar is so simple that compilers understand the code better and can vectorize or optimize it much further than C; and there is much less overhead than with C++ because the objects are simpler to manipulate. Major codes used in industry (Star-CD and Gaussian, for instance) are still written in FORTRAN for those (and legacy) reasons.

Re:rest in peace (Score:4, Insightful)

by Wormholio (729552) on Tuesday March 20, @08:04AM (#18412433)

I too still teach my students (in physics and astronomy) to use Fortran, for many of the reasons listed above. While it may also be useful for them to go on to learn other languages, their primary focus is on the physics problems they need to solve and the numerical algorithms needed to help them do that. Fortran makes it easy for them to get started and then focus on the calculations, not on grammar or philosophy.

Fortran has been criticized because you can write "spaghetti code" or other crap, while other languages supposedly protect you from the mistakes you can make in Fortran. But you can write crappy code in any language (including "spaghetti classes"). I teach my students to write with good style. They know their code has to be clearly understandable not just to the machine but also to someone else who is familiar with the goal of the code but not the details. Trying to enforce good style through grammar is misguided at best, just as it is in writing in general. Developing good style is a personal, ongoing process for writing anything, including good code.

About BNF notation

BNF is an acronym for "Backus Naur Form". John Backus and Peter Naur were the first to introduce a formal notation to describe the syntax of a given language (this was for the description of the ALGOL 60 programming language; see [Naur 60]). To be precise, most of BNF was introduced by Backus in a report presented at an earlier UNESCO conference on ALGOL 58. Few read the report, but when Peter Naur read it he was surprised at some of the differences he found between his and Backus's interpretation of ALGOL 58. He decided that the successor to ALGOL, in which all participants of the first design had come to recognize some weaknesses, should be given in a similar form, so that all participants would be aware of what they were agreeing to. He made a few modifications that are almost universally used and drew up on his own the BNF for ALGOL 60 at the meeting where it was designed. Depending on how you attribute presenting it to the world, it was either by Backus in 59 or Naur in 60. (For more details on this period of programming languages history, see the introduction to Backus's Turing Award article in Communications of the ACM, Vol. 21, No. 8, August 1978. This note was suggested by William B. Clodius from Los Alamos Natl. Lab.)
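
To make the notation concrete, here is an illustrative BNF fragment for simple arithmetic expressions (invented for this note, not taken from the ALGOL 60 report), together with a minimal Python sketch showing how each production maps onto one function of a recursive-descent parser:

```python
# Illustrative BNF for simple arithmetic expressions:
#   <expr>   ::= <term> | <expr> "+" <term>
#   <term>   ::= <factor> | <term> "*" <factor>
#   <factor> ::= <digit> | "(" <expr> ")"
# Each nonterminal becomes one parsing function; the left recursion in
# <expr> and <term> is rewritten as a loop, the usual trick in
# recursive-descent parsers.

def evaluate(s: str) -> int:
    pos = 0

    def peek():
        return s[pos] if pos < len(s) else ""

    def expr():
        nonlocal pos
        value = term()
        while peek() == "+":
            pos += 1              # consume "+"
            value += term()
        return value

    def term():
        nonlocal pos
        value = factor()
        while peek() == "*":
            pos += 1              # consume "*"
            value *= factor()
        return value

    def factor():
        nonlocal pos
        if peek() == "(":
            pos += 1              # consume "("
            value = expr()
            pos += 1              # consume ")"
            return value
        value = int(peek())       # a single digit
        pos += 1
        return value

    return expr()

print(evaluate("2+3*4"))      # 14
print(evaluate("(2+3)*4"))    # 20
```

Note how the grammar itself, not ad hoc code, dictates operator precedence: <term> sits below <expr>, so "*" binds tighter than "+".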

[ Mar 04, 2007] Some contributions of Algol60 by Marc Rochkind (775756) March 04
(http://mudbag.com/)

1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.

2. There was a distinction between the publication language and the implementation language (those probably aren't the right terms). Among other things, it got around differences such as whether to use decimal points or commas in numeric constants.

3. Designed by a committee, rather than a private company or government agency.

4. Archetype of the so-called "Algol-like languages," examples of which are (were?) Pascal, PL/I, Algol68, Ada, C, and Java. (The term "Algol-like languages" is hardly used any more, since we have few examples of contemporary non-Algol-like languages.)

However, as someone who actually programmed in it (on a Univac 1108 in 1972 or 1973), I can say that Algol60 was extremely difficult to use for anything real, since it lacked string processing, data structures, adequate control flow constructs, and separate compilation. (Or so I recall... it's been a while since I've read the Report.)

Backus Normal Form vs. Backus Naur Form

The following exchange comes from a transcript given at the 1978 conference which the book documents:

CHEATHAM: The next question is from Bernie Galler of the University of Michigan, and he asks: "BNF is sometimes pronounced Backus-Naur-Form and sometimes Backus-Normal-Form. What was the original intention?"

NAUR: I don't know where BNF came from in the first place. I don't know -- surely BNF originally meant Backus Normal Form. I don't know who suggested it. Perhaps Ingerman. [This is denied by Peter Z. Ingerman.] I don't know.

CHEATHAM: It was a suggestion that Peter Ingerman proposed then?

NAUR: ... Then the suggestion to change that I think was made by Don Knuth in a letter to the Communications of the ACM, and the justification -- well, he has the justification there. I think I made reference to it, so there you'll find whatever justification was originally made. That's all I would like to say.

[Dec 15, 2006] Ralph Griswold died Lambda the Ultimate

Ralph Griswold, the creator of the Snobol and Icon programming languages, died in October 2006 of cancer. Until recently Computer Science was a discipline whose founders were still around. That's changing. Griswold was an important pioneer of programming language design; Snobol's string manipulation facilities were different from, and somewhat faster than, regular expressions.

Ralph Griswold died two weeks ago. He created several programming languages, most notably Snobol (in the 60s) and Icon (in the 70s) - both outstandingly innovative, integral, and efficacious in their areas. Despite the abundance of scripting and other languages today, Snobol and Icon are still unsurpassed in many respects, both in elegance of design and in practicality.

Ralph Griswold

See also Ralph Griswold 1934-2006 and Griswold Memorial Endowment

Ralph E. Griswold died in Tucson on October 4, 2006, of complications from pancreatic cancer. He was Regents Professor Emeritus in the Department of Computer Science at the University of Arizona.

Griswold was born in Modesto, California, in 1934. He was an award winner in the 1952 Westinghouse National Science Talent Search and went on to attend Stanford University, culminating in a PhD in Electrical Engineering in 1962.

Griswold joined the staff of Bell Telephone Laboratories in Holmdel, New Jersey, and rose to become head of Programming Research and Development. In 1971, he came to the University of Arizona to found the Department of Computer Science, and he served as department head through 1981. His insistence on high standards brought the department recognition and respect. In recognition of his work the university granted him the title of Regents Professor in 1990.

While at Bell Labs, Griswold led the design and implementation of the groundbreaking SNOBOL4 programming language with its emphasis on string manipulation and high-level data structures. At Arizona, he developed the Icon programming language, a high-level language whose influence can be seen in Python and other recent languages.
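
Icon's hallmark was goal-directed evaluation: a built-in such as Icon's find() does not return a single result but generates successive results on demand. Python's generators are a recognizable descendant of that idea; the sketch below (in Python, purely for illustration, with an invented example string) mimics the behavior of Icon's find():

```python
# Icon's find() generates every position of a substring, one at a time,
# resuming only when the consumer asks for another result. A Python
# generator expresses the same lazy, demand-driven style.
def find(sub, s):
    """Yield each position of sub in s, like Icon's find() generator."""
    i = s.find(sub)
    while i >= 0:
        yield i
        i = s.find(sub, i + 1)

positions = list(find("an", "banana"))
print(positions)   # [1, 3]
```

In Icon a failing goal would automatically resume the generator to try the next position; in Python the consumer drives the iteration explicitly, but the lazy production of results is the same.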

Griswold authored numerous books and articles about computer science. After retiring in 1997, his interests turned to weaving. While researching mathematical aspects of weaving design he collected and digitized a large library of weaving documents and maintained a public website. He published technical monographs and weaving designs that inspired the work of others, and he remained active until his final week.

----- Gregg Townsend, Staff Scientist, The University of Arizona

Programming Languages over the Years Julian Blake, IT/PDP

Abstract

This is a personal tour of programming languages which I have encountered since I first learnt to program a computer in 1959.

Assembly codes

Assembly languages have changed over the years, starting very simple (binary or decimal instruction codes and operands), growing to accept mnemonic operation codes and operands in place of numeric ones, and adding macro and conditional assembly features. Nowadays programming in assembly language is rarer than it used to be, and rather simple mnemonic assemblers, sufficient to process the output of a compiler, are used.

The English Electric DEUCE was programmed in binary (one 32-bit word to each row of a punched card). Each instruction had to specify the location of the next instruction to be obeyed, and the way to get a fast program was to place instructions in the mercury delay lines such that there was no unnecessary waiting between instructions. An interesting technical challenge was to write a bootstrap program of twelve instructions on a single punched card. During my time as a pre-university student at the English Electric Company, a staff member was in the process of writing an assembler which would, inter alia, look after instruction placement.

The EDSAC 2 and Titan (ATLAS 2) computers at Cambridge University had decimal assemblers. Both operation codes and operands were specified by decimal numbers. Both computers had built-in firmware to perform common tasks, for example conversion between binary and decimal. The order code of the Titan was designed so that it would be relatively easy to remember the decimal operation codes (the regularities in the order code showed most clearly in decimal).

The most powerful assembler with which I have worked was Control Data's COMPASS macro assembler for the CDC 6000 and 7000 Series central (CP) and peripheral (PP) processors. Operations and register operands were specified by mnemonics. CP instructions were written in a particularly user-friendly form: "SA1 A0+B1" denoted "set address register A1 to the sum of address register A0 and index register B1" (this initiated a read from central memory address A1, as reading was the function of registers A1 to A5). COMPASS was a classical two-pass assembler with macro and conditional assembly features, and generated a full listing showing both the source assembly code and the generated machine code (in octal). CDC's operating systems were written almost entirely in COMPASS assembly language. So too, at CERN, was the software for the Remote Input/Output Stations (RIOS, running on Modular 1 computers) and for SUPERMUX (terminal concentrators running on HP 2100 computers); the technique was to define a set of COMPASS macros for the target computer's order code, and to post-process the output of COMPASS to shorten 60-bit words to the word length of the target.

Cross assemblers were written:

  • in BCPL for the ModComp computers of the CERNET packet switching network, and for the I8080, M6800 and TMS9900 microprocessors;

  • in Pascal for the M6800/6801/6809 and M68000/68020/68030+68881 microprocessors.

  • This list is not exhaustive. There were certainly others, for example the cross assemblers for the ESOP and XOP bit slice processors.

    Structured assembly languages

    Structured assembly languages provide a mixture of access to machine specific features such as registers and addressing modes, and high level language constructs such as begin-end blocks, procedures, if and case statements, and for and while loop constructs. The first such language was Niklaus Wirth's PL/360. At CERN, Robert Russell created PL-11 (for PDP-11) and later PL-VAX.

    I had brief contact with PL-VAX, and with the PL/M language for Intel microprocessors.

    Autocodes

    My first programming language was DEUCE Alphacode. The language provided a set of floating point variables (X1, X2, ..., X2200) and a smaller number of counting (integer) variables (N1, N2, ..., N63). One line of code could perform a single operation, for example "X1 = X2 + X3" or "X4 = ROOT X5". The statements of Alphacode were usually interpreted, not compiled. Writing and using an Alphacode program was an improvement on performing pre-specified calculations on an electro-mechanical calculating machine, my previous activity at English Electric.

    EDSAC 2 and Titan Autocode at Cambridge University supported operations on a fixed set of integer variables (I, J, ...) and of floating point arrays (A[5], B[7], ...). Autocode was a compiled language. Since I was interested in algorithms for syntactic analysis of programming languages at that time, the lack of arrays of integers ruled out the use of Autocode during my doctoral studies.

    Algol 60

    I was introduced to Algol 60 while working as a summer student at Elliott Automation, using the Elliott 803 computer. I remember being amazed when shown a recursive algorithm for the travelling salesman problem, and discovering that Algol did indeed allow a function to call itself.
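
    The surprise is easy to reconstruct: FORTRAN of that era did not allow a procedure to call itself, while Algol 60 did. A minimal sketch of such a recursive exhaustive search for the travelling salesman problem, written in Python for brevity (the distance matrix is invented for illustration):

```python
# Recursive exhaustive search for the travelling salesman problem,
# sketched in Python rather than Algol 60. The point is only that the
# function calls itself, which Algol 60 permitted and contemporary
# FORTRAN did not.
def shortest_tour(dist, current=0, remaining=None):
    """Cheapest cost to visit every remaining city once and return to city 0."""
    if remaining is None:
        remaining = frozenset(range(1, len(dist)))
    if not remaining:                      # all cities visited: go home
        return dist[current][0]
    return min(dist[current][nxt] + shortest_tour(dist, nxt, remaining - {nxt})
               for nxt in remaining)

# Symmetric distance matrix for four cities (invented data).
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(shortest_tour(dist))   # 18
```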

    FORTRAN

    I first encountered FORTRAN IV when I arrived at CERN. How could a computer programmer live to the age of 26 years before using FORTRAN? The answer lies in the traditions of Cambridge University's Mathematical Laboratory, now renamed the Computer Laboratory. Shortly after I left Cambridge, the University bought an IBM computer for its computing service, thereby introducing FORTRAN.

    FORTRAN was the most important computer language at CERN, so much so that people used to say, only half jokingly, that whatever computer language physicists were using in five, ten or fifteen years' time, it would be called FORTRAN. Only when CERN decided to move to C++, instead of making the transition from FORTRAN 77 to Fortran 90, did the use of FORTRAN decline.

    Although Pascal and C were the most frequently used high level languages on microprocessors at CERN, FORTRAN was used too. An extreme example was Hans von der Schmitt's RTF/68K (Real Time Fortran 77 for 68K Processors), whose compiler was written in a compiler description language which was translated into FORTRAN by a compiler generator, also written in FORTRAN.

    SYMPL

    SYMPL was an Algol-like systems implementation language created by Control Data for its 6000 and 7000 Series computers. I used it to write the CP (central processor) part of the 6000 Series software for CERNET; the PP (peripheral processor) part was written in COMPASS assembly language. Dietrich Wiegandt, writing CERNET software for the IBM System 370, had to write it all in assembly language. I was the more fortunate person.

    BCPL

    Martin Richards' BCPL is known as an ancestor of the C programming language. BCPL was used at CERN to write CERNET node software for ModComp II computers, and CERNET host software for PDP-11 and Nord 10. The BCPL compiler was written in BCPL. It ran at CERN as a native compiler for CDC 6000, IBM 370, VAX, PDP-11, Nord 10, HP 2100; and as a cross compiler for ModComp II and TMS9900. Cross assemblers, a linker, a librarian and pre-loaders ("pushers") were written in BCPL for ModComp, I8080, M6800 and TMS9900.

    Ada

    Ada 83 was tried at CERN by an ad hoc evaluation group, but never went much further. It was however used to write the error message module of the MODEL data acquisition system. One Ada critic felt that the inter-task rendezvous mechanism was unsuitable for real-time data acquisition systems. My own experience was that type checking was so strict that even carefully constructed code could be rejected by the compiler for no understandable reason.

    Pascal

    The second generation of CERN cross software for microprocessors was written in Pascal. This included cross assemblers for M6800/6801/6809 and M68K (M68000/68020/68030+68881), and M68K cross compilers for Pascal, Modula-2, FORTRAN 77 and C (unfinished). The FORTRAN cross compiler was little used as there were well established UNIX-derived f77 and C cross compilers. MoniCa, a debugging monitor for M68K, was written in Pascal and assembly language. MoniCa was used in the ALEPH event builder, in VALET-Plus test systems, and elsewhere. The Pascal cross compiler, which originated in Siemens research laboratories and was extended at CERN to become a multi-language compiler, was sufficiently good to see off competition from some commercial Pascal compilers. A few years later, however, DD divisional management decided that the time had come to stop developing microprocessor cross software.

    At the height of its use at CERN, Pascal was considered a safe language in which to write programs, whereas C and BCPL were considered dangerous.

    Modula-2

    Modula-2, Niklaus Wirth's successor to Pascal, was used on M68K hardware in the LEP control system.

    At about the same time, Robert Cailliau in PS division was promoting the use of his enhanced Pascal, P+.

    In both cases, necessary Pascal extensions included facilities for separate compilation and for bit operations.

    Modula-2 was capable of running on a microprocessor without operating system, since its SYSTEM module provided the minimum necessary for accepting interrupts and switching context.

    C

    C came to CERN with UNIX (on PDP-11 and then on VAX), and with the OS-9/68K operating system (on M68K) which was used in LEP experiments' data acquisition systems. It is now the dominant language for control and data acquisition systems. Much code which used to be written in assembly language is today written in C.

    The introduction into C of function prototypes, and the existence of instrumentation and debugging tools such as Purify and Insure++, have made programming in C less hazardous than it used to be.

    The GNU C compiler has set a high standard, below which no commercial C compiler is viable.

    C is the programming language with which I work most often today.

    C++

    C++ has taken over the role of FORTRAN as CERN's scientific data processing language. I have followed the development of C++ with interest, have helped to solve some problems with the use of C++, but have not become a practising C++ programmer. So take my views with a pinch of salt.

    The definition of C++ and its run time library has taken a long time to stabilize. Early users will have discovered how much the language and library have changed over the years. Some of the newer features are still not correctly implemented by current compilers. C++ is so powerful that a programmer can write programs which his colleagues, and perhaps he himself, cannot understand. So select the C++ features which you use with care, look at some of the generated code (it may be more than you expect), and use a good debugging tool (such as Insure++).

    Java

    I admire Java and its class libraries, have been to a Java programming course and read the nutshell book, but do not actually write Java programs today. So my views on Java may be worth even less than my views on C++. Let's simply say that I like Java better than C++, and I think its use will spread well beyond its current application area of Web-based human interfaces.

    Acknowledgments

    In this article I have touched on the work of too many CERN colleagues to mention every one of them by name. Ian Willers was the driving force behind the use of BCPL at CERN. Horst von Eicken drove the second generation of microprocessor support software. Jean Montuelle, David Foster and Jonathan Caves made major contributions to the microprocessor cross software.


    About the author(s): Julian Blake was the leader of the DD/SW/LM (languages and microprocessor support) section before it was dissolved.

    John Reid, convenor of the ISO Fortran standards committee, has posted the following announcement to some Fortran-related forums: 'I am pleased to tell you that the draft Fortran 2000 standard is now out for comment. ... The J3 (USA Fortran committee) version, which is identical except for the title page and the headers and footers, is available in ps, pdf, text, or source (latex). This is a very significant milestone for Fortran 2000. It is a major extension of Fortran 95 that has required a significant amount of development work by the J3. ... The abstract of the revision, which lists the major enhancements, is appended. I have written an informal description of the new features, which will be published in the next issue of Fortran Forum (about to appear).'

    Kristen Nygaard (1926-2002)

    Yet another one of the fathers of CS has passed away. :-( RIP.

    Edsger W. Dijkstra: 1930-2002

    Another language pioneer. RIP.

    How Object-Oriented Programming Started

    The History of Simula

    Comp.compilers Re XPL Language

    Re: XPL Language

    From comp.compilers

    From: [email protected] (Steve Meyer)
    Newsgroups: comp.compilers
    Date: 27 Aug 2000 22:27:57 -0400
    Organization: Compilers Central
    References: 00-06-118 00-07-016 00-07-075 00-08-018 00-08-028 00-08-055 00-08-083
    Keywords: history

    I am not sure it makes sense to continue this historical discussion,
    but I think there is a lot more to the story. The roots of modern
    computing lie in this story. For example, although both PL360 and XPL
    were available, why did Professor Knuth use assembler in his Art of
    Computer Programming books? Also, these original languages (and the
    Bell Labs counterparts) arose in academic (Letters and Science)
    computer science departments, but now computing is studied in EE
    departments.

    On 13 Aug 2000 19:10:55 -0400, Duane Sand <[email protected]> wrote:
    >Steve Meyer wrote in message 00-08-055...
    >>>>>: Peter Flass <[email protected]> wrote:
    >>>>>: > XPL, developed in the 1970's was one of the earliest "compiler
    >>>>>: > compilers", was widely ported, and was the basis for a number of
    >other
    >>>>>: > languages such as the PL/M family.
    >>
    >>I think PL/M and XPL came from different worlds that did not
    >>communicate. I think people saw XPL as too high level. I think PL/M
    >>came from other system level languages such as PL/360 (?). My
    >>recollection may not be right.
    >
    >Niklaus Wirth developed PL360 as an alternative to writing IBM360
    >assembly code directly. It was a quick one-person project. The
    >parser used "operator precedence" techniques which predated practical
    >LR methods. The tables could be worked out by hand in no time but the
    >method couldn't handle BNFs of most languages. It was quite low
    >level, mapping infix syntactic forms directly to single 360

    Parsing may have gotten tenure for lots of professors, but the most
    advanced programming language areas, such as HDLs (hardware
    description languages), now use the "predated" operator precedence
    methods. Also, Professor Wirth's languages have remained at the
    forefront of academic programming languages.

    >instructions without any optimizations. The PL360 paper inspired lots
    >of people to develop their own small languages.

    I think PL360 was very popular within IBM and among the then
    "modernist" movement away from assembly language. I know it was very
    popular at SLAC.

    >McKeeman etc developed XPL on 360 as a tidy subset of PL/I that could
    >be implemented by a few people and be useful in coding biggish things,
    >including the compiler and parser generator. The parser was initially
    >based on their extensions to operator precedence, which relaxed BNF
    >restrictions but required use of a parser generator tool and was still
    >limited compared to LR. XPL was "high level" only in having built-in
    >a varying-length string data type supported by a garbage collector.
    >There were no struct types.

    I think it was hard back then to differentiate XPL from McKeeman's
    advocacy of the Burroughs B5500-style stack machine research program.

    >Univ of Washington ported XPL onto SDS/Xerox systems that were like
    >360 but with one instruction format.
    >
    >UW graduate Gary Kildall developed Intel's first programming tools for
    >the 8008 and 8080, in trade for a very early portable computer: an
    >8008 without keyboard or monitor, installed in a briefcase. Kildall
    >used these (plus a floppy drive adapted by UW grad John Torode) to
    >develop CP/M, the precursor to MS DOS. The Intel tools included an
    >assembler and PL/M, both coded in Fortran. PL/M was inspired by the
    >example of PL360 and the implementation methods of XPL. Kildall left
    >before UW's XPL project but was likely very aware of it.
    >
    >PL/M's level was limited by the 8008's near inability to support proc
    >calls. The first micro language to see significant use was Basic,
    >implemented by assembler-coded interpreters. Implementing real
    >applications in real compiled languages required later chips with
    >nicer instruction sets, eg 8088 (gag) and M6800.

    As the Z80 showed, the 8088 was only one index register away from
    being a real computer.

    The historical question, I think, is why there was so little
    communication between the currently most popular BCPL, B, C, C++
    research program and the Stanford/Silicon Valley research program.

    Just my two cents.
    /Steve
    --
    Steve Meyer Phone: (415) 296-7017
    Pragmatic C Software Corp. Fax: (415) 296-0946
    220 Montgomery St., Suite 925 email: [email protected]
    San Francisco, CA 94104

    FORTRANSIT -- the 650 Processor that made FORTRAN

    The FORTRANSIT story is covered in the Annals of the History of Computing [4, 5], but an additional and more informal slant doesn't hurt.

    Lisp History Web Page

    Computer simulation history: where to find references

    [Mar 3, 2006] ACM Press Release, March 01, 2006

    BTW, John Backus's extremely speculative 1977 ACM Turing Award lecture "Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs" can be found here. As E. W. Dijkstra noted: "The article is a progress report on a valid research effort but suffers badly from aggressive overselling of its significance. This is the more regrettable as it has been published by way of Turing Award Lecture."
    From Slashdot: "It's interesting that Peter Naur is being recognized 40 years later, when another Algol team member, Alan Perlis, received the first Turing Award in 1966. Here's a photo of Perlis, Naur and the other Algol 1960 conference participants. [tugurium.com] ".

    Recommended Links



    (Abstract of Peter Naur's "Programming as Theory Building".) Some views on programming, taken in a wide sense and regarded as a human activity, are presented. Accepting that programs will not only have to be designed and produced, but also modified so as to cater for changing demands, it is concluded that the proper, primary aim of programming is, not to produce programs, but to have the programmers build theories of the manner in which the problems at hand are solved by program execution. The implications of such a view of programming on matters such as program life and modification, system development methods, and the professional status of programmers, are discussed.




    The Last but not Least: Technology is dominated by two types of people: those who understand what they do not manage and those who manage what they do not understand. ~ Archibald Putt, Ph.D.


    Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyrights belong to their respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.

    FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.

    This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...

    You can use PayPal to buy a cup of coffee for the authors of this site

    Disclaimer:

    The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the Softpanorama society. We do not warrant the correctness of the information provided or its fitness for any purpose. The site uses AdSense, so you need to be aware of the Google privacy policy. If you do not want to be tracked by Google, please disable Javascript for this site. This site is perfectly usable without Javascript.

    Last modified: March 12, 2019