May the source be with you, but remember the KISS principle ;-)

Scriptorama: A Slightly Skeptical View
on Scripting Languages


This is the central page of the Softpanorama website, because I am strongly convinced that the development of scripting languages, and not the replication of BSD-group efforts undertaken by Stallman and Torvalds, is the central part of open source. See Scripting languages as VHLL for more details.

Ordinarily technology changes fast. But programming languages are different: programming languages are not just technology, but what programmers think in.

They're half technology and half religion. And so the median language, meaning whatever language the median programmer uses, moves as slow as an iceberg.

Paul Graham: Beating the Averages

Libraries are more important than the language.

Donald Knuth


A fruitful way to think about language development is to consider it a special type of theory building. Peter Naur suggested that programming in general is a theory-building activity in his 1985 paper "Programming as Theory Building". But the idea is especially applicable to compilers and interpreters. What Peter Naur failed to understand was that the design of programming languages has religious overtones and sometimes represents an activity which is pretty close to the process of creating a new, obscure cult ;-). Clueless academics publishing junk papers at obscure conferences are the high priests of the church of programming languages. Some, like Niklaus Wirth and Edsger W. Dijkstra, temporarily reached a status close to that of (false) prophets :-).

On a deep conceptual level, building a new language is a human way of solving complex problems. That means that compiler construction is probably the most underappreciated paradigm for programming large systems. Much more so than greatly oversold object-oriented programming, whose benefits are greatly overstated.

For users, programming languages distinctly have religious aspects, so decisions about what language to use are often far from rational and are mainly cultural. Indoctrination at the university plays a very important role. Recently universities were instrumental in making Java the new Cobol.

The second important observation about programming languages is that a language per se is just a tiny part of what can be called the language programming environment. The latter includes libraries, IDEs, books, the level of adoption at universities, popular and important applications written in the language, the level of support, and the key players that support the language on major platforms such as Windows and Linux.

A mediocre language with a good programming environment can give a run for their money to languages superior in design but "naked". This is the story behind the success of Java and PHP. A critical application is also very important, and this is the story of the success of PHP, which is nothing but a bastardized derivative of Perl (with all the most interesting Perl features surgically removed ;-) adapted to the creation of dynamic websites using the so-called LAMP stack.

Progress in programming languages has been very uneven and contains several setbacks. Currently this progress is mainly limited to the development of so-called scripting languages. The field of traditional high-level languages has been stagnant for decades.

At the same time there are some mysterious, unanswered questions about the factors that help a language to succeed or fail. Among them:

Those are difficult questions to answer without some way of classifying languages into different categories. Several such classifications exist. First of all, as with natural languages, the number of people who speak a given language is a tremendous force that can overcome any real or perceived deficiencies of the language. In programming languages, as in natural languages, nothing succeeds like success.

The second interesting category is the number of applications written in a particular language that became part of Linux or, at least, are included in the standard RHEL/Fedora/CentOS or Debian/Ubuntu repositories.

The third relevant category is the number and quality of books for the particular language.

Complexity Curse

The history of programming languages raises interesting general questions about the limits of complexity of programming languages. There is strong historical evidence that a language with a simpler, or even simplistic, core (Basic, Pascal) has better chances of acquiring a high level of popularity.

The underlying fact here is probably that most programmers are at best mediocre, and such programmers tend, on an intuitive level, to avoid more complex, richer languages, preferring, say, Pascal to PL/1 and PHP to Perl. Or at least they avoid them at a particular phase of language development (C++ is not a simpler language than PL/1, but it was widely adopted because of the progress of hardware, the availability of compilers and, not least, because it was associated with OO exactly at the time OO became a mainstream fashion).

Complex non-orthogonal languages can succeed only as a result of a long period of language development from a smaller core (which usually adds complexity -- just compare Fortran IV with Fortran 90, or PHP 3 with PHP 5). Carrying the banner of some fashionable new trend, extending an existing popular language to the new "paradigm", is also a possibility (OO programming in the case of C++, which is a superset of C).

Historically, few complex languages were successful (PL/1, Ada, Perl, C++), and even when they were successful, their success was typically temporary rather than permanent (PL/1, Ada, Perl). As Professor Wilkes noted (iee90):

Things move slowly in the computer language field but, over a sufficiently long period of time, it is possible to discern trends. In the 1970s, there was a vogue among system programmers for BCPL, a typeless language. This has now run its course, and system programmers appreciate some typing support. At the same time, they like a language with low level features that enable them to do things their way, rather than the compiler’s way, when they want to.

They continue to have a strong preference for a lean language. At present they tend to favor C in its various versions. For applications in which flexibility is important, Lisp may be said to have gained strength as a popular programming language.

Further progress is necessary in the direction of achieving modularity. No language has so far emerged which exploits objects in a fully satisfactory manner, although C++ goes a long way. ADA was progressive in this respect, but unfortunately it is in the process of collapsing under its own great weight.

ADA is an example of what can happen when an official attempt is made to orchestrate technical advances. After the experience with PL/1 and ALGOL 68, it should have been clear that the future did not lie with massively large languages.

I would direct the reader’s attention to Modula-3, a modest attempt to build on the appeal and success of Pascal and Modula-2 [12].

The complexity of the compiler/interpreter also matters, as it affects portability: this is one thing that probably doomed PL/1 (and later Ada), although these days a new language typically comes with an open source compiler (or, in the case of scripting languages, an interpreter), so this is less of a problem.

Here is an interesting take on language design from the preface to The D Programming Language book:

Programming language design seeks power in simplicity and, when successful, begets beauty.

Choosing the trade-offs among contradictory requirements is a difficult task that requires good taste from the language designer as much as mastery of theoretical principles and of practical implementation matters. Programming language design is software-engineering-complete.

D is a language that attempts to consistently do the right thing within the constraints it chose: system-level access to computing resources, high performance, and syntactic similarity with C-derived languages. In trying to do the right thing, D sometimes stays with tradition and does what other languages do, and other times it breaks tradition with a fresh, innovative solution. On occasion that meant revisiting the very constraints that D ostensibly embraced. For example, large program fragments or indeed entire programs can be written in a well-defined memory-safe subset of D, which entails giving away a small amount of system-level access for a large gain in program debuggability.

You may be interested in D if the following values are important to you:

The role of fashion

At the initial, most difficult stage of language development, the language should solve an important problem that was inadequately solved by currently popular languages. But at the same time the language has little chance to succeed unless it fits perfectly into the current software fashion. This "fashion factor" is probably as important as several other factors combined, with the notable exception of the "language sponsor" factor. The latter can make or break a language.

As in women's dress, fashion rules in language design, and with time this trend has become more and more pronounced. A new language should represent the current fashionable trend. For example, OO programming was the entry ticket into the world of "big, successful languages" since probably the early '90s (C++, Java, Python). Before that, "structured programming" and "verification" (Pascal, Modula) played a similar role.

Programming environment and the role of "powerful sponsor" in language success

PL/1, Java, C#, Ada and Python are languages that had powerful sponsors. Pascal, Basic, Forth and, partially, Perl (O'Reilly was a sponsor for a short period of time) are examples of languages that had no such sponsor during the initial period of development. C and C++ are somewhere in between.

But the language itself is not enough. Any language now needs a "programming environment", which consists of a set of libraries, a debugger and other tools (a make tool, lint, a pretty-printer, etc.). The set of "standard" libraries and the debugger are probably the two most important elements. They cost a lot of time (or money) to develop, and here the role of a powerful sponsor is difficult to overestimate.

While this is not a necessary condition for becoming popular, it really helps: other things being equal, the weight of the sponsor of the language does matter. For example Java, being a weak, inconsistent language (C-- with garbage collection and OO), was pushed down people's throats on the strength of marketing and the huge amount of money spent on creating the Java programming environment.

The same was partially true for C# and Python. That's why Python, despite its "non-Unix" origin, is a more viable scripting language now than, say, Perl (which is better integrated with Unix and has pretty innovative, for scripting languages, support of pointers and regular expressions), or Ruby (which has had support of coroutines from day one, not as a "bolted-on" feature as in Python).

As in political campaigns, negative advertising also matters. For example, Perl suffered greatly from smears comparing programs written in it to "line noise", and then from the withdrawal of O'Reilly from the role of sponsor of the language (although it continues to milk the Perl book-publishing franchise ;-).

People have proved to be pretty gullible, and in this sense language marketing is not that different from women's clothing marketing :-)

Language level and success

One very important classification of programming languages is based on the so-called level of the language. Essentially, once there is at least one language that is successful on a given level, the success of other languages on the same level becomes more problematic. The best chances for success belong to languages whose level is even slightly higher than that of successful predecessors.

The level of the language can informally be described as the number of statements (or, more correctly, the number of lexical units (tokens)) needed to write a solution to a particular problem in one language versus another. This way we can distinguish several levels of programming languages:
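To make the token-count idea concrete, here is a rough sketch in Python: the same task written once in a low-level, C-like style and once in a high-level style. The task and function names are illustrative only, not from any benchmark; the point is that the high-level version needs a fraction of the lexical units.

```python
# Rough illustration of "language level": the same task, counting word
# frequencies, written at two levels of abstraction within Python.

from collections import Counter

def count_low_level(text):
    # Low-level style: explicit loop and dictionary bookkeeping,
    # roughly what a C-like language forces you to spell out.
    counts = {}
    for word in text.split():
        if word in counts:
            counts[word] += 1
        else:
            counts[word] = 1
    return counts

def count_high_level(text):
    # High-level style: one expression, far fewer tokens.
    return Counter(text.split())

text = "to be or not to be"
assert count_low_level(text) == count_high_level(text)
print(count_high_level(text)["to"])  # 2
```

Counting tokens by hand in the two bodies gives a crude but usable estimate of the "level" difference the paragraph above describes.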

 "Nanny languages" vs "Sharp razor" languages

Some people distinguish between "nanny languages" and "sharp razor" languages. The latter do not attempt to protect the user from his errors, while the former usually go too far... The right compromise is extremely difficult to find.

For example, I consider the explicit availability of pointers an important feature of a language that greatly increases its expressive power and far outweighs the risks of errors in the hands of unskilled practitioners. In other words, attempts to make a language "safer" often misfire.

Expressive style of the languages

Another useful typology is based on the expressive style of the language:

Those categories are not pure and somewhat overlap. For example, it is possible to program in an object-oriented style in C, or even in assembler. Some scripting languages like Perl have built-in regular expression engines that are part of the language, so they have a declarative component despite being procedural. Some relatively low-level languages (Algol-style languages) implement garbage collection; a good example is Java. There are also scripting languages that compile into a common language runtime designed for high-level languages: for example, IronPython compiles into .NET.
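As a small illustration of how these styles overlap within a single language, here is the same computation written in procedural, functional and object-oriented style in Python (the example is my own sketch, not taken from any particular source):

```python
# The same computation (sum of squares of the even numbers) in three
# expressive styles, showing that the categories coexist in one language.

def procedural(nums):
    # Procedural style: explicit loop and mutable accumulator.
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n
    return total

def functional(nums):
    # Functional style: one expression, no mutation.
    return sum(n * n for n in nums if n % 2 == 0)

class Accumulator:
    # Object-oriented style: state lives in an object.
    def __init__(self):
        self.total = 0
    def feed(self, n):
        if n % 2 == 0:
            self.total += n * n

nums = [1, 2, 3, 4]
acc = Accumulator()
for n in nums:
    acc.feed(n)
assert procedural(nums) == functional(nums) == acc.total == 20
```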

Weak correlation between quality of design and popularity

The popularity of programming languages is not strongly connected to their quality. Some languages that look like a collection of language-designer blunders (PHP, Java) became quite popular. Java became the new Cobol, and PHP dominates dynamic website construction. The dominant technology for such websites is often called LAMP, which stands for Linux-Apache-MySQL-PHP. Being a highly simplified but badly constructed subset of Perl, a kind of new Basic for dynamic website construction, PHP provides a most depressing experience. I was unpleasantly surprised when I learned that the Wikipedia engine was rewritten from Perl to PHP some time ago, but this fact illustrates the trend well.

So language design quality has little to do with a language's success in the marketplace. Simpler languages have wider appeal, as the success of PHP (which at the beginning came at the expense of Perl) suggests. In addition, much depends on whether the language has a powerful sponsor, as was the case with Java (Sun and IBM) as well as Python (Google).

Progress in programming languages has been very uneven and contains several setbacks, like Java. Currently this progress is usually associated with scripting languages. The history of programming languages raises interesting general questions about the "laws" of programming language design. First, let's reproduce several notable quotes:

  1. Knuth law of optimization: "Premature optimization is the root of all evil (or at least most of it) in programming." - Donald Knuth
  2. "Greenspun's Tenth Rule of Programming: any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp." - Phil Greenspun
  3. "The key to performance is elegance, not battalions of special cases." - Jon Bentley and Doug McIlroy
  4. "Some may say Ruby is a bad rip-off of Lisp or Smalltalk, and I admit that. But it is nicer to ordinary people." - Matz, LL2
  5. "Most papers in computer science describe how their author learned what someone else already knew." - Peter Landin
  6. "The only way to learn a new programming language is by writing programs in it." - Kernighan and Ritchie
  7. "If I had a nickel for every time I've written "for (i = 0; i < N; i++)" in C, I'd be a millionaire." - Mike Vanier
  8. "Language designers are not intellectuals. They're not as interested in thinking as you might hope. They just want to get a language done and start using it." - Dave Moon
  9. "Don't worry about what anybody else is going to do. The best way to predict the future is to invent it." - Alan Kay
  10. "Programs must be written for people to read, and only incidentally for machines to execute." - Abelson & Sussman, SICP, preface to the first edition

Please note that it is one thing to read a language manual and appreciate how good the concepts are, and quite another to bet your project on a new, unproven language without good debuggers, manuals and, very importantly, libraries. The debugger is very important, but standard libraries are crucial: they represent a factor that makes or breaks new languages.

In this sense languages are much like cars. For many people a car is the thing they use to get to work and to the shopping mall, and they are not very interested in whether the engine is inline or V-type, or whether the transmission uses fuzzy logic. What they care about is safety, reliability, mileage, insurance and the size of the trunk. In this sense "worse is better" is very true. I already mentioned the importance of the debugger. The other important criterion is the quality and availability of libraries. Actually, libraries account for perhaps 80% of the usability of a language; in a sense libraries are more important than the language itself...

The popular belief that scripting is an "unsafe", "second-rate" or "prototype-only" solution is completely wrong. If a project dies, it does not matter what the implementation language was; and for any successful project with a tough schedule, a scripting language (especially in a dual scripting-language-plus-C combination, for example Tcl+C) is an optimal blend for a large class of tasks. Such an approach helps to separate architectural decisions from implementation details much better than any OO model does.

Moreover, even for tasks that handle a fair amount of computation and data (computationally intensive tasks), such languages as Python and Perl are often (but not always!) competitive with C++, C# and, especially, Java.



Programming Language Development Timeline

Here is the timeline of programming languages, modified from Byte (for the original, see the September 1995 20th-anniversary issue).


Special note on Scripting languages

Scripting helps to avoid the OO trap that is pushed by
  "a horde of practically illiterate researchers
publishing crap papers in junk conferences."

Despite the fact that scripting languages are a really important computer science phenomenon, they are usually happily ignored in university curricula. Students are usually indoctrinated (or, in less politically correct terms, "brainwashed") in Java and OO programming ;-)

This site tries to give scripting languages proper emphasis and promotes scripting languages as an alternative to the mainstream reliance on the "Java as the new Cobol" approach to software development. Please read my introduction to the topic, which was recently converted into the article A Slightly Skeptical View on Scripting Languages.

The tragedy of the scripting language designer is that there is no way to overestimate the level of abuse of any feature of the language. Half of all programmers are by definition below average, and it is this half that matters most in an enterprise environment. In a way, the higher the level of the programmer, the less relevant the limitations of the language are for him. That's why statements like "Perl is badly suited for large project development" are plain vanilla silly. With proper discipline it is perfectly suitable, and programmers can be more productive with Perl than with Java. The real question is: "What are the team's quality and quantity?"

Scripting is part of the Unix cultural tradition, and Unix was the initial development platform for most mainstream scripting languages, with the exception of REXX. But they are portable and can now all be used on Windows and other OSes.


Different scripting languages provide different levels of integration with the base OS API (for example, Unix or Windows). For example, IronPython compiles into .NET and provides a pretty high level of integration with Windows. The same is true of Perl and Unix: almost all Unix system calls are available directly from Perl. Moreover, Perl integrates most of the Unix API in a very natural way, making it a perfect replacement for the shell for coding complex scripts. It also has a very good debugger, which is a weak point of shells like bash and ksh93.
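By way of illustration (using Python rather than Perl here, but the point about direct system-call access is the same), a scripting language can invoke Unix APIs with no C glue at all:

```python
# Scripting languages expose Unix APIs directly: Python's os module,
# like Perl's built-ins, wraps system calls such as stat(2) and getpid(2).
import os
import tempfile

info = os.stat(tempfile.gettempdir())  # stat(2), no C wrapper needed
assert info.st_size >= 0               # file size straight from the kernel

pid = os.getpid()                      # getpid(2)
assert pid > 0
```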

Unix proved that treating everything like a file is a powerful OS paradigm. In a similar way scripting languages proved that "everything is a string" is also an extremely powerful programming paradigm.
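A minimal Python sketch of the "everything is a string" paradigm: an /etc/passwd-style record (the sample line is made up) is parsed and rewritten purely with string operations, with no record types or schemas in sight:

```python
# "Everything is a string": a Unix-ish record processed entirely
# with string operations, the paradigm scripting languages are built on.
import re

passwd_line = "alice:x:1001:1001:Alice:/home/alice:/bin/bash"

# Fields are just substrings separated by a delimiter.
fields = passwd_line.split(":")
user, shell = fields[0], fields[-1]
assert user == "alice" and shell == "/bin/bash"

# Transformation is string rewriting, here via a regex substitution.
locked = re.sub(r"/bin/\w+$", "/sbin/nologin", passwd_line)
assert locked.endswith("/sbin/nologin")
```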


There are also several separate pages devoted to scripting in different applications. The main emphasis is on shells and Perl. Right now I am trying to convert my old Perl lecture notes into an eBook: Nikolai Bezroukov, Introduction to Perl for Unix System Administrators.

Along with pages devoted to major scripting languages, this site has many pages devoted to scripting in different applications. There are more than a dozen "Perl/scripting tools for a particular area" pages. The most well-developed and up-to-date pages of this set are probably Shells and Perl. This page's main purpose is to follow the changes in programming practices that can be called the "rise of scripting", as predicted in the famous John Ousterhout article Scripting: Higher Level Programming for the 21st Century in IEEE Computer (1998). In this brilliant paper he wrote:

...Scripting languages such as Perl and Tcl represent a very different style of programming than system programming languages such as C or Java. Scripting languages are designed for "gluing" applications; they use typeless approaches to achieve a higher level of programming and more rapid application development than system programming languages. Increases in computer speed and changes in the application mix are making scripting languages more and more important for applications of the future.

...Scripting languages and system programming languages are complementary, and most major computing platforms since the 1960's have provided both kinds of languages. The languages are typically used together in component frameworks, where components are created with system programming languages and glued together with scripting languages. However, several recent trends, such as faster machines, better scripting languages, the increasing importance of graphical user interfaces and component architectures, and the growth of the Internet, have greatly increased the applicability of scripting languages. These trends will continue over the next decade, with more and more new applications written entirely in scripting languages and system programming languages used primarily for creating components.

My e-book Portraits of Open Source Pioneers contains several chapters on scripting (most are in early draft stage) that expand on this topic. 

The reader must understand that the treatment of scripting languages in the press, and especially the academic press, is far from fair: entrenched academic interests often promote old or commercially supported paradigms until they retire, so a change of paradigm is often possible only with a change of generations. And people tend to live longer these days... Please also be aware that even respectable academic magazines like Communications of the ACM and IEEE Software often promote "cargo cult software engineering" like the Capability Maturity Model (CMM).

Dr. Nikolai Bezroukov


Old News ;-)


[Sep 18, 2017] The Fall Of Perl, The Web's Most Promising Language, by Conor Myhrvold

The author is way too superficial to say something of importance; he is just a syntax junkie. The real story is that Python has a less steep initial learning curve, and that helped to entrench it in universities. The rest is history. Google support was also a positive factor. Python also basked in OO hype.
Jan 13, 2014 |

And the rise of Python. Does Perl have a future?

I first heard of Perl when I was in middle school in the early 2000s. It was one of the world's most versatile programming languages, dubbed the Swiss army knife of the Internet. But compared to its rival Python, Perl has faded from popularity. What happened to the web's most promising language? Perl's low entry barrier compared to compiled, lower level language alternatives (namely, C) meant that Perl attracted users without a formal CS background (read: script kiddies and beginners who wrote poor code). It also boasted a small group of power users ("hardcore hackers") who could quickly and flexibly write powerful, dense programs that fueled Perl's popularity to a new generation of programmers.

A central repository (the Comprehensive Perl Archive Network, or CPAN ) meant that for every person who wrote code, many more in the Perl community (the Programming Republic of Perl ) could employ it. This, along with the witty evangelism by eclectic creator Larry Wall , whose interest in language ensured that Perl led in text parsing, was a formula for success during a time in which lots of text information was spreading over the Internet.

As the 21st century approached, many pearls of wisdom were wrought to move and analyze information on the web. Perl did have a learning curve–often meaning that it was the third or fourth language learned by adopters–but it sat at the top of the stack.

"In the race to the millennium, it looks like C++ will win, Java will place, and Perl will show," Wall said in the third State of Perl address in 1999. "Some of you no doubt will wish we could erase those top two lines, but I don't think you should be unduly concerned. Note that both C++ and Java are systems programming languages. They're the two sports cars out in front of the race. Meanwhile, Perl is the fastest SUV, coming up in front of all the other SUVs. It's the best in its class. Of course, we all know Perl is in a class of its own."

Then came the upset.

The Perl vs. Python Grudge Match

Then Python came along. Compared to Perl's straight-jacketed scripting, Python was a lopsided affair. It even took after its namesake, Monty Python's Flying Circus. Fittingly, most of Wall's early references to Python were lighthearted jokes at its expense. Well, the millennium passed, computers survived Y2K , and my teenage years came and went. I studied math, science, and humanities but kept myself an arm's distance away from typing computer code. My knowledge of Perl remained like the start of a new text file: cursory , followed by a lot of blank space to fill up.

In college, CS friends at Princeton raved about Python as their favorite language (in spite of popular professor Brian Kernighan on campus, who helped popularize C). I thought Python was new, but I later learned it was around when I grew up as well, just not visible on the charts.

By the late 2000s Python was not only the dominant alternative to Perl for many text parsing tasks typically associated with Perl (i.e. regular expressions in the field of bioinformatics ) but it was also the most proclaimed popular language , talked about with elegance and eloquence among my circle of campus friends, who liked being part of an up-and-coming movement.

Side By Side Comparison: Binary Search

Despite Python and Perl's well documented rivalry and design decision differences–which persist to this day–they occupy a similar niche in the programming ecosystem. Both are frequently referred to as "scripting languages," even though later versions are retro-fitted with object oriented programming (OOP) capabilities.

Stylistically, Perl and Python have different philosophies. Perl's best known mottos is " There's More Than One Way to Do It ". Python is designed to have one obvious way to do it. Python's construction gave an advantage to beginners: A syntax with more rules and stylistic conventions (for example, requiring whitespace indentations for functions) ensured newcomers would see a more consistent set of programming practices; code that accomplished the same task would look more or less the same. Perl's construction favors experienced programmers: a more compact, less verbose language with built-in shortcuts which made programming for the expert a breeze.
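The article's side-by-side code listings are not reproduced here; as a sketch, the Python half of such a binary search comparison might look like this (a minimal iterative version of my own, not the article's code):

```python
# A minimal iterative binary search over a sorted list.

def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # midpoint of the remaining range
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return -1

data = [3, 7, 11, 15, 19]
assert binary_search(data, 11) == 2
assert binary_search(data, 4) == -1
```

The compact, whitespace-delimited form is exactly the kind of "one obvious way" code the comparison in the article is about.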

During the dotcom era and the tech recovery of the mid to late 2000s, high-profile websites and companies such as Dropbox (Python) and Amazon and Craigslist (Perl), in addition to some of the world's largest news organizations (BBC, Perl) used the languages to accomplish tasks integral to the functioning of doing business on the Internet. But over the course of the last 15 years, not only has how companies do business changed and grown, but so have the tools they use, unequally to the detriment of Perl. (A growing trend that was identified in the last comparison of the languages, "A Perl Hacker in the Land of Python," as well as from the Python side a Pythonista's evangelism aggregator, also done in the year 2000.)

Perl's Slow Decline

Today, Perl's growth has stagnated. At the Orlando Perl Workshop in 2013, one of the talks was titled " Perl is not Dead, It is a Dead End ," and claimed that Perl now existed on an island. Once Perl programmers checked out, they always left for good, never to return. Others point out that Perl is left out of the languages to learn first –in an era where Python and Java had grown enormously, and a new entrant from the mid-2000s, Ruby, continues to gain ground by attracting new users in the web application arena (via Rails ), followed by the Django framework in Python (PHP has remained stable as the simplest option as well).

In bioinformatics, where Perl's position as the most popular scripting language powered many 1990s breakthroughs like genetic sequencing, Perl has been supplanted by Python and the statistical language R (a variant of S-PLUS and descendant of S, also developed in the 1980s).

In scientific computing, my present field, Python, not Perl, is the open source overlord, even expanding at Matlab's expense (also a child of the 1980s , and similarly retrofitted with OOP abilities ). And upstart PHP grew in size to the point where it is now arguably the most common language for web development (although its position is dynamic, as Ruby and Python have quelled PHP's dominance and are now entrenched as legitimate alternatives.)

While Perl is not in danger of disappearing altogether, it is in danger of losing cultural relevance , an ironic fate given Wall's love of language. How has Perl become the underdog, and can this trend be reversed? (And, perhaps more importantly, will Perl 6 be released!?)

How I Grew To Love Python

Why Python, and not Perl? Perhaps an illustrative example of what happened to Perl is my own experience with the language. In college, I stuck to the contained environments of Matlab and Mathematica, but my programming perspective changed dramatically in 2012. I realized that my lack of experience with structured code outside the "walled garden" of a desktop application prevented me from fully simulating hypotheses about the natural world, let alone analyzing data sets using the web, which was becoming an increasingly intellectually and financially lucrative skill set.

One year after college, I resolved to learn a "real" programming language in a serious manner: An all-in immersion taking me over the hump of knowledge so that, even if I took a break, I would still retain enough to pick up where I left off. An older alum from my college who shared similar interests–and an experienced programmer since the late 1990s–convinced me of his favorite language to sift and sort through text in just a few lines of code, and "get things done": Perl. Python, he dismissed, was "what academics used to think." I was about to be acquainted formally.

Before making a definitive decision on which language to learn, I took stock of online resources, lurked on PerlMonks , and acquired several used O'Reilly books, the Camel Book and the Llama Book , in addition to other beginner books. Yet once again, Python reared its head , and even Perl forums and sites dedicated to the language were lamenting the digital siege their language was succumbing to . What happened to Perl? I wondered. Ultimately undeterred, I found enough to get started (quality over quantity, I figured!), and began studying the syntax and working through examples.

But it was not to be. In trying to overcome the engineered flexibility of Perl's syntax choices, I hit a wall. I had adopted Perl for text analysis, but upon accepting an engineering graduate program offer, switched to Python to prepare.

By this point, CPAN's enormous advantage had been whittled away by ad hoc, hodgepodge efforts from uncoordinated but overwhelming groups of Pythonistas that now assemble in Meetups , at startups, and on college and corporate campuses to evangelize the Zen of Python . This has created a lot of issues with importing ( pointed out by Wall ), and package download synchronizations to get scientific computing libraries (as I found), but has also resulted in distributions of Python such as Anaconda that incorporate the most important libraries besides the standard library to ease the time tariff on imports.

As if to capitalize on the zeitgeist, technical book publisher O'Reilly ran this ad , inflaming Perl devotees.

By 2013, Python was the language of choice in academia, where I was to return for a year, and whatever it lacked in OOP classes, it made up for in college classes. Python also had Google in its corner: the company helped spread the language and employed van Rossum for many years. Meanwhile, its adversary Yahoo (largely developed in Perl ) did well, but comparatively fell further behind in defining the future of programming. Python was the favorite and the incumbent; roles had been reversed.

So after six months of Perl-making effort, this straw of reality broke the Perl camel's back and caused a coup that overthrew the programming Republic which had established itself on my laptop. I sheepishly abandoned the llama . Several weeks later, the tantalizing promise of a new MIT edX course teaching general CS principles in Python, in addition to numerous n00b examples , made Perl's syntax all too easy to forget instead of regret.

Measurements of the popularity of programming languages, in addition to friends and fellow programming enthusiasts I have met in the development community in the past year and a half, have confirmed this trend, along with the rise of Ruby in the mid-2000s, which has also eaten away at Perl's ubiquity in stitching together programs written in different languages.

Historically, many arguments could explain away any one of these studies: perhaps Perl programmers do not cheerlead their language as much, since they are too busy productively programming; job listings or search-engine hits could mean that a programming language has many errors and issues, or simply that there is a large temporary gap between supply and demand.

The concomitant picture, and one that many in the Perl community now acknowledge, is that Perl is now essentially a second-tier language, one that has its place but will not be among the first several languages learned outside the Computer Science domain, such as Java, C, or now Python.

The Future Of Perl (Yes, It Has One)

I believe Perl has a future , but it may be one for a limited audience. Present-day Perl is more suitable to users who have worked with the language from its early days , already dressed to impress . Perl's quirky stylistic conventions, such as using $ in front to declare variables, stand in contrast to the other $ that matters to practical programmers today: the money that goes into the continued development and feature sets of Perl's frenemies such as Python and Ruby. There is also the high activation cost of learning Perl, compared with simply implementing a Python solution. Ironically, much in the same way that Perl jested at other languages, Perl now finds itself at the receiving end . What's wrong with Perl , from my experience? Perl's eventual problem is that if the Perl community cannot attract beginner users the way Python successfully has, it runs the risk of becoming like Children of Men , dwindling away to a standstill; vast repositories of hieroglyphic code looming in sections of the Internet and in data center partitions like the halls of the Mines of Moria . (Awe-inspiring and historical? Yes. Lively? No.)

Perl 6 has been an ongoing development since 2000. Yet after 14 years it is not officially done , making it the equivalent of Chinese Democracy for Guns N' Roses. In Larry Wall's words : "We're not trying to make Perl a better language than C++, or Python, or Java, or JavaScript. We're trying to make Perl a better language than Perl. That's all." Perl may be on the same self-inflicted path to perfection as Axl Rose, underestimating not others but itself. "All" might still be too much.

Absent a game-changing Perl release (which could still be "too little, too late"), people who learn to program in Python have no need to switch if Python can fulfill their needs, even if it is widely regarded as second or third best in some areas. The fact that you have to import a library, or put up with some extra syntax, is significantly easier than the transactional cost of learning a new language and switching to it. So over time, Python's audience stays young through its gateway strategy that van Rossum himself pioneered, Computer Programming for Everybody . (This effort has been a complete success. For example, at MIT Python replaced Scheme as the first language of instruction for all incoming freshmen in the mid-2000s.)

Python Plows Forward

Python continues to gain footholds one by one in areas of interest, such as visualization (where Python still lags behind other languages' graphics, like Matlab, Mathematica, or the recent d3.js ), website creation (the Django framework is now a mainstream choice), scientific computing (including NumPy/SciPy), parallel programming (mpi4py with CUDA), machine learning, and natural language processing (scikit-learn and NLTK), and the list continues.

While none of these efforts are centrally coordinated by van Rossum himself, a continually expanding user base, and reaching CS students before other languages do (even Java or C), increase the odds that practitioners in other disciplines will collaborate to build Python libraries for themselves, in the same open source spirit that made Perl a success in the 1990s.

As for me? I'm open to returning to Perl if it can offer me a significantly different experience from Python (but "being frustrating" doesn't count!). Perhaps Perl 6 will be that release. However, in the interim, I have heeded the advice of many others with a similar dilemma on the web. I'll just wait and C .

[Sep 03, 2017] When To Choose Python Over Perl (And Vice Versa)
When to use perl:

When to use python:

Edit after 5 years: Istvan Albert (University Park, USA) wrote:

I think very few people are completely agnostic about programming languages, especially when it comes to languages with very similar strengths and weaknesses, like perl/python/ruby; therefore there is no general reason for using one language vs the other.

It is more common to find someone equally proficient in C and perl, than say equally proficient in perl and python. My guess would be that complementary languages require complementary skill sets that occupy different parts of the brain, whereas similar concepts will clash more easily.

Michael Dondrup (Bergen, Norway) wrote:

You are totally correct with your initial assumption. This question is similar to choosing between Spanish and English, which language to choose? Well if you go to Spain,...

All (programming) languages are equal, in the sense that you can solve the same class of problems with them. Once you know one language, you can easily learn all imperative languages. Use the language that you already master or that suits your style. Both (Perl & Python) are interpreted languages and have their merits. Both have extensive Bio-libraries, and both have large archives of contributed packages.

An important criterion to decide is the availability of rich, stable, and well maintained libraries. Choose the language that provides the library you need. For example, if you want to program web-services (using SOAP not web-sites), you better use Java or maybe C#.

Conclusion: it does no harm to learn new languages. And no flame wars please.


[Sep 03, 2017] Which is better, Perl or Python Which one is more robuts How do they compare with each other - Quora
Dan Lenski ("I do all my best thinking in Python, and I've written a lot of it"), updated Jun 1 2015, wrote: Though I may get flamed for it, I will put it even more bluntly than others have: Python is better than Perl . Python's syntax is cleaner, its object-oriented type system is more modern and consistent, and its libraries are more consistent. ( EDIT: As Christian Walde points out in the comments, my criticism of Perl OOP is out-of-date with respect to the current de facto standard of Moo/se. I do believe that Perl's utility is still encumbered by historical baggage in this area and others.)

I have used both languages extensively for both professional work and personal projects (Perl mainly in 1999-2007, Python mainly since), in domains ranging from number crunching (both PDL and NumPy are excellent) to web-based programming (mainly with Embperl and Flask ) to good ol' munging text files and database CRUD.

Both Python and Perl have large user communities including many programmers who are far more skilled and experienced than I could ever hope to be. One of the best things about Python is that the community generally espouses this aspect of "The Zen of Python":

Python's philosophy rejects the Perl " there is more than one way to do it " approach to language design in favor of "there should be one -- and preferably only one -- obvious way to do it".

... while this principle might seem stultifying or constraining at first, in practice it means that most good Python programmers think about the principle of least surprise and make it easier for others to read and interface with their code.

In part as a consequence of this discipline, and definitely because of Python's strong typing , and arguably because of its "cleaner" syntax, Python code is considerably easier to read than Perl code. One of the main events that motivated me to switch was the experience of writing code to automate lab equipment in grad school. I realized I couldn't read Perl code I'd written several months prior, despite the fact that I consider myself a careful and consistent programmer when it comes to coding style ( Dunning–Kruger effect , perhaps? :-P).

One of the only things I still use Perl for occasionally is writing quick-and-dirty code to manipulate text strings with regular expressions; Sai Janani Ganesan pointed out Perl's value for this as well . Perl has a lot of nice syntactic sugar and command line options for munging text quickly*, and in fact there's one regex idiom I used a lot in Perl and for which I've never found a totally satisfying substitute in Python

* For example, the one-liner perl -i.bak -pe 's/foo/bar/g' *.txt will go through a bunch of text files and replace foo with bar everywhere, while making backup files with the .bak extension.
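For comparison, a rough Python counterpart to that in-place edit can be sketched with the standard fileinput module (my own sketch, not from the answer above, and noticeably wordier than the one-liner):

```python
import fileinput
import re
import sys

def replace_in_files(paths, pattern, repl):
    """Rewrite each file in place, keeping a .bak backup, replacing
    every regex match on every line -- like perl -i.bak -pe 's///g'."""
    with fileinput.input(files=paths, inplace=True, backup=".bak") as f:
        for line in f:
            # With inplace=True, stdout is redirected into the current file.
            sys.stdout.write(re.sub(pattern, repl, line))
```

Calling `replace_in_files(["notes.txt"], "foo", "bar")` then behaves like the Perl one-liner for that file.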

[Sep 03, 2017] Perl-Python-Ruby Comparison
What is the Josephus problem? To quote from Concepts, Techniques, and Models of Computer Programming (a daunting title if ever there was one):

Flavius Josephus was a Roman historian of Jewish origin. During the Jewish-Roman wars of the first century AD, he was in a cave with fellow soldiers, 40 men in all, surrounded by enemy Roman troops. They decided to commit suicide by standing in a ring and counting off each third man. Each man so designated was to commit suicide... Josephus, not wanting to die, managed to place himself in the position of the last survivor.

In the general version of the problem, there are n soldiers numbered from 1 to n and each k -th soldier will be eliminated. The count starts from the first soldier. What is the number of the last survivor?

I decided to model this situation using objects in three different scripting languages, Perl, Ruby, and Python. The solution in each of the languages is similar. A Person class is defined, which knows whether it is alive or dead, who the next person in the circle is, and what position number it is in. There are methods to pass along a kill signal, and to create a chain of people. Either of these could have been implemented using iteration, but I wanted to give recursion a whirl, since it's tougher on the languages. Here are my results.
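As a sketch of that design (my own minimal Python version with hypothetical names; the article's actual Perl, Ruby, and Python solutions are not reproduced here), a Person passes a kill signal around the ring and the recursion ends when one survivor remains:

```python
class Person:
    """One soldier in the ring: knows his position, whether he is
    alive, and who stands next to him."""
    def __init__(self, position):
        self.position = position
        self.alive = True
        self.next = None

    def pass_signal(self, count, k, survivors):
        """Pass the counting signal around the ring; whoever receives
        it at count == k dies. Recursion stops at one survivor."""
        if not self.alive:
            # Dead men are skipped without advancing the count.
            return self.next.pass_signal(count, k, survivors)
        if survivors == 1:
            return self.position
        if count == k:
            self.alive = False
            return self.next.pass_signal(1, k, survivors - 1)
        return self.next.pass_signal(count + 1, k, survivors)

def josephus(n, k):
    """1-based position of the last survivor among n people,
    eliminating every k-th, counting from person 1."""
    people = [Person(i) for i in range(1, n + 1)]
    for a, b in zip(people, people[1:] + people[:1]):
        a.next = b  # close the ring
    return people[0].pass_signal(1, k, n)
```

Under this counting convention, `josephus(40, 3)` reports position 28 as the survivor's spot for the cave scenario above.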

[Jul 10, 2017] Crowdsourcing, Open Data and Precarious Labour by Allana Mayer Model View Culture

Notable quotes:
"... Photo CC-BY Mace Ojala. ..."
"... Photo CC-BY Samantha Marx. ..."
Crowdsourcing, Open Data and Precarious Labour
Crowdsourcing and microtransactions are two halves of the same coin: they both mark new stages in the continuing devaluation of labour.
by Allana Mayer on February 24th, 2016

The cultural heritage industries (libraries, archives, museums, and galleries, often collectively called GLAMs) like to consider themselves the tech industry's little siblings. We're working to develop things like Linked Open Data, a decentralized network of collaboratively-improved descriptive metadata; we're building our own open-source tech to make our catalogues and collections more useful; we're pushing scholarly publishing out from behind paywalls and into open-access platforms; we're driving innovations in accessible tech.

We're only different in a few ways. One, we're a distinctly feminized set of professions , which comes with a large set of internally- and externally-imposed assumptions. Two, we rely very heavily on volunteer labour, and not just in the internship-and-exposure vein : often retirees and non-primary wage-earners are the people we "couldn't do without." Three, the underlying narrative of a "helping" profession, essentially a social service, can push us to ignore the first two distinctions, while driving ourselves to perform more and expect less.

I suppose the major way we're different is that tech doesn't acknowledge us, treat us with respect, build things for us, or partner with us, unless they need a philanthropic opportunity. Although, when some ingenue autodidact bootstraps himself up to a billion-dollar IPO, there's a good chance he's been educating himself using our free resources. Regardless, I imagine a few of the issues true in GLAMs are also true in tech culture, especially in regards to labour and how it's compensated.


Notecards in a filing drawer: old-fashioned means of recording metadata.

Photo CC-BY Mace Ojala.

Here's an example. One of the latest trends is crowdsourcing: admitting we don't have all the answers, and letting users suggest some metadata for our records. (Not to be confused with crowdfunding.) The biggest example of this is Flickr Commons: the Library of Congress partnered with Yahoo! to publish thousands of images that had somehow ended up in the LOC's collection without identifying information. Flickr users were invited to tag pictures with their own keywords or suggest descriptions using comments.

Many orphaned works (content whose copyright status is unclear) found their way conclusively out into the public domain (or back into copyright) this way. Other popular crowdsourcing models include gamification , transcription of handwritten documents (which can't be done with Optical Character Recognition), or proofreading OCR output on digitized texts. The most-discussed side benefits of such projects include the PR campaign that raises general awareness about the organization, and a "lifting of the curtain" on our descriptive mechanisms.

The problem with crowdsourcing is that it's been conclusively proven not to function in the way we imagine it does: a handful of users end up contributing massive amounts of labour, while the majority of those signed up might do a few tasks and then disappear. Seven users in the "Transcribe Bentham" project contributed to 70% of the manuscripts completed; 10 "power-taggers" did the lion's share of the Flickr Commons' image-identification work. The function of the distributed digital model of volunteerism is that those users won't be compensated, even though many came to regard their accomplishments as full-time jobs .

It's not what you're thinking: many of these contributors already had full-time jobs , likely ones that allowed them time to mess around on the Internet during working hours. Many were subject-matter experts, such as the vintage-machinery hobbyist who created entire datasets of machine-specific terminology in the form of image tags. (By the way, we have a cute name for this: "folksonomy," a user-built taxonomy. Nothing like reducing unpaid labour to a deeply colonial ascription of communalism.) In this way, we don't have precisely the free-labour-for-exposure/project-experience problem the tech industry has ; it's not our internships that are the problem. We've moved past that, treating even our volunteer labour as a series of microtransactions. Nobody's getting even the dubious benefit of job-shadowing, first-hand looks at business practices, or networking. We've completely obfuscated our own means of production. People who submit metadata or transcriptions don't even have a means of seeing how the institution reviews and ingests their work, and often, to see how their work ultimately benefits the public.

All this really says to me is: we could've hired subject experts to consult, and given them a living wage to do so, instead of building platforms to dehumanize labour. It also means our systems rely on privilege , and will undoubtedly contain and promote content with a privileged bias, as Wikipedia does. (And hey, even Wikipedia contributions can sometimes result in paid Wikipedian-in-Residence jobs.)

For example, the Library of Congress's classification and subject headings have long collected books about the genocide of First Nations peoples during the colonization of North America under terms such as "first contact," "discovery and exploration," "race relations," and "government relations." No "subjugation," "cultural genocide," "extermination," "abuse," or even "racism" in sight. Also, the term "homosexuality" redirected people to "sexual perversion" up until the 1970s. Our patrons are disrespected and marginalized in the very organization of our knowledge.

If libraries continue on with their veneer of passive and objective authorities that offer free access to all knowledge, this underlying bias will continue to propagate subconsciously. As in Mechanical Turk , being "slightly more diverse than we used to be" doesn't get us any points, nor does it assure anyone that our labour isn't coming from countries with long-exploited workers.

Labor and Compensation

Rows and rows of books in a library, on vast curving shelves.

Photo CC-BY Samantha Marx.

I also want to draw parallels between the free labour of crowdsourcing and the free labour offered in civic hackathons or open-data contests. Specifically, I'd argue that open-data projects are less ( but still definitely ) abusive to their volunteers, because at least those volunteers have a portfolio object or other deliverable to show for their work. They often work in groups and get to network, whereas heritage crowdsourcers work in isolation.

There's also the potential for converting open-data projects to something monetizable: for example, a Toronto-specific bike-route app can easily be reconfigured for other cities and sold; while the Toronto version stays free under the terms of the civic initiative, freemium options can be added. The volunteers who supply thousands of transcriptions or tags can't usually download their own datasets and convert them into something portfolio-worthy, let alone sellable. Those data are useless without their digital objects, and those digital objects still belong to the museum or library.

Crowdsourcing and microtransactions are two halves of the same coin: they both mark new stages in the continuing devaluation of labour, and they both enable misuse and abuse of people who increasingly find themselves with few alternatives. If we're not offering these people jobs, reference letters, training, performance reviews, a "foot in the door" (cronyist as that is), or even acknowledgement by name, what impetus do they have to contribute? As with Wikipedia, I think the intrinsic motivation for many people to supply us with free labour is one of two things: either they love being right, or they've been convinced by the feel-good rhetoric that they're adding to the net good of the world. Of course, trained librarians, archivists, and museum workers have fallen sway to the conflation of labour and identity , too, but we expect to be paid for it.

As in tech, stereotypes and PR obfuscate labour in cultural heritage. For tech, an entrepreneurial spirit and a tendency to buck traditional thinking; for GLAMs, a passion for public service and opening up access to treasures ancient and modern. Of course, tech celebrates the autodidactic dropout; in GLAMs, you need a master's. Period. Maybe two. And entry-level jobs in GLAMs require one or more years of experience, across the board.

When library and archives students go into massive student debt, they're rarely apprised of the constant shortfall of funding for government-agency positions, nor do they get told how much work is done by volunteers (and, consequently, how much of the job is monitoring and babysitting said volunteers). And they're not trained with enough technological competency to sysadmin anything , let alone build a platform that pulls crowdsourced data into an authoritative record. The costs of commissioning these platforms aren't yet being made public, but I bet paying subject experts for their hourly labour would be cheaper.


I've tried my hand at many of the crowdsourcing and gamifying interfaces I'm here to critique. I've never been caught up in the "passion" ascribed to those super-volunteers who deliver huge amounts of work. But I can tally up other ways I contribute to this problem: I volunteer for scholarly tasks such as peer-reviewing, committee work, and travelling on my own dime to present. I did an unpaid internship without receiving class credit. I've put my research behind a paywall. I'm complicit in the established practices of the industry, which sits uneasily between academic and social work: neither of those spheres have ever been profit-generators, and have always used their codified altruism as ways to finagle more labour for less money.

It's easy to suggest that we outlaw crowdsourced volunteer work, and outlaw microtransactions on Fiverr and MTurk, just as the easy answer would be to outlaw Uber and Lyft for divorcing administration from labour standards. Ideally, we'd make it illegal for technology to wade between workers and fair compensation.

But that's not going to happen, so we need alternatives. Just as unpaid internships are being eliminated ad-hoc through corporate pledges, rather than being prohibited region-by-region, we need pledges from cultural-heritage institutions that they will pay for labour where possible, and offer concrete incentives to volunteer or intern otherwise. Budgets may be shrinking, but that's no reason not to compensate people at least through resume and portfolio entries. The best template we've got so far is the Society of American Archivists' volunteer best practices , which includes "adequate training and supervision" provisions, which I interpret to mean outlawing microtransactions entirely. The Citizen Science Alliance , similarly, insists on "concrete outcomes" for its crowdsourcing projects, to " never waste the time of volunteers ." It's vague, but it's something.

We can boycott and publicly shame those organizations that promote these projects as fun ways to volunteer, and lobby them to instead seek out subject experts for more significant collaboration. We've seen a few efforts to shame job-posters for unicorn requirements and pathetic salaries, but they've flagged without productive alternatives to blind rage.

There are plenty more band-aid solutions. Groups like Shatter The Ceiling offer cash to women of colour who take unpaid internships. GLAM-specific internship awards are relatively common , but could: be bigger, focus on diverse applicants who need extra support, and have eligibility requirements that don't exclude people who most need them (such as part-time students, who are often working full-time to put themselves through school). Better yet, we can build a tech platform that enables paid work, or at least meaningful volunteer projects. We need nationalized or non-profit recruiting systems (a digital "volunteer bureau") that matches subject experts with the institutions that need their help. One that doesn't take a cut from every transaction, or reinforce power imbalances, the way Uber does. GLAMs might even find ways to combine projects, so that one person's work can benefit multiple institutions.

GLAMs could use plenty of other help, too: feedback from UX designers on our catalogue interfaces, helpful tools , customization of our vendor platforms, even turning libraries into Tor relays or exits . The open-source community seems to be looking for ways to contribute meaningful volunteer labour to grateful non-profits; this would be a good start.

What's most important is that cultural heritage preserves the ostensible benefits of crowdsourcing – opening our collections and processes up for scrutiny, and admitting the limits of our knowledge – without the exploitative labour practices. Just like in tech, a few more glimpses behind the curtain wouldn't go astray. But it would require deeper cultural shifts, not least in the self-perceptions of GLAM workers: away from overprotective stewards of information, constantly threatened by dwindling budgets and unfamiliar technologies, and towards facilitators, participants in the communities whose histories we hold.


[Nov 11, 2016] pdxdoughnut

Notable quotes:
"... Show the command line for a PID, converting nulls to spaces and a newline ..."


Terminal - Commands by pdxdoughnut - 26 results

whatinstalled() { which "$@" | xargs -r readlink -f | xargs -r dpkg -S ;}

2016-11-08 20:59:25

Tags: which grep dpkg function packages realpath


command ${MYVAR:+--someoption=$MYVAR}

2015-11-04 19:47:24

Use a var with more text only if it exists. See "Parameter Expansion" in the bash manpage. They refer to this as "Use Alternate Value", but we're including the var itself in the alternate value.


[ -n "$REMOTE_USER" ] || read -p "Remote User: " -er -i "$LOGNAME" REMOTE_USER

2015-10-30 17:08:17

If (and only if) the variable is not set, prompt users and give them a default option already filled in. The read command reads input and puts it into a variable. With -i you set an initial value. In this case I used a known environment variable.


debsecan --format detail

2015-10-22 18:46:41

List known debian vulnerabilities on your system -- many of which may not yet be patched.

You can search for CVEs at or use --report to get full links. This can be added to cron, but unless you're going to do manual patches, you'd just be torturing yourself.


tr '\0' ' ' </proc/21679/cmdline ; echo

2015-09-25 22:08:31

Show the command line for a PID, converting nulls to spaces and a newline


mv -iv $FILENAME{,.$(stat -c %y $FILENAME | awk '{print $1}')}

2014-12-01 22:41:38

Rename file to same name plus datestamp of last modification.

Note that the -i will not help in a script. Proper error checking is required.


echo "I am $BASH_SUBSHELL levels nested";

2014-06-20 20:33:43


diff -qr /dirA /dirB

2014-04-01 21:42:19

Shows which files differ in two directories.


find ./ -type l -ls

2014-03-21 17:13:39

Show all symlinks



xargs will automatically determine how many args are too many and only pass a reasonable number of them at a time. In the example, 500,002 file names were split across 26 instantiations of the command "echo".

killall conky

Kill a process (e.g. conky) by its name; useful when debugging conky :)

nmap -sn

2014-01-28 23:32:18

Ping all hosts on

find . -name "pattern" -type f -printf "%s\n" | awk '{total += $1} END {print total}'

2014-01-16 01:16:18

Find files and calculate size of result in shell. Using find's internal stat to get the file size is about 50 times faster than using -exec stat.

mkdir Epub ; mv -v --target-directory=Epub $(fgrep -lr epub *)

2014-01-16 01:07:29

Move all files containing the keyword "epub" to the Epub folder.

Comments (0)

CMD=chrome ; ps h -o pmem -C $CMD | awk '{sum+=$1} END {print sum}'

Show total cumulative memory usage of a process that spawns multiple instances of itself

mussh -h 192.168.100.{1..50} -m -t 10 -c uptime

2013-11-27 18:01:12

This will run them all at the same time, with a ten-second timeout per host. mussh also prepends the IP address to each line of output, so you know which host responded with which time.

The use of the sequence expression {1..50} is not specific to mussh. A `seq ...` command substitution also works, but is less efficient.

Comments (1)
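The sequence expression is plain bash brace expansion, performed by the shell before the command runs, so any command sees the already-expanded argument list:

```shell
# bash expands the braces first; the command never sees "{1..3}".
echo 192.168.100.{1..3}
# prints: 192.168.100.1 192.168.100.2 192.168.100.3
```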

du -Sh | sort -h | tail

2013-11-27 17:50:11

Which files/dirs waste my disk space

I added -S to du so that /foo doesn't include the size of /foo/bar/baz.iso, and changed sort's -n to -h so that it sorts the human-readable sizes properly.
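GNU sort's -h understands the unit suffixes that du -h emits, which plain -n does not; a tiny sketch shows the difference:

```shell
# -h orders human-readable sizes by value; -n would sort these
# lexically by leading digits and put 3G before 512.
printf '1.5K\n2M\n512\n3G\n' | sort -h
# prints (one per line): 512  1.5K  2M  3G
```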

base64 /dev/urandom | head -c 33554432 | split -b 8192 -da 4 - dummy.

2013-11-12 17:56:23

Create a bunch of dummy text files

Avoiding a for loop brought this time down to less than 3 seconds on my old machine. And just to be clear, 33554432 = 8192 * 4096.

Comments (6)
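The arithmetic is easy to check at a smaller scale: 64 KB split into 8 KB chunks gives exactly 8 files (GNU split's -d numeric suffixes assumed; the /tmp prefix is arbitrary):

```shell
rm -f /tmp/dummy.*
# 65536 bytes split into 8192-byte pieces -> 8 files named
# dummy.0000 ... dummy.0007 (-d: numeric suffixes, -a 4: suffix width).
base64 /dev/urandom | head -c 65536 | split -b 8192 -da 4 - /tmp/dummy.
ls /tmp/dummy.* | wc -l
# prints: 8
```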

sudo lsof -iTCP:25 -sTCP:LISTEN

2013-11-12 17:32:34

Check if TCP port 25 is open

for i in {1..4096}; do base64 /dev/urandom | head -c 8192 > dummy$i.rnd ; done

2013-11-12 00:36:10

Create a bunch of dummy text files

Using the 'time' command, running this with 'tr' took 28 seconds (and change) each time, but using base64 took only 8 seconds (and change). If the file doesn't have to be viewable, pulling straight from urandom with head took only 6 seconds (and change).

Comments (2)

find -name .git -prune -o -type f -exec md5sum {} \; | sort -k2 | md5sum

2013-10-28 22:14:08

Create md5sum of a directory

Comments (6)
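The point of the sort is determinism: the same tree must produce the same hash every time, regardless of the order find happens to visit files. A quick sketch (tree contents are made up):

```shell
mkdir -p /tmp/md5demo/sub
echo hello > /tmp/md5demo/a
echo world > /tmp/md5demo/sub/b

# Hash every file, sort by path so the order is stable, hash the result.
( cd /tmp/md5demo &&
  find -name .git -prune -o -type f -exec md5sum {} \; | sort -k2 | md5sum )
```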

logger -t MyProgramName "Whatever you're logging"

2013-10-22 16:34:49

Create your own logging function for your scripts. You could also pipe output to logger.

find garbage/ -type f -delete

2013-10-21 23:26:51

Remove files whose names contain spaces

I _think_ you were trying to delete files whether or not they had spaces. This would do that. You should probably be more specific though.

Comments (3)
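A sketch of why `find ... -delete` sidesteps the word-splitting problems that plain `rm $(...)` has with spaces (the demo directory name is made up):

```shell
mkdir -p /tmp/garbagedemo
touch "/tmp/garbagedemo/name with spaces" /tmp/garbagedemo/plain

# find handles each name internally, so embedded spaces are harmless.
find /tmp/garbagedemo -type f -delete
find /tmp/garbagedemo -type f | wc -l
# prints: 0
```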

lsof -iTCP:8080 -sTCP:LISTEN

2013-10-07 18:22:32

check to see what is running on a specific port number

find . -size +100M

2013-10-07 18:16:14

Recursively search the current directory for files larger than 100MB

Comments (3)
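To try it without burning real disk space, a sparse file works, since -size tests the apparent size rather than the allocated blocks (GNU truncate assumed; the file name is made up):

```shell
# A 200 MB sparse file: apparent size 200M, near-zero blocks on disk.
truncate -s 200M /tmp/big.sparse

find /tmp -maxdepth 1 -size +100M -name 'big.sparse'
# prints: /tmp/big.sparse
```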

[Sep 18, 2016] R Weekly

Sep 18, 2016 |
September 12th, 2016 R Weekly

A new weekly publication of R resources that began on 21 May 2016 with Issue 0.

Mostly titles of posts and news articles, which is useful, but not as useful as short summaries including the author's name would be.

[Nov 08, 2015] Abstraction

Filed in Programming No Comments

A few things can confuse programming students, or people new to programming. One of these is abstraction.

Wikipedia says:

In computer science, abstraction is the process by which data and programs are defined with a representation similar to its meaning (semantics), while hiding away the implementation details. Abstraction tries to reduce and factor out details so that the programmer can focus on a few concepts at a time. A system can have several abstraction layers whereby different meanings and amounts of detail are exposed to the programmer. For example, low-level abstraction layers expose details of the hardware where the program is run, while high-level layers deal with the business logic of the program.

That might be a bit too wordy for some people, and not at all clear. Here's my analogy of abstraction.

Abstraction is like a car

A car has a few features that make it unique.

If someone can drive a manual transmission car, they can drive any manual transmission car. Automatic drivers, sadly, cannot drive a manual transmission car without "relearning" it. That aside, we'll assume that all cars are manual transmission cars, as is the case for most cars in Ireland.

Since I can drive my car, which is a Mitsubishi Pajero, that means that I can drive your car – a Honda Civic, Toyota Yaris, Volkswagen Passat.

All I need to know, in order to drive a car – any car – is how to use the brakes, accelerator, steering wheel, clutch and transmission. Since I already know this in my car, I can abstract away your car and its controls.

I do not need to know the inner workings of your car in order to drive it, just the controls. I don't need to know exactly how the brakes work in your car, only that they work. I don't need to know that your car has a turbo charger, only that when I push the accelerator, the car moves. I also don't need to know the exact revs at which I should gear up or gear down (although knowing that would be better for the engine!)

Virtually all controls are the same. Standardization means that the clutch, brake and accelerator are all in the same place, regardless of the car. This means that I do not need to relearn how a car works. To me, a car is just a car, and is interchangeable with any other car.

Abstraction means not caring

As a programmer, or someone using a third-party API (for example), abstraction means not caring how the inner workings of some function operate – the linked list data structure, the variable names inside the function, the sorting algorithm used, etc. – just that I have a standard (preferably unchanging) interface to do whatever I need to do.

Abstraction can be thought of as a black box: you provide input and you get output, without seeing what happens in between. That shouldn't always be the case, but it often is. We need abstraction so that, as programmers, we can concentrate on other aspects of the program – this is the cornerstone of large-scale, multi-developer software projects.
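In shell terms (keeping to this page's idiom), the "controls" are a function's name and arguments; the implementation behind them can be swapped without callers noticing. The `file_count` name here is made up for illustration:

```shell
# Callers depend only on the interface: file_count DIR -> a number.
file_count() { find "$1" -type f | wc -l; }    # implementation A
# file_count() { ls -A "$1" | wc -l; }         # implementation B: swap freely

rm -rf /tmp/absdemo && mkdir -p /tmp/absdemo
touch /tmp/absdemo/x /tmp/absdemo/y
file_count /tmp/absdemo
```

Either implementation satisfies the same contract, so code calling `file_count` never needs to change – that boundary is the abstraction.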

[Feb 05, 2015] JavaScript, PHP Top Most Popular Languages, With Apple's Swift Rising Fast

It's not about the language, it's about the ecosystem

Feb 04, 2015 |

Posted by samzenpus
from the king-of-the-hill dept.

Nerval's Lobster writes Developers assume that Swift, Apple's newish programming language for iOS and Mac OS X apps, will become extremely popular over the next few years. According to new data from RedMonk, a tech-industry analyst firm, Swift could reach that apex of popularity sooner rather than later. While the usual stalwarts – including JavaScript, Java, PHP, Python, C#, C++, and Ruby – top RedMonk's list of the most-used languages, Swift has, well, swiftly ascended 46 spots in the six months since the firm's last update, from 68th to 22nd. RedMonk pulls data from GitHub and Stack Overflow to create its rankings, due to those sites' respective sizes and the public nature of their data. While its top-ranked languages don't trade positions much between reports, there's a fair amount of churn at the lower end of the rankings. Among those "smaller" languages, R has enjoyed stable popularity over the past six months, Rust and Julia continue to climb, and Go has exploded upwards – although CoffeeScript, often cited as a language to watch, has seen its support crumble a bit.

Dutch Gun (899105) on Wednesday February 04, 2015 @09:45PM (#48985989)

Re:not really the whole story (Score:5, Insightful)

More critically, the question I always ask about this is: "Used for what?"

Without that context, why does popularity even matter? For example, I'm a game developer, so my programming life revolves around C++, at least for game-side or engine-level code - period. Nothing else is even on the radar when you're talking about highly-optimized, AAA games. For scripting, Lua is a popular contender. For internal tools, C# seems to be quite popular. I've also seen Python used for tool extensions, or for smaller tools in their own right. Javascript is generally only used for web-based games, or by the web development teams for peripheral stuff.

I'll bet everyone in their own particular industry has their own languages which are dominant. For instance, if you're working on the Linux kernel, you're obviously working in C. It doesn't matter what the hell everyone else does. If you're working in scientific computing, are you really looking seriously at Swift? Of course not. Fortran, F#, or C++ are probably more appropriate, or perhaps others I'm not aware of. A new lightweight iOS app? Swift it is!

Languages are not all equal. The popularity of Javascript is not the measure of merit of that particular language. It's a measure of how popular web-based development is (mostly). C/C++ is largely a measure of how many native, high-performance-required applications there are (games, OS development, large native applications). Etc, etc.

Raw popularity numbers probably only have one practical use, and that's finding a programming job without concern for the particular industry. Or I suppose if you're so emotionally invested in a particular language, it's nice to know where it stands among them all.

unrtst (777550) on Wednesday February 04, 2015 @10:34PM (#48986283)

... And not sure public github or stack overflow are really as representative as they want to believe

Yeah.. why is this any better than:

TIOBE index: []

This story about python surpassing java as top learning language: []

Or this about 5 languages you'll need to learn for the next year and on: []

... those are all from the past year on slashdot, and there's loads more.

Next "top languages" post I see, I hope it just combines all the other existing stats to provide a weightable index (allow you to tweak what's most important). Maybe BH can address that :-)

gavron (1300111) on Wednesday February 04, 2015 @08:21PM (#48985495)

68th to 22nd and there are many to go (Score:5, Insightful)

All new languages start out at the bottom, as Swift did. In time, the ones that don't get used fall down.

Swift has gotten up to 22nd, but the rest of the climb past the stragglers won't ever happen.

However, being "the most popular language" is clearly no contest worth winning. Paris Hilton and Kim Kardashian are more popular than Stephen Hawking and Isaac Asimov.

Being popular doesn't mean better, useful, or even of any value whatsoever. It just means someone has a better marketing-of-crap department.

There's a time to have popularity contests. It's called high school.


coop247 (974899) on Wednesday February 04, 2015 @08:54PM (#48985695)

Being popular doesn't mean better, useful, or even of any value whatsoever

PHP runs facebook, yahoo, wordpress, and wikipedia. Javascript runs everything on the internet. Yup, no value there.

UnknownSoldier (67820) on Wednesday February 04, 2015 @08:39PM (#48985617)

Popularity != Quality (Score:5, Insightful)

McDonalds may serve billions, but no one is trying to pass it off as gourmet food.

Kind of like PHP and Javascript. The most fucked up languages are the most popular ... Go figure.

* []

Xest (935314) on Thursday February 05, 2015 @04:47AM (#48987365)

Re:Popularity != Quality (Score:2)

I think this is the SO distortion effect.

Effectively the more warts a language has and/or the more poorly documented it is, the more questions that are bound to be asked about it, hence the more apparent popularity if you use SO as a metric.

So if companies like Microsoft and Oracle produce masses of great documentation for their respective technologies and provide entire sites of resources for them (such as or the MSDN developer forums) then they'll inherently see reduced "popularity" on SO.

Similarly, some languages have a higher bar to entry; PHP and Javascript, by contrast, are both repeatedly sold as languages that beginners can start with. It should be unsurprising, therefore, that more questions are asked about them than by people who have moved up the chain to enterprise languages like C#, Java, and C++.

But I shouldn't complain too much, SO popularity whilst still blatantly flawed is still a far better metric than TIOBE whose methodology is just outright broken (they explain their methodology on their site, and even without high school statistics knowledge it shouldn't take more than 5 seconds to spot gaping holes in their methodology).

I'm still amazed no one's done an actual useful study on popularity and simply scraped data from job sites each month. It'd be nice to know what companies are actually asking for, and what they're paying. That is after all the only thing anyone really wants to know when they talk about popularity - how likely is it to get me a job, and how well is it likely to pay? Popularity doesn't matter beyond that as you just choose the best tool for the job regardless of how popular it is.

Shados (741919) on Wednesday February 04, 2015 @11:08PM (#48986457)

Re:Just learn C and Scala (Score:2)

It's not about the language, it's about the ecosystem. I.e., .NET may be somewhere in between Java and Scala, and the basics of the framework are the same, but if you do high-end stuff, JVM languages and CLR languages are totally different. Different in how you debug in production, different in what the standards are, different in what patterns people expect you to use when they build a library, different gotchas. And while you can pick up the basics in an afternoon, it can take years to really push it.

Doesn't fucking matter if you're doing yet-another-e-commerce-site (and if you are, why?). Really fucking big deal if you do something that's never been done before with a ridiculous amount of users.

[Oct 18, 2013] Tom Clancy, Best-Selling Master of Military Thrillers, Dies at 66

Fully applicable to programming...

"I tell them you learn to write the same way you learn to play golf," he once said. "You do it, and keep doing it until you get it right. A lot of people think something mystical happens to you, that maybe the muse kisses you on the ear. But writing isn't divinely inspired - it's hard work."

[Oct 12, 2013] YaST Developers Explain Move to Ruby by Susan Linton

Oct. 10, 2013 | OStatic

Last summer Lukas Ocilka mentioned the completion of the basic conversion of YaST from YCP to Ruby. At the time it was said the change was needed to encourage contributions from a wider set of developers, and Ruby is said to be simpler and more flexible. Well, today Jos Poortvliet posted an interview with two YaST developers explaining the move in more detail.

In a discussion with Josef Reidinger and David Majda, Poortvliet discovered the reason for the move was because all the original YCP developers had moved on to other things and everyone else felt YCP slowed them down. "It didn't support many useful concepts like OOP or exception handling, code written in it was hard to test, there were some annoying features (like a tendency to be "robust", which really means hiding errors)."

Ruby was chosen because it is a well-known language over at the openSUSE camp and was already being used on other SUSE projects (such as WebYaST). "The internal knowledge and standardization was the decisive factor." The translation went smoothly according to developers because they "automated the whole process and did testing builds months in advance. We even did our custom builds of openSUSE 13.1 Milestones 2 and 3 with pre-release versions of YaST in Ruby."

For now, performance of the Ruby code is comparable to the YCP version, because developers concentrated on getting it working well during these first few phases, and users will notice few if any visual changes to the YaST interface. No more major changes are planned for this development cycle, but the new YaST will be used in 13.1, due out November 19.

See the full interview for lots more detail.

[Oct 19, 2012] Google's Engineers Are Well Paid, Not Just Well Fed

October 18, 2012 | Slashdot

Anonymous Coward writes: on Thursday @12:38PM (#41694241)

I make more than $40k as a software developer, but it wasn't too long ago that I was making right around that amount.

I have an AAS (not a fancy degree, if you didn't already know), my GPA was 2.8, and I assure you that neither of those things has EVER come up in a job interview. I'm also old enough that my transcripts are gone. (Schools only keep them for about 10 years. After that, nobody's looking anyway.)

The factors that kept me from making more are:

So when I did finally land a programming job, it was as a code monkey in a PHP sweatshop. The headhunter wanted a decent payout, so I started at $40k. No raises. Got laid off after a year and a half due to it being a sweatshop and I had outstayed my welcome. (Basically, I wanted more money and they didn't want to give me any more money.)

Next job was a startup. Still $40k. Over 2.5 years, I got a couple of small raises. I topped out at $45k-ish before I got laid off during the early days of the recession.

Next job was through a headhunter again. I asked for $50k, but the employer could only go $40k. After 3 years and a few raises, I'm finally at $50k.

I could probably go to the larger employers in this city and make $70k, but that's really the limit in this area. Nobody in this line of work makes more than about $80k here.


Not accurate, smaller companies pay more

This survey must only be talking about companies above a certain size. Our Silicon Valley startup has about 50 employees and the average engineering salaries are north of $150,000. Large companies like Google actually don't have to pay that much, because the hours are more reasonable. I know there are other companies in the area, too, that pay more than Google.


Re:Not accurate, smaller companies pay more (Score:4, Interesting)
by MisterSquid (231834) writes: on Thursday October 18, @11:16AM (#41693121)
Our Silicon Valley startup has about 50 employees and the average engineering salaries are north of $150,000.

I suppose there are some start-ups that do pay developers the value of the labor, but my own experience is a bit different in that it was more stereotypical of Silicon-Valley startup compensation packages. That is, my salary was shamefully low (I was new to the profession), just about unlivable for the Bay Area, and was offset with a very accelerated stock options plan.

[Mar 14, 2012] Perl vs Python Why the debate is meaningless " The ByteBaker

Arvind Padmanabhan:

I don't know Python, but I can comment on Perl. I have written many elegant scripts for complex problems and I still love it. I often come across comments about how a programmer went back to his program six months later and had difficulty understanding it. For my part, I haven't had this problem, primarily because I consistently use a single syntax. If Perl provides more than one way to do things, I choose and use only one. Secondly, I do agree that objects in Perl are clunky and make for difficult writing/reading. I have never used them. This makes it difficult for me to write Perl scripts for large projects. Perhaps this is where Python succeeds.

shane o mac

I was forced to learn Python in order to write scripts within Blender (Open Source 3D Modeler).

White Hat:
1. dir( object )
This is nice as it shows functions and constants.

Black Hat:
1. Indentation to denote code blocks. (!caca?)

PERL was more of an experiment rather than a necessity. Much of what I know about regular expressions probably came from reading about PERL. I never even wrote much code in PERL. You see, CPAN (the repository for modules) alone makes up for all the drawbacks I can't think of at the moment.

White Hat:
FreeBSD used to extensively use PERL for installation routines (4.7; I keep a copy of it in my music case, although I don't know why, as I feel it's a good luck charm of sorts). Then I read that in 5.0 they started removing it in favor of shell script (Bash). Why?

Black Hat:
I'm drawing a blank here.

With freedom there are costs: you are allowed to do as you please, place variables as you must, and toss code amuck.

You can discipline yourself to write great code in any language. The VBScript I write has the appearance of a standard-practices C application.

I think it's a waste of time to sit and debate over which language is suited for which project. Pick one and master it.

So you can't read PERL? Break the modules into several files. It's more about information management than artisan ability. Divide and Conquer.


I disagree with mastering only one language. Often there will be trade-offs in a program that match very nicely with a particular language.

For example, you could code everything in C++\C\assembler, but that only makes sense when you really need speed or memory compactness. After all, I find it difficult to write basic file processing applications in C in under 10 minutes.

Perl examples use a lot of default variables and various ways to approach problems, but this is really a nightmare when you have to maintain someone else's code, especially if you don't know Perl. I think it's hard to understand without a background in Perl.


I'm a perl guy, not a py programmer, so I won't detract from python [except for the braces; Guido should at least let the language compile with them].

Note: Perl, like English, is a reflective language, so I can make nouns into adjectives and use the power of reflection. For example … 'The book on my living room table' vs. [Spanish] 'The book on the table of my living room'.

And this makes sense … because Larry Wall was a linguist, and was very influenced by the fact that reflective languages can say more with less, because much is implied based on usage. These languages can also say the same thing many different ways. [Perl makes me pull my hair out. | Perl makes me pull out my hair.] And since we have chromosomes that wire us for human language … these difficulties are soon mastered even by children. But we don't have the same affinity for programming languages (well, most of us), so yes, Perl can be a struggle in the beginning. But once you achieve a strong familiarity and stop trying to turn Perl into C or Python and allow Perl just to be Perl, you really, really start to enjoy it for those reasons you didn't like it before.

The biggest failure of Perl has been its users enjoying the higher-end values of the language while failing to publish and document simple examples to help non-monks get there. You shouldn't have to be a monk to seek wisdom at the monastic gates.

Example … Perl classes. Obtuse and hard to understand, you say? It doesn't have to be … I think that most programmers will understand and be able to write their own after just looking at this simple example. Keep in mind, we just use 'package' instead of 'class'. bless tells the interpreter your intentions and is explicitly used because you can bless all kinds of things … including a class (package).

my $calc = new Calc; # or Calc->new;
print $calc->Add(1);
print $calc->Add(9);
print $calc->Pie(67);

package Calc;

sub new {
    my $class = shift;          # the package name, 'Calc'
    my $self = {
        _undue      => undef,
        _currentVal => 0,
        _pie        => 3.14,
    };
    bless $self, $class;        # now we have an object of class 'Calc'
    return $self;
}

sub Add {
    my ($self, $val) = @_;

    $self->{_undue} = $self->{_currentVal};              # save off the last value

    $self->{_currentVal} = $self->{_currentVal} + $val;  # add the scalars
    return $self->{_currentVal};                         # return the new value
}

sub Pie {
    my ($self, $val) = @_;

    $self->{_undue} = $self->{_currentVal};              # save off the last value

    $self->{_currentVal} = $self->{_pie} * $val;         # multiply by pi
    return $self->{_currentVal};                         # return the new value
}

[Mar 14, 2012] How To Contribute To Open Source Without Being a Programming Rock Star

Esther Schindler writes "Plenty of people want to get involved in open source, but don't know where to start. In this article, Andy Lester lists several ways to help out even if you lack confidence in your technical chops. Here are a couple of his suggestions: 'Maintenance of code and the systems surrounding the code often are neglected in the rush to create new features and to fix bugs. Look to these areas as an easy way to get your foot into a project. Most projects have a publicly visible trouble ticket system, linked from the front page of the project's website and included in the documentation. It's the primary conduit of communication between the users and the developers. Keeping it current is a great way to help the project. You may need to get special permissions in the ticketing system, which most project leaders will be glad to give you when you say you want to help clean up the tickets.'" What's your favorite low-profile way to contribute?

[Dec 25, 2010] [PDF] Brian Kernighan - Random thoughts on scripting languages

I would expect better, higher-level thinking from Brian...

Other scripting languages

[Dec 20, 2010] What hurts you the most in Perl


Steve Carrobis

Perl is a far better applications-type language than JAVA/C/C#. Each has its niche. Threads were always an issue in Perl, and, like OO, if you don't need it or don't know it, don't use it.

My issue with Perl is when people get overly obfuscated with their code because they think that fewer characters and a few pointers make the code faster.

Unless you do some really smart OO-esque building, all you are doing is making it harder to figure out what you were thinking about. And please, Perl programmers, don't buy into "self-documenting code". I am an old mainframer, and self-documenting code meant that as you wrote, you added comments to the core parts of the code ... I can call my subroutine "apple" to describe it, but is it really an apple? Or is it a tomato or a pomegranate? If written properly, Perl is very efficient code, and like all the other languages, if written incorrectly it's HORRIBLE. I have been writing perl since almost before 3.0 ;-)

That's my 3 cents. Have a HAPPY and a MERRY!

Nikolai Bezroukov

@steve Thanks for a valuable comment about the threat of overcomplexity junkies in Perl. That's a very important threat that can undermine the language's future.

@Gabor: A well-known fact is that PHP, which is a horrible language both in its general design and in the implementation of most features you mentioned, is very successful and is widely used for large Web applications with database backends (Mediawiki is one example). Also, if we think about all the dull, stupid and unreliable Java coding of large business applications that we see in the marketplace, the question arises whether we want this type of success ;-)

@Douglas: Mastering Perl requires a slightly higher level of qualification from developers than "Basic-style" development in PHP or commercial Java development (where Java typically plays the role of Cobol), which is mainstream these days. Also, many important factors are outside the technical domain: the ecosystem for Java is tremendous and is supported by players with deep pockets. The same is true for Python. Still, Perl has unique advantages, is universally deployed on Unix, and as such is and always will be attractive for thinking developers.

I think that for many large business applications, which these days often means a Web application with a database backend, one can use the virtual appliance model and rely on OS facilities for multitasking. Nothing is wrong with this approach on modern hardware. Here Perl provides important advantages due to its good integration with Unix.

Also, structuring a large application into modules using pipes and sockets as the communication mechanism often provides very good maintainability. Pointers are also very helpful and unique to Perl: typically scripting languages do not provide pointers; Perl does, and as such gives the developer unique power and flexibility (with additional risks as an inevitable side effect).

Another important advantage of Perl is that it is a higher-level language than Python (to say nothing of Java) and stimulates the use of prototyping, which is tremendously important for large projects, as the initial specification is usually incomplete and incorrect. Also, despite the proliferation of overcomplexity junkies in the Perl community, some aspects of Perl prevent an excessive number of layers/classes, a common trap that undermines large projects in Java. Look at IBM's fiasco with Lotus Notes 8.5.

I think that Perl is great in a way it integrates with Unix and promote thinking of complex applications as virtual appliances. BTW this approach also permits usage of a second language for those parts of the system for which Perl does not present clear advantages.

Also, Perl provides an important bridge to system administrators, who often know the language and can use a subset of it productively. That makes it preferable for large systems which depend on customization, such as monitoring systems.

The absence of a bytecode compiler hurts development of commercial applications in Perl in more ways than one, but that's just a question of money. I wonder why ActiveState missed this opportunity to increase its revenue stream. I also agree that the quality of many CPAN modules could be improved, but abuse of CPAN along with fixation on OO is a typical trait of overcomplexity junkies, so this has some positive aspect too :-).

I don't think that OO is a problem for Perl, if you use it where it belongs: in GUI interfaces. In many cases OO is used when hierarchical namespaces are sufficient. Perl provides a clean implementation of the concept of namespaces. The problem is that many people are trained in the Java/C++ style of OO, and as we know, to a hammer everything looks like a nail. ;-)

Allan Bowhill:

I think the original question Gabor posed implies there is a problem 'selling' Perl to companies for large projects. Maybe it's a question of narrowing its role.

It seems to me that if you want an angle to sell Perl on, it would make sense to cast it (in a marketing sense) into a narrower role that doesn't pretend to be everything to everyone. Because, despite what some hard-core Perl programmers might say, the language is somewhat dated. It hasn't really changed all that much since the 1990s.

Perl isn't particularly specialized so it has been used historically for almost every kind of application imaginable. Since it was (for a long time in the dot-com era) a mainstay of IT development (remember the 'duct tape' of the internet?) it gained high status among people who were developing new systems in short time-frames. This may in fact be one of the problems in selling it to people nowadays.

The FreeBSD OS even included Perl as part of their main (full) distribution for some time and if I remember correctly, Perl scripts were included to manage the ports/packaging system for all the 3rd party software. It was taken out of the OS shortly after the bust and committee reorganization at FreeBSD, where it was moved into third-party software. The package-management scripts were re-written in C. Other package management utilities were effectively displaced by a Ruby package.

A lot of technologies have come along since the 90s which are more appealing platforms than Perl for web development, which is mainly what it's about now.

If you are going to build modern web sites these days, you'll more than likely use some framework that utilizes object-oriented languages. I suppose the Moose augmentation of Perl would have some appeal with that, but CPAN modules and addons like Moose are not REALLY the Perl language itself. So if we are talking about selling the Perl language alone to potential adopters, you have to be honest in discussing the merits of the language itself without all the extras.

Along those lines I could see Perl having special appeal being cast in a narrower role, as a kind of advanced systems batching language - more capable and effective than say, NT scripting/batch files or UNIX shell scripts, but less suitable than object-oriented languages, which pretty much own the market for web and console utilities development now.

But there is a substantial role for high-level batching languages, particularly in systems that build data for consumption by other systems. These are traditionally implemented in the highest-level batching language possible. Such systems build things like help files, structured (non-relational) databases (often used on high-volume commercial web services), and software. Not to mention automating many systems administration tasks.

Not many of Perl's features or advantages remain unique among scripting languages, as they were in the 90s. The simplicity of built-in Perl data structures and its regular expression capabilities are reflected almost identically in Ruby, and are at least accessible in strongly-typed languages like Java and C#.
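The overlap is easy to see in practice. Here is a minimal sketch (with invented sample data, not taken from the text) of a classic Perl idiom - regex capture feeding a hash-style counter - written in Python, where it translates almost word for word:

```python
import re

# Perl-style text munging: match each line with a regex and
# count hits per host in a dict (the analogue of a Perl hash).
log_lines = [
    "alpha GET /index.html",
    "beta  GET /about.html",
    "alpha POST /login",
]

hits = {}
for line in log_lines:
    m = re.match(r"(\w+)\s+(GET|POST)\s+(\S+)", line)
    if m:
        host = m.group(1)
        hits[host] = hits.get(host, 0) + 1

print(hits)  # {'alpha': 2, 'beta': 1}
```

The capture groups and autovivified-counter pattern are exactly the idioms the comment credits to Perl; Ruby's version would look nearly identical.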

Perl's selling points are that it is easy to learn, that it stays consistent with the idea that "everything is a string", and that there is no need to formalize things into an object-oriented model. If it is cast as an advanced batching language, there are almost no other languages that could compete with it in that role.

Dean Hamstead:

@Pascal: bytecode is nasty for the poor Sysadmin/Devop who has to run your code. She/he can never fix it when bugs arise. There is no advantage to bytecode over interpreted.

Which in fact leads me to a good point.

All the 'selling points' of Java have failed to be of any real substance.

In truth, Java is popular because it is popular.

Lots of people don't like Perl because it's not popular any more, similar to how lots of people hate Macs but have no logical reason for doing so.

Douglas is almost certainly right that Python is rapidly becoming the new fad language.

I'm not sure how Perl OO is a 'hack'. When you bless a reference into a package it becomes an object... I can see that some people are confused by Perl's honesty about what an object is. Other languages attempt to hide away how they have implemented objects in their compiler - who cares? Ultimately the objects are all converted into machine code and executed.
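For readers unfamiliar with bless: it simply tags a plain reference with a package name, and method dispatch then follows that package. A rough Python analogy of the same "an object is just tagged data" honesty (the Counter class and names below are invented for illustration, not part of the comment):

```python
# A Perl object is just a data structure tagged with a package name.
# Rough analogy: start with bare data, then "bless" it into a class
# after the fact by retagging the instance, so method lookup works.
class Counter:
    def incr(self):
        self.n = getattr(self, "n", 0) + 1
        return self.n

obj = type("Anon", (), {})()   # bare instance, no behaviour yet
obj.__class__ = Counter        # the "bless": retag the instance
print(obj.incr())  # 1
print(obj.incr())  # 2
```

As in Perl, nothing is hidden: the object is plain data plus a class tag, and dispatch is just a lookup through that tag.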

In general perl objects are more object oriented than java objects. They are certainly more polymorphic.

Perl objects can fully hide their internals if that's something you want to do. It's not even hard, and you don't need to use Moose. But does it afford any real benefit? Not really.

At the end of the day, if you want good software you need to hire good programmers; it has nothing to do with the language. Even though some languages try to force the code to be neat (Python) and try to force certain behaviours (Java?), you can write complete garbage in any of them, then curse that language for allowing the author to do so.

A syntactic argument is pointless. As is something oriented around OO. What benefits perl brings to a business are...

- massive centralised website of libraries (CPAN)
- MVC's
- Other frameworks etc
- automated code review (perlcritic)
- automated code formatting and tidying (perltidy)
- document as you code (POD)
- natural test driven development (Test::More etc)
- platform independence
- perl environments on more platforms than java
- perl comes out of the box on every unix
- excellent canon of printed literature, from beginner to expert
- common language between Sysadmin/DevOps and traditional developer roles (with source code always available to *fix* the problem quickly, rather than having to set up an Ant environment and roll a new WAR file)
- rolled up perl applications (PAR files)
- Perl can use more than 3.6 GB of RAM (try that in Java)

Brian Martin:

Well said Dean.

Personally, I don't really care if a system is written in Perl or Python or some other high level language, I don't get religious about which high level language is used.

There are many [very] high level languages, any one of them is vastly more productive & consequently less buggy than developing in a low level language like C or Java. Believe me, I have written more vanilla C code in my career than Perl or Python, by a factor of thousands, yet I still prefer Python or Perl as quite simply a more succinct expression of the intended algorithm.

If anyone wants to argue the meaning of "high level", well, basically APL wins, OK? In APL, inverting a matrix is a single operator. If you've never had to implement a matrix inversion from scratch, then you've never done serious programming. Meanwhile, Python or Perl are pretty convenient.
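To make the contrast concrete, here is a sketch of what that single APL operator hides: a from-scratch Gauss-Jordan inversion in Python (a textbook method chosen for illustration; the comment itself names no algorithm):

```python
def invert(matrix):
    """Invert a square matrix via Gauss-Jordan elimination.

    What APL collapses into one operator takes real work here:
    augment with the identity, pivot, eliminate, then read the
    inverse off the right half of the augmented matrix.
    """
    n = len(matrix)
    # Build the augmented matrix [A | I].
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(matrix)]
    for col in range(n):
        # Partial pivoting: choose the row with the largest pivot.
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Normalize the pivot row, then eliminate the column elsewhere.
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

print([[round(x, 6) for x in row] for row in invert([[4.0, 7.0], [2.0, 6.0]])])
# [[0.6, -0.7], [-0.2, 0.4]]
```

In a very high level language this whole function collapses to one call (or, in APL, one character), which is exactly the commenter's point.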

What I mean by a "[very] high level language" is basically: how many pages of code does it take to play a decent game of draughts (checkers), or chess?

[Nov 18, 2010] The Law of Software Development By James Kwak

November 17, 2010 | The Baseline Scenario

I recently read a frightening 2008 post by David Pogue about the breakdown of homemade DVDs. This inspired me to back up my old DVDs of my dog to my computer (now that hard drives are so much bigger than they used to be), which led me to install HandBrake. The Handbrake web site includes this gem:

"The Law of Software Development and Envelopment at MIT:

Every program in development at MIT expands until it can read mail."
I thought of that when I heard that Facebook is launching a (beyond) email service.

(The side benefit of this project is that now I get to watch videos of my dog sleeping whenever I want to.)


Pogue should have mentioned whether he was talking about DVD-R or DVD-RW.

The rewritable variants are vastly more perishable than the write-once variants.


That law is originally due to Jamie Zawinski (a rather famous software developer known for his work on Netscape Navigator and contributions to Mozilla and XEmacs). In its original form:

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can. - Jamie Zawinski

Ted K

James, you remind me of myself when I'm drinking and thinking of past things. I mean, I'm not criticizing you, please don't take it as criticism. But watching videos of the dog that passed away… Don't torture yourself man. Wait some time and when you and the family are ready, get a new dog. Should probably be a different breed unless you're super-sold on that breed. Let the kids choose one from a set of breeds you like.

We got Pogue's book on the iPod because Apple's manual is so crappy. He is "da man".

You know if Apple gave a damn about customers they would include a charger cord with that damned thing to hook into the wall instead of making you shop for the charger cord separately, but Mr. "I'm an assh*le" Steve Jobs couldn't bother to show he's customer service inclined. He's slowly but surely going the way of Bill Gates.

Ted K

Mr. Kwak,
There is a super good story they're running on PBS NewsHour today with the former "NOW" host David Brancaccio on a show called "Fixing the Future". James you need to try to download that or catch the show. That looks really good and shows some promise for the future. Catch this link people. Nov 18 (Thursday)

David Petraitis

@Ted K
It looks like programs need to expand until they can delete uploaded Youtube videos which are seriously off topic.

As for James' original point, most applications today are Internet aware and use the Internet in their base functionality (which is what was meant by the original email capability). The next level is for them to be mobile and location aware, and it is already happening.

Bruce E. Woych

facebook launches beyond e-mail…

I was browsing through some tech books at B&N's and came across some works assessing the questionable fields of cyber space tech wars and the current trends in development. The book has no axe to grind and was written well before the facebook attempt to dominate the media with multimedia dependency. Here's what was so interesting in the text that applies:

The two greatest "vulnerabilities of the future" involve what is categorized as consolidation and convergence.

Now it first occurred to me that this is much like micro and macro economics…but then I realized that it is precisely (in the field) like too big to fail!

So are we on another monopoly trail down the primrose path of self destructive dependencies?

Isn't this just another brand media Octopus looking to knock out variations and dominate our choices with their market offerings? And is this going to set us up for I.T. crisis of authorization for the systemic network and future of "ownership" wars in essential services?


Facebook is recreating AOL. A gigantic walled garden that becomes "the internet" for most of the people with computers. Look how AOL ended up.

And Handbrake is a great little program. I've been using it to rip my DVD collection to the 2TB of network storage I now have on my home network. A very convenient way to watch movies.



[Aug 25, 2010] Sometimes the Old Ways Are Best by Brian Kernighan

IEEE Software Nov/Dec 2008, pp.18-19

As I write this column, I'm in the middle of two summer projects; with luck, they'll both be finished by the time you read it. One involves a forensic analysis of over 100,000 lines of old C and assembly code from about 1990, and I have to work on Windows XP. The other is a hack to translate code written in weird language L1 into weird language L2 with a program written in scripting language L3, where none of the L's even existed in 1990; this one uses Linux. Thus it's perhaps a bit surprising that I find myself relying on much the same toolset for these very different tasks.

... ... ...

There has surely been much progress in tools over the 25 years that IEEE Software has been around, and I wouldn't want to go back in time. But the tools I use today are mostly the same old ones - grep, diff, sort, awk, and friends.

This might well mean that I'm a dinosaur stuck in the past. On the other hand, when it comes to doing simple things quickly, I can often have the job done while experts are still waiting for their IDE to start up. Sometimes the old ways are best, and they're certainly worth knowing well.

[Dec 22, 2008] 13 reasons why Ruby, Python and the gang will push Java to die… of old age

Very questionable logic. The cost of programming and especially the cost of maintenance of an application depends on the level of the language. It is less with Ruby/Python than with Java.
Lately I seem to find everywhere lots of articles about the imminent dismissal of Java and its replacement with the scripting language of the day or sometimes with other compiled languages.

No, that is not gonna happen. Java is gonna die eventually of old age many many years from now. I will share the reasoning behind my statement. Let's first look at some metrics.

Language popularity status as of May 2008

For this I am gonna use the TIOBE index and its nice graphs. I know lots of people don't like them because their statistics are based on search engine results, but I think they are a reasonably fair indicator of popularity.

Facts from the TIOBE index:

TIOBE Index Top 20

What I find significant here is the huge share the "C like syntax" languages have.

C (15.292) + C++ (10.484) + Java (20.176) + C# (3.963) = 49.915%

This means 4 languages get half of all the attention on the web.

If we add PHP (10.637) here (somehow uses a similar syntax) we get 60.552%

As a result we can extract:

Reason number 1: Syntax is very important because it builds on previous knowledge. Also similar syntax means similar concepts. Programmers have to make less effort to learn the new syntax, can reuse the old concepts and thus they can concentrate on understanding the new concepts.

Let's look at a group of 10 challengers:

He forgot to add Perl

Python (4.613) + Ruby (2.851) + Lisp/Scheme (0.449) + Lua (0.393) + SmallTalk (0.138) +
Haskell (0.137) + Groovy (0.131) + Erlang (0.110) + Caml (0.090) + Scala (0.073) = 8.985%

This is less than the attention Visual Basic gets: 10.782% and leads us to…

TIOBE Index Top 21-50

Reason number 2: Too much noise is distracting. Programmers are busy and learning 10 languages to the level where they can evaluate them and make an educated decision is too much effort. The fact that most of these languages have a different syntax and introduce different (sometimes radically different) concepts doesn't help either.

Looking at the trend for the last 7 years we can see a pretty flat evolution in popularity for most of the languages. There are a few exceptions like the decline of Perl, but nothing really is earth-shattering. There are seasonal variations, but in the long term nothing seems to change.


This shows that while various languages catch the mind of the programmer for a short time, they are put back on the shelf pretty fast. This might be caused by the lack of opportunity to use them in real life projects. Most of the programmers in the world work on ongoing projects.

Reason number 3: Lack of pressure on the programmers to switch. The market is pretty stable, the existing languages work pretty well and the management doesn't push programmers to learn new languages.

Number of new projects started

Looking at another site that does language popularity analysis, we see a slightly different view, but the end result is almost the same from the point of view of challenger languages.

What I found interesting here was the analysis regarding new projects started in various languages, with Google Code among the sources of information. The results show a clear preference for C/C++/Java, with Python getting some attention.

Reason number 4: Challenger languages don't seem to catch momentum in order to create an avalanche of new projects started with them. This can be again due to the fact that they spread thin when they are evaluated. They are too many.

Other interesting charts are those about books on programming languages and about language discussion statistics. Book writers write about subjects that have a chance to sell. On the other hand, a lot of discussion about all these new languages takes place online. One thing I noticed in these discussions is the attitude the supporters of certain languages have. There is a lot of elitism and concentration on what is wrong with Java, instead of pointing to what their language of choice brings that is useful and creating good tutorials for people wanting to attempt a switch.

Reason number 5: Challenger language communities don't do a good job at attracting programmers from established languages. Telling somebody why she is wrong will most likely create a counter-reaction, not interest.

Let's look now at what is happening on the job market. I used the tools offered by a job-trends site and compared a bunch of languages to produce this graph:

Java, C, C++, C#, Python, Ruby, PHP, Scala Job Trends graph

Reason number 6: There is no great incentive to switch to one of the challenger languages since gaining this skill is not likely to translate into income in the near future.

Well, I looked at all these statistics and I extracted some reasons, but what are the qualities a language needs and what are the external conditions that will make a programming language popular?

How and when does a language become popular

So we can draw more reasons:

For one's curiosity here is a list of talked about languages with their birth date:
Ruby (mid 1990s), Python (1991), Lisp (1958), Scheme (1970s), Lua (1993), Smalltalk (1969-1980), Haskell (1990), Erlang (1987), Caml (1985), OCaml (1996), Groovy (2003), Scala (2003)

Compare this with older successful languages:
C (1972), C++ (1983), Java (1995), C# (2001), BASIC (1964), Pascal (1970), FORTRAN (1957), Ada (1983), COBOL (1959)

It is pretty obvious most of these "new" languages lost the train to success.

Why many of the new languages will never be popular

Reason number 11: "Features" that look and are dangerous for big projects. Since there are not a lot of big projects written in any of these languages it is hard to make an unbiased evaluation. But bias is in the end a real obstacle for their adoption.

Reason number 12: Unnatural concepts (for the majority of programmers) raise the entry level. Functional languages make you write code like mathematical equations. But how many people actually love math so much as to write everything in it? Object-oriented languages provide a great advantage: they let programmers think about the domain they want to model, not about the language or the machine.

Reason number 13: Lack of advanced tools for development and refactoring cripple the programmer and the development teams when faced with big amounts of lines of code.

What would a real Java challenger look like

Pick me, pick mee, pick meeee!!!

Looking at all those (smart) languages and all the heated discussions that surround them makes me think about the donkey from Shrek yelling "Pick me! Pick mee!! Pick meeee!!!". In the end only one can be the real winner even if in a limited part of the market.

The danger for Java doesn't come from outside. None of these new (actually most of them are pretty old) languages have the potential to displace Java. The danger for Java comes from inside, and it is caused by too many "features" making their way into the language and transforming it from a language that wanted to keep only the essential features of C++ into a trash box for features and concepts from all languages.

In the end I want to make it clear that I am not advocating against any of those languages. There is TAO in all of them. I actually find them interesting, cool and useful as exercise for my brain, when I have time. I recommend that every programmer look around from time to time and try to understand what is going on in the language market.

This article is part of a series of opinions and rants:

[Dec 12, 2008] The A-Z of Programming Languages Perl

What new elements does Perl 5.10.0 bring to the language? In what way is it preparing for Perl 6?

Perl 5.10.0 involves backporting some ideas from Perl 6, like switch statements and named pattern matches.

One of the most popular things is the use of "say" instead of "print".

This is an explicit design principle in Perl - easy things should be easy and hard things should be possible. It's optimised for the common case. Similar things should look similar, but different things should also look different, and how you trade those things off is an interesting design principle.

Huffman Coding is one of those principles that makes similar things look different.

In your opinion, what lasting legacy has Perl brought to computer development?

An increased awareness of the interplay between technology and culture. Ruby has borrowed a few ideas from Perl and so has PHP. I don't think PHP understands the use of signals, but all languages borrow from other languages, otherwise they risk being single-purpose languages. Competition is good.

It's interesting to see PHP follow along with the same mistakes Perl made over time and recover from them. But Perl 6 also borrows back from other languages too, like Ruby. My ego may be big, but it's not that big.

Where do you envisage Perl's future lying?

My vision of Perl's future is that I hope I don't recognize it in 20 years.

Where do you see computer programming languages heading in the future, particularly in the next 5 to 20 years?

Don't design everything you will need in the next 100 years, but design the ability to create things we will need in 20 or 100 years. The heart of the Perl 6 effort is the extensibility we have built into the parser, introducing language changes as non-destructively as possible.

Linux Today's comments

> Given the horrible mess that is Perl (and, BTW, I derive 90% of my income from programming in Perl),
Did the thought that the 'horrible mess' you produce with $language 'for an income' could be YOUR horrible mess ever cross your mind? The language itself doesn't write any code.

> You just said something against his beloved
> Perl and compounded your heinous crime by
> saying something nice about his
> narrow view you are the antithesis of all that is
> right in the world. He will respond with his many
> years of Perl == good and everything else == bad
> but just let it go...
That's a pretty pointless insult. Languages don't write code. People do. A statement like 'I think that code written in Perl looks very ugly because of the large amount of non-alphanumeric characters' would make sense. Trying to elevate entirely subjective, aesthetic preferences into 'general principles' doesn't. 'a mess' is something inherently chaotic, hence, this is not a sensible description for a regularly structured program of any kind. It is obviously possible to write (or not write) regularly structured programs in any language providing the necessary abstractions for that. This set includes Perl.
I had the displeasure of having to deal with messes created by people in both Perl and Python (and a couple of other languages) in the past. You've probably heard the saying that "real programmers can write FORTRAN in any language" already.

It is even true that the most horrible code mess I have seen so far had been written in Perl. But this just means that a fairly chaotic person happened to use this particular programming language.

[Dec 9, 2008] What Programming Language For Linux Development


by dkf (304284) on Saturday December 06, @07:08PM (#26016101)

C/C++ are the languages you'd want to go for. They can do *everything*, have great support, are fast etc.

Let's be honest here. C and C++ are very fast indeed if you use them well (very little can touch them; most other languages are actually implemented in terms of them) but they're also very easy to use really badly. They're genuine professional power tools: they'll do what you ask them to really quickly, even if that is just to spin on the spot chopping peoples' legs off. Care required!

If you use a higher-level language (I prefer Tcl, but you might prefer Python, Perl, Ruby, Lua, Rexx, awk, bash, etc. - the list is huge) then you probably won't go as fast. But unless you're very good at C/C++ you'll go acceptably fast at a much earlier calendar date. It's just easier for most people to be productive in higher-level languages. Well, unless you're doing something where you have to be incredibly close to the metal like a device driver, but even then it's best to keep the amount of low-level code small and to try to get to use high-level things as soon as you can.

One technique that is used quite a bit, especially by really experienced developers, is to split the program up into components that are then glued together. You can then write the components in a low-level language if necessary, but use the far superior gluing capabilities of a high-level language effectively. I know many people are very productive doing this.
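That glue pattern can be sketched in a few lines. In this hedged example the "component" is a stand-in child process (it could equally be a compiled C binary), and the high-level language handles all the plumbing:

```python
import subprocess
import sys

# Glue pattern: a high-level script drives a low-level "component"
# and wires its input and output together. The child here is a tiny
# Python one-liner standing in for what would normally be a C binary.
child_code = "import sys; print(sum(int(x) for x in sys.stdin.read().split()))"

result = subprocess.run(
    [sys.executable, "-c", child_code],  # launch the "component"
    input="10 20 30\n",                  # data piped to its stdin
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # 60
```

The component neither knows nor cares who calls it; swapping in a faster C implementation later changes only the command line, which is exactly why experienced developers like this split.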

[Apr 25, 2008] Interview with Donald Knuth by Donald E. Knuth & Andrew Binstock

Andrew Binstock and Donald Knuth converse on the success of open source, the problem with multicore architecture, the disappointing lack of interest in literate programming, the menace of reusable code, and that urban legend about winning a programming contest with a single compilation.

Andrew Binstock: You are one of the fathers of the open-source revolution, even if you aren't widely heralded as such. You previously have stated that you released TeX as open source because of the problem of proprietary implementations at the time, and to invite corrections to the code-both of which are key drivers for open-source projects today. Have you been surprised by the success of open source since that time?

Donald Knuth:

The success of open source code is perhaps the only thing in the computer field that hasn't surprised me during the past several decades. But it still hasn't reached its full potential; I believe that open-source programs will begin to be completely dominant as the economy moves more and more from products towards services, and as more and more volunteers arise to improve the code.

For example, open-source code can produce thousands of binaries, tuned perfectly to the configurations of individual users, whereas commercial software usually will exist in only a few versions. A generic binary executable file must include things like inefficient "sync" instructions that are totally inappropriate for many installations; such wastage goes away when the source code is highly configurable. This should be a huge win for open source.

Yet I think that a few programs, such as Adobe Photoshop, will always be superior to competitors like the Gimp-for some reason, I really don't know why! I'm quite willing to pay good money for really good software, if I believe that it has been produced by the best programmers.

Remember, though, that my opinion on economic questions is highly suspect, since I'm just an educator and scientist. I understand almost nothing about the marketplace.

Andrew: A story states that you once entered a programming contest at Stanford (I believe) and you submitted the winning entry, which worked correctly after a single compilation. Is this story true? In that vein, today's developers frequently build programs writing small code increments followed by immediate compilation and the creation and running of unit tests. What are your thoughts on this approach to software development?


The story you heard is typical of legends that are based on only a small kernel of truth. Here's what actually happened: John McCarthy decided in 1971 to have a Memorial Day Programming Race. All of the contestants except me worked at his AI Lab up in the hills above Stanford, using the WAITS time-sharing system; I was down on the main campus, where the only computer available to me was a mainframe for which I had to punch cards and submit them for processing in batch mode. I used Wirth's ALGOL W system (the predecessor of Pascal). My program didn't work the first time, but fortunately I could use Ed Satterthwaite's excellent offline debugging system for ALGOL W, so I needed only two runs. Meanwhile, the folks using WAITS couldn't get enough machine cycles because their machine was so overloaded. (I think that the second-place finisher, using that "modern" approach, came in about an hour after I had submitted the winning entry with old-fangled methods.) It wasn't a fair contest.

As to your real question, the idea of immediate compilation and "unit tests" appeals to me only rarely, when I'm feeling my way in a totally unknown environment and need feedback about what works and what doesn't. Otherwise, lots of time is wasted on activities that I simply never need to perform or even think about. Nothing needs to be "mocked up."

Andrew: One of the emerging problems for developers, especially client-side developers, is changing their thinking to write programs in terms of threads. This concern, driven by the advent of inexpensive multicore PCs, surely will require that many algorithms be recast for multithreading, or at least to be thread-safe. So far, much of the work you've published for Volume 4 of The Art of Computer Programming (TAOCP) doesn't seem to touch on this dimension. Do you expect to enter into problems of concurrency and parallel programming in upcoming work, especially since it would seem to be a natural fit with the combinatorial topics you're currently working on?


The field of combinatorial algorithms is so vast that I'll be lucky to pack its sequential aspects into three or four physical volumes, and I don't think the sequential methods are ever going to be unimportant. Conversely, the half-life of parallel techniques is very short, because hardware changes rapidly and each new machine needs a somewhat different approach. So I decided long ago to stick to what I know best. Other people understand parallel machines much better than I do; programmers should listen to them, not me, for guidance on how to deal with simultaneity.

Andrew: Vendors of multicore processors have expressed frustration at the difficulty of moving developers to this model. As a former professor, what thoughts do you have on this transition and how to make it happen? Is it a question of proper tools, such as better native support for concurrency in languages, or of execution frameworks? Or are there other solutions?


I don't want to duck your question entirely. I might as well flame a bit about my personal unhappiness with the current trend toward multicore architecture. To me, it looks more or less like the hardware designers have run out of ideas, and that they're trying to pass the blame for the future demise of Moore's Law to the software writers by giving us machines that work faster only on a few key benchmarks! I won't be surprised at all if the whole multithreading idea turns out to be a flop, worse than the "Itanium" approach that was supposed to be so terrific - until it turned out that the wished-for compilers were basically impossible to write.

Let me put it this way: During the past 50 years, I've written well over a thousand programs, many of which have substantial size. I can't think of even five of those programs that would have been enhanced noticeably by parallelism or multithreading. Surely, for example, multiple processors are no help to TeX.[1]

How many programmers do you know who are enthusiastic about these promised machines of the future? I hear almost nothing but grief from software people, although the hardware folks in our department assure me that I'm wrong.

I know that important applications for parallelism exist-rendering graphics, breaking codes, scanning images, simulating physical and biological processes, etc. But all these applications require dedicated code and special-purpose techniques, which will need to be changed substantially every few years.

Even if I knew enough about such methods to write about them in TAOCP, my time would be largely wasted, because soon there would be little reason for anybody to read those parts. (Similarly, when I prepare the third edition of Volume 3 I plan to rip out much of the material about how to sort on magnetic tapes. That stuff was once one of the hottest topics in the whole software field, but now it largely wastes paper when the book is printed.)

The machine I use today has dual processors. I get to use them both only when I'm running two independent jobs at the same time; that's nice, but it happens only a few minutes every week. If I had four processors, or eight, or more, I still wouldn't be any better off, considering the kind of work I do-even though I'm using my computer almost every day during most of the day. So why should I be so happy about the future that hardware vendors promise? They think a magic bullet will come along to make multicores speed up my kind of work; I think it's a pipe dream. (No-that's the wrong metaphor! "Pipelines" actually work for me, but threads don't. Maybe the word I want is "bubble.")

From the opposite point of view, I do grant that web browsing probably will get better with multicores. I've been talking about my technical work, however, not recreation. I also admit that I haven't got many bright ideas about what I wish hardware designers would provide instead of multicores, now that they've begun to hit a wall with respect to sequential computation. (But my MMIX design contains several ideas that would substantially improve the current performance of the kinds of programs that concern me most-at the cost of incompatibility with legacy x86 programs.)

Andrew: One of the few projects of yours that hasn't been embraced by a widespread community is literate programming. What are your thoughts about why literate programming didn't catch on? And is there anything you'd have done differently in retrospect regarding literate programming?


Literate programming is a very personal thing. I think it's terrific, but that might well be because I'm a very strange person. It has tens of thousands of fans, but not millions.

In my experience, software created with literate programming has turned out to be significantly better than software developed in more traditional ways. Yet ordinary software is usually okay-I'd give it a grade of C (or maybe C++), but not F; hence, the traditional methods stay with us. Since they're understood by a vast community of programmers, most people have no big incentive to change, just as I'm not motivated to learn Esperanto even though it might be preferable to English and German and French and Russian (if everybody switched).

Jon Bentley probably hit the nail on the head when he once was asked why literate programming hasn't taken the whole world by storm. He observed that a small percentage of the world's population is good at programming, and a small percentage is good at writing; apparently I am asking everybody to be in both subsets.

Yet to me, literate programming is certainly the most important thing that came out of the TeX project. Not only has it enabled me to write and maintain programs faster and more reliably than ever before, and been one of my greatest sources of joy since the 1980s-it has actually been indispensable at times. Some of my major programs, such as the MMIX meta-simulator, could not have been written with any other methodology that I've ever heard of. The complexity was simply too daunting for my limited brain to handle; without literate programming, the whole enterprise would have flopped miserably.

If people do discover nice ways to use the newfangled multithreaded machines, I would expect the discovery to come from people who routinely use literate programming. Literate programming is what you need to rise above the ordinary level of achievement. But I don't believe in forcing ideas on anybody. If literate programming isn't your style, please forget it and do what you like. If nobody likes it but me, let it die.

On a positive note, I've been pleased to discover that the conventions of CWEB are already standard equipment within preinstalled software such as Makefiles, when I get off-the-shelf Linux these days.

Andrew: In Fascicle 1 of Volume 1, you reintroduced the MMIX computer, which is the 64-bit upgrade to the venerable MIX machine comp-sci students have come to know over many years. You previously described MMIX in great detail in MMIXware. I've read portions of both books, but can't tell whether the Fascicle updates or changes anything that appeared in MMIXware, or whether it's a pure synopsis. Could you clarify?


Volume 1 Fascicle 1 is a programmer's introduction, which includes instructive exercises and such things. The MMIXware book is a detailed reference manual, somewhat terse and dry, plus a bunch of literate programs that describe prototype software for people to build upon. Both books define the same computer (once the errata to MMIXware are incorporated from my website). For most readers of TAOCP, the first fascicle contains everything about MMIX that they'll ever need or want to know.

I should point out, however, that MMIX isn't a single machine; it's an architecture with almost unlimited varieties of implementations, depending on different choices of functional units, different pipeline configurations, different approaches to multiple-instruction-issue, different ways to do branch prediction, different cache sizes, different strategies for cache replacement, different bus speeds, etc. Some instructions and/or registers can be emulated with software on "cheaper" versions of the hardware. And so on. It's a test bed, all simulatable with my meta-simulator, even though advanced versions would be impossible to build effectively until another five years go by (and then we could ask for even further advances just by advancing the meta-simulator specs another notch).

Suppose you want to know if five separate multiplier units and/or three-way instruction issuing would speed up a given MMIX program. Or maybe the instruction and/or data cache could be made larger or smaller or more associative. Just fire up the meta-simulator and see what happens.

Andrew: As I suspect you don't use unit testing with MMIXAL, could you step me through how you go about making sure that your code works correctly under a wide variety of conditions and inputs? If you have a specific work routine around verification, could you describe it?


Most examples of machine language code in TAOCP appear in Volumes 1-3; by the time we get to Volume 4, such low-level detail is largely unnecessary and we can work safely at a higher level of abstraction. Thus, I've needed to write only a dozen or so MMIX programs while preparing the opening parts of Volume 4, and they're all pretty much toy programs-nothing substantial. For little things like that, I just use informal verification methods, based on the theory that I've written up for the book, together with the MMIXAL assembler and MMIX simulator that are readily available on the Net (and described in full detail in the MMIXware book).

That simulator includes debugging features like the ones I found so useful in Ed Satterthwaite's system for ALGOL W, mentioned earlier. I always feel quite confident after checking a program with those tools.

Andrew: Despite its formulation many years ago, TeX is still thriving, primarily as the foundation for LaTeX. While TeX has been effectively frozen at your request, are there features that you would want to change or add to it, if you had the time and bandwidth? If so, what are the major items you would add or change?


I believe changes to TeX would cause much more harm than good. Other people who want other features are creating their own systems, and I've always encouraged further development-except that nobody should give their program the same name as mine. I want to take permanent responsibility for TeX and Metafont, and for all the nitty-gritty things that affect existing documents that rely on my work, such as the precise dimensions of characters in the Computer Modern fonts.

Andrew: One of the little-discussed aspects of software development is how to do design work on software in a completely new domain. You were faced with this issue when you undertook TeX: No prior art was available to you as source code, and it was a domain in which you weren't an expert. How did you approach the design, and how long did it take before you were comfortable entering into the coding portion?


That's another good question! I've discussed the answer in great detail in Chapter 10 of my book Literate Programming, together with Chapters 1 and 2 of my book Digital Typography. I think that anybody who is really interested in this topic will enjoy reading those chapters. (See also Digital Typography Chapters 24 and 25 for the complete first and second drafts of my initial design of TeX in 1977.)

Andrew: The books on TeX and the program itself show a clear concern for limiting memory usage-an important problem for systems of that era. Today, the concern for memory usage in programs has more to do with cache sizes. As someone who has designed a processor in software, the issues of cache-aware and cache-oblivious algorithms surely must have crossed your radar screen. Is the role of processor caches on algorithm design something that you expect to cover, even if indirectly, in your upcoming work?


I mentioned earlier that MMIX provides a test bed for many varieties of cache. And it's a software-implemented machine, so we can perform experiments that will be repeatable even a hundred years from now. Certainly the next editions of Volumes 1-3 will discuss the behavior of various basic algorithms with respect to different cache parameters.

In Volume 4 so far, I count about a dozen references to cache memory and cache-friendly approaches (not to mention a "memo cache," which is a different but related idea in software).

Andrew: What set of tools do you use today for writing TAOCP? Do you use TeX? LaTeX? CWEB? Word processor? And what do you use for the coding?


My general working style is to write everything first with pencil and paper, sitting beside a big wastebasket. Then I use Emacs to enter the text into my machine, using the conventions of TeX. I use tex, dvips, and gv to see the results, which appear on my screen almost instantaneously these days. I check my math with Mathematica.

I program every algorithm that's discussed (so that I can thoroughly understand it) using CWEB, which works splendidly with the GDB debugger. I make the illustrations with MetaPost (or, in rare cases, on a Mac with Adobe Photoshop or Illustrator). I have some homemade tools, like my own spell-checker for TeX and CWEB within Emacs. I designed my own bitmap font for use with Emacs, because I hate the way the ASCII apostrophe and the left open quote have morphed into independent symbols that no longer match each other visually. I have special Emacs modes to help me classify all the tens of thousands of papers and notes in my files, and special Emacs keyboard shortcuts that make bookwriting a little bit like playing an organ. I prefer rxvt to xterm for terminal input. Since last December, I've been using a file backup system called backupfs, which meets my need beautifully to archive the daily state of every file.

According to the current directories on my machine, I've written 68 different CWEB programs so far this year. There were about 100 in 2007, 90 in 2006, 100 in 2005, 90 in 2004, etc. Furthermore, CWEB has an extremely convenient "change file" mechanism, with which I can rapidly create multiple versions and variations on a theme; so far in 2008 I've made 73 variations on those 68 themes. (Some of the variations are quite short, only a few bytes; others are 5KB or more. Some of the CWEB programs are quite substantial, like the 55-page BDD package that I completed in January.) Thus, you can see how important literate programming is in my life.

I currently use Ubuntu Linux, on a standalone laptop-it has no Internet connection. I occasionally carry flash memory drives between this machine and the Macs that I use for network surfing and graphics; but I trust my family jewels only to Linux. Incidentally, with Linux I much prefer the keyboard focus that I can get with classic FVWM to the GNOME and KDE environments that other people seem to like better. To each his own.

Andrew: You state in the preface of Fascicle 0 of Volume 4 of TAOCP that Volume 4 surely will comprise three volumes and possibly more. It's clear from the text that you're really enjoying writing on this topic. Given that, what is your confidence in the note posted on the TAOCP website that Volume 5 will see light of day by 2015?


If you check the Wayback Machine for previous incarnations of that web page, you will see that the number 2015 has not been constant.

You're certainly correct that I'm having a ball writing up this material, because I keep running into fascinating facts that simply can't be left out-even though more than half of my notes don't make the final cut.

Precise time estimates are impossible, because I can't tell until getting deep into each section how much of the stuff in my files is going to be really fundamental and how much of it is going to be irrelevant to my book or too advanced. A lot of the recent literature is academic one-upmanship of limited interest to me; authors these days often introduce arcane methods that outperform the simpler techniques only when the problem size exceeds the number of protons in the universe. Such algorithms could never be important in a real computer application. I read hundreds of such papers to see if they might contain nuggets for programmers, but most of them wind up getting short shrift.

From a scheduling standpoint, all I know at present is that I must someday digest a huge amount of material that I've been collecting and filing for 45 years. I gain important time by working in batch mode: I don't read a paper in depth until I can deal with dozens of others on the same topic during the same week. When I finally am ready to read what has been collected about a topic, I might find out that I can zoom ahead because most of it is eminently forgettable for my purposes. On the other hand, I might discover that it's fundamental and deserves weeks of study; then I'd have to edit my website and push that number 2015 closer to infinity.

Andrew: In late 2006, you were diagnosed with prostate cancer. How is your health today?


Naturally, the cancer will be a serious concern. I have superb doctors. At the moment I feel as healthy as ever, modulo being 70 years old. Words flow freely as I write TAOCP and as I write the literate programs that precede drafts of TAOCP. I wake up in the morning with ideas that please me, and some of those ideas actually please me also later in the day when I've entered them into my computer.

On the other hand, I willingly put myself in God's hands with respect to how much more I'll be able to do before cancer or heart disease or senility or whatever strikes. If I should unexpectedly die tomorrow, I'll have no reason to complain, because my life has been incredibly blessed. Conversely, as long as I'm able to write about computer science, I intend to do my best to organize and expound upon the tens of thousands of technical papers that I've collected and made notes on since 1962.

Andrew: On your website, you mention that the Peoples Archive recently made a series of videos in which you reflect on your past life. In segment 93, "Advice to Young People," you advise that people shouldn't do something simply because it's trendy. As we know all too well, software development is as subject to fads as any other discipline. Can you give some examples that are currently in vogue, which developers shouldn't adopt simply because they're currently popular or because that's the way they're currently done? Would you care to identify important examples of this outside of software development?


Hmm. That question is almost contradictory, because I'm basically advising young people to listen to themselves rather than to others, and I'm one of the others. Almost every biography of every person whom you would like to emulate will say that he or she did many things against the "conventional wisdom" of the day.

Still, I hate to duck your questions even though I also hate to offend other people's sensibilities-given that software methodology has always been akin to religion. With the caveat that there's no reason anybody should care about the opinions of a computer scientist/mathematician like me regarding software development, let me just say that almost everything I've ever heard associated with the term "extreme programming" sounds like exactly the wrong way to go...with one exception. The exception is the idea of working in teams and reading each other's code. That idea is crucial, and it might even mask out all the terrible aspects of extreme programming that alarm me.

I also must confess to a strong bias against the fashion for reusable code. To me, "re-editable code" is much, much better than an untouchable black box or toolkit. I could go on and on about this. If you're totally convinced that reusable code is wonderful, I probably won't be able to sway you anyway, but you'll never convince me that reusable code isn't mostly a menace.

Here's a question that you may well have meant to ask: Why is the new book called Volume 4 Fascicle 0, instead of Volume 4 Fascicle 1? The answer is that computer programmers will understand that I wasn't ready to begin writing Volume 4 of TAOCP at its true beginning point, because we know that the initialization of a program can't be written until the program itself takes shape. So I started in 2005 with Volume 4 Fascicle 2, after which came Fascicles 3 and 4. (Think of Star Wars, which began with Episode 4.)

[Apr 23, 2008] Dr. Dobb's Programming Languages Everyone Has a Favorite One by Deirdre Blake

I think the article misstates the position of Perl (according to the TIOBE index it is No. 6, above C#, Python, and Ruby). The author also underestimates the value and staying power of C. There is an inherent problem with the methodology as well (as with any metric based on web presence): it shows in the position of C#, which now looks stronger than Python and Perl (and maybe even PHP), as well as in the positions of bash, awk, and PowerShell. All the other claims should therefore be taken with a grain of salt...
April 23, 2008 | DDJ

From what Paul Jansen has seen, everyone has a favorite programming language.

Paul Jansen is managing director of TIOBE Software.

DDJ: Paul, can you tell us about the TIOBE Programming Community Index?

PJ: The TIOBE index tries to measure the popularity of programming languages by monitoring their web presence. The most popular search engines Google, Yahoo!, Microsoft, and YouTube are used to calculate these figures. YouTube has been added recently as an experiment (and only counts for 4 percent of the total). Since the TIOBE index has been published now for more than 6 years, it gives an interesting picture about trends in the area of programming languages. I started the index because I was curious to know whether my programming skills were still up to date and to know for which programming languages our company should create development tools. It is amazing to see that programming languages are something very personal. Every day we receive e-mails from people that are unhappy with the position of "their" specific language in the index. I am also a bit overwhelmed about the vast and constant traffic this index generates.

DDJ: Which language has moved to the top of the heap, so to speak, in terms of popularity, and why do you think this is the case?

PJ: If we take a look at the top 10 programming languages, not much has happened the last five years. Only Python entered the top 10, replacing COBOL. This comes as a surprise because the IT world is moving so fast that in most areas, the market is usually completely changed in five years time. Python managed to reach the top 10 because it is the truly object-oriented successor of Perl. Other winners of the last couple of years are Visual Basic, Ruby, JavaScript, C#, and D (a successor of C++). I expect in five years time there will be two main languages: Java and C#, closely followed by good-old Visual Basic. There is no new paradigm foreseen.

DDJ: Which languages seem to be losing ground?

PJ: C and C++ are definitely losing ground. There is a simple explanation for this. Languages without automated garbage collection are getting out of fashion. The chance of running into all kinds of memory problems is gradually outweighing the performance penalty you have to pay for garbage collection. Another language that has had its day is Perl. It was once the standard language for every system administrator and build manager, but now everyone has been waiting on a new major release for more than seven years. That is considered far too long.

DDJ: On the flip side, what other languages seem to be moving into the limelight?

PJ: It is interesting to observe that dynamically typed object-oriented (scripting) languages are evolving the most. A new language has hardly arrived on the scene, only to be immediately replaced by another new emerging language. I think this has to do with the increase in web programming. The web programming area demands a language that is easy to learn, powerful, and secure. New languages pop up every day, trying to be leaner and meaner than their predecessors. A couple of years ago, Ruby was rediscovered (thanks to Rails). Recently, Lua was the hype, but now other scripting languages such as ActionScript, Groovy, and Factor are about to claim a top 20 position. There is quite some talk on the Internet about the NBL (next big language). But although those web-programming languages generate a lot of attention, there is never a real breakthrough.

DDJ: What are the benefits of introducing coding standards into an organization? And how does an organization choose a standard that is a "right fit" for its development goals?

PJ: Coding standards help to improve the general quality of software. A good coding standard focuses on best programming practices (avoiding known language pitfalls), not only on style and naming conventions. Every language has its constructions that are perfectly legitimate according to its language definition but will lead to reliability, security, or maintainability problems. Coding standards help engineers to stick to a subset of a programming language to make sure that these problems do not occur. The advantage of introducing coding standards as a means to improve quality is that-once it is in place-it does not change too often. This is in contrast with dynamic testing. Every change in your program calls for a change in your dynamic tests. In short, dynamic tests are far more labor intensive than coding standards. On the other hand, coding standards can only take care of nonfunctional defects. Bugs concerning incorrectly implemented requirements remain undetected. The best way to start with coding standards is to download a code checker and tweak it to your needs. It is our experience that if you do not check the rules of your coding standard automatically, the coding standard will soon end as a dusty document on some shelf.
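In the spirit of "download a code checker and tweak it to your needs," here is a toy sketch of what an automated check looks like. The two rules and the sample file are illustrative, not any particular standard: one flags legacy backticks (prefer $(...)), the other flags the non-POSIX == inside a [ ] test.

```shell
#!/bin/bash
# Toy coding-standard checker with two illustrative rules for shell code.
sample=/tmp/standard-demo.sh
cat > "$sample" <<'EOF'
today=`date`
if [ $today == "" ]; then echo empty; fi
EOF

# Rule 1: legacy backticks (prefer $(...) command substitution)
grep -n '`' "$sample" | sed 's/^/backticks: line /'
# Rule 2: non-POSIX == inside a [ ] test (use = instead)
grep -n '\[ .*==' "$sample" | sed 's/^/non-POSIX ==: line /'
```

A real deployment would, as Jansen says, run such checks automatically (for example from a Makefile or commit hook) so the standard doesn't end up as a dusty document on a shelf.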

[Apr 18, 2008] CoScripter

A useful Firefox plug-in

CoScripter is a system for recording, automating, and sharing processes performed in a web browser such as printing photos online, requesting a vacation hold for postal mail, or checking flight arrival times. Instructions for processes are recorded and stored in easy-to-read text here on the CoScripter web site, so anyone can make use of them. If you are having trouble with a web-based process, check to see if someone has written a CoScript for it!

[Mar 12, 2008] Tiny Eclipse by Eric Suen

About: Tiny Eclipse is a distribution of Eclipse for development with dynamic languages for the Web, such as JSP, PHP, Ruby, Tcl, and Web services. It features a small download size, the ability to choose the features you want to install, and GUI installers for Win32 and Linux GTK x86.

[Jan 2, 2008] Java is becoming the new Cobol by Bill Snyder

"Simply put, developers are saying that Java slows them down"
Dec 28, 2007

Simply put, developers are saying that Java slows them down. "There were big promises that Java would solve incompatibility problems [across platforms]. But now there are different versions and different downloads, creating complications," says Peter Thoneny, CEO of, which produces a certified version of the open source Twiki wiki-platform software. "It has not gotten easier. It's more complicated," concurs Ofer Ronen, CEO of Sendori, which routes domain traffic to online advertisers and ad networks. Sendori has moved to Ruby on Rails. Ronen says Ruby offers pre-built structures - say, a shopping cart for an e-commerce site - that you'd have to code from the ground up using Java.

Another area of weakness is the development of mobile applications. Java's UI capabilities and its memory footprint simply don't measure up, says Samir Shah, CEO of software testing provider Zephyr. No wonder the mobile edition of Java has all but disappeared, and no wonder Google is creating its own version (Android).

These weaknesses are having a real effect. Late last month, Info-Tech Research Group said its survey of 1,850 businesses found .Net the choice over Java among businesses of all sizes and industries, thanks to its promotion via Visual Studio and SharePoint. "Microsoft is driving uptake of the .Net platform at the expense of Java," says George Goodall, a senior research analyst at Info-Tech.

One bit of good news: developers and analysts agree that Java is alive and well for internally developed enterprise apps. "On the back end, there is still a substantial amount of infrastructure available that makes Java a very strong contender," says Zephyr's Shah.

The Bottom Line: Now that Java is no longer the unchallenged champ for Internet-delivered apps, it makes sense for companies to find programmers who are skilled in the new languages. If you're a Java developer, now's the time to invest in new skills.

[Jan 2, 2008] Java is Becoming the New Cobol

I think that the author of this comment is deeply mistaken: the length of the code has a tremendous influence on the cost of maintenance and the number of errors, and here Java sucks.
Linux Today

It's true. Putting together an Enterprise-scale Java application takes a considerable amount of planning, design, and coordination. Scripted languages like Python are easier - just hack something out and you've got a working webapp by the end of the day.

But then you get called in at midnight, because a lot of the extra front-end work in Java has to do with the fact that the compiler is doing major datatype validation. You're a lot less likely to have something blow up after it went into production, since a whole raft of potential screw-ups get caught at build time.

Scripting systems like Python, Perl, PHP, etc. not only have late binding, but frequently have late compiling as well, so until the offending code is invoked, it's merely a coiled-up snake.
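The "coiled-up snake" point is easy to see in a few lines of bash, which behaves like the scripting languages named above: a function whose body contains an undefined command loads without complaint and fails only when it is actually invoked.

```shell
#!/bin/bash
# Late-binding sketch: the bad call below is not checked at load time.
coiled_snake () {
    no_such_command_xyz   # undefined; harmless while the function is dormant
}
echo "script loaded fine"
# Only now, when the offending code is invoked, does the error surface:
coiled_snake 2>/dev/null || echo "error surfaced at call time only"
```

A compiled language would have rejected the equivalent program at build time, which is exactly the trade-off the commenter is describing.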

In fact, after many years and many languages, I'm just about convinced that the amount of time and effort for producing a debugged major app in just about any high-level language is about the same.

Myself, I prefer an environment that keeps me from having to wear a pager. For those who need less sleep and more Instant Gratification, they're welcome to embrace the other end of the spectrum.

[Dec 28, 2007] My Programming Language History by Keith Waclena

It's pretty strange for a system admin and CGI programmer to prefer Python to Perl... It goes without saying that such evaluations should in any case be taken with a grain of salt. What makes this comparison interesting is that the author claims to have substantial programming experience in Perl 4, Tcl, and Python.

Before going any further, read the disclaimer!

Some languages I've used and how I've felt about them. This may help you figure out where I'm coming from. I'm only listing the highlights here, and not including the exotica (Trac, SAM76, Setl, Rec, Convert, J...) and languages I've only toyed with or programmed in my head (Algol 68, BCPL, APL, S-Algol, Pop-2 / Pop-11, Refal, Prolog...).
My first language. Very nearly turned me off programming before I got started. I hated it. Still do.
The first language I loved, Sail was Algol 60 with zillions of extensions, from dynamically allocated strings to Leap, a weird production system / logic programming / database language that I never understood (and now can barely recall), and access to every Twenex JSYS and TOPS 10 UUO! Pretty much limited to PDP-10s; supposedly reincarnated as MAINSAIL, but I never saw that.
The first language I actually became fluent in; also the first language I ever got paid to program in.
The first language I actually wrote halfway-decent sizable code in, developed a personal subroutine library for, wrote multi-platform code in, and used on an IBM mainframe (Spitbol -- but I did all my development under Twenex with Sitbol, thank goodness). I loved Snobol: I used to dream in it.
The first language I ever thought was great at first and then grew to loathe. Subset G only, but that was enough.
The first language I ever ran on my own computer; also the first language I ever wrote useful assembler in -- serial I/O routines for the Z/80 (my first assembly language was a handful of toy programs in IBM 360 assembly language, using the aptly-named SPASM assembler), and the first language I thought was really wonderful but had a really difficult time writing useful programs in. Also the first language whose implementation I actually understood, the first language that really taught me about hardware, and the first language implementation I installed myself. Oh and the first language I taught.
What can I say here? It's unpleasant, but it works, it's everywhere, and you have to use it.
The first language I thought was truly brilliant and still think so to this day. I programmed in Maclisp and Muddle (MDL) at first, on the PDP-10; Franz and then Common Lisp later (but not much).
How could you improve on Lisp? Scheme is how.
The first language I wrote at least hundreds of useful programs in (Perl 4 (and earlier) only). Probably the second language I thought was great and grew to loathe (for many of the same reasons I grew to loathe PL/I, interestingly enough -- but it took longer).
Lazy Functional Languages.
How could you improve on Scheme? Lazy functional languages is how, but can you actually do anything with them (except compile lazy functional languages, of course)?
My previous standard, daily language. It's got a lot of problems, and it's amazing that Tcl programs ever get around to terminating, but they do, and astonishingly quickly (given the execution model...). I've developed a large library of Tcl procs that allow me to whip up substantial programs really quickly, the mark of a decent language. And it's willing to dress up as Lisp to fulfill my kinky desires.
My current standard, daily language. Faster than Tcl, about as fast as Perl and with nearly as large a standard library, but with a reasonable syntax and real data structures. It's by no means perfect -- still kind of slow, not enough of an expression language to suit me, dynamically typed, no macro system -- but I'm really glad I found it.

[Oct 5, 2007] Turn Vim into a bash IDE By Joe 'Zonker' Brockmeier

June 11, 2007

By itself, Vim is one of the best editors for shell scripting. With a little tweaking, however, you can turn Vim into a full-fledged IDE for writing scripts. You could do it yourself, or you can just install Fritz Mehner's Bash Support plugin.

To install Bash Support, download the zip archive, copy it to your ~/.vim directory, and unzip the archive. You'll also want to edit your ~/.vimrc to include a few personal details; open the file and add these three lines:

let g:BASH_AuthorName   = 'Your Name'
let g:BASH_Email        = ''
let g:BASH_Company      = 'Company Name'

These variables will be used to fill in some headers for your projects, as we'll see below.

The Bash Support plugin works in the Vim GUI (gVim) and text mode Vim. It's a little easier to use in the GUI, and Bash Support doesn't implement most of its menu functions in Vim's text mode, so you might want to stick with gVim when scripting.

When Bash Support is installed, gVim will include a new menu, appropriately titled Bash. This puts all of the Bash Support functions right at your fingertips (or mouse button, if you prefer). Let's walk through some of the features, and see how Bash Support can make Bash scripting a breeze.

Header and comments

If you believe in using extensive comments in your scripts, and I hope you do, you'll really enjoy using Bash Support. It provides a number of functions that make it easy to add comments to your bash scripts and programs automatically, or with just a mouse click or a few keystrokes.

When you start a non-trivial script that will be used and maintained by others, it's a good idea to include a header with basic information -- the name of the script, usage, description, notes, author information, copyright, and any other info that might be useful to the next person who has to maintain the script. Bash Support makes it a breeze to provide this information. Go to Bash -> Comments -> File Header, and gVim will insert a header like this in your script:

#          FILE:
#         USAGE:  ./
#       OPTIONS:  ---
#          BUGS:  ---
#         NOTES:  ---
#        AUTHOR:  Joe Brockmeier,
#       COMPANY:  Dissociated Press
#       VERSION:  1.0
#       CREATED:  05/25/2007 10:31:01 PM MDT
#      REVISION:  ---

You'll need to fill in some of the information, but Bash Support grabs the author, company name, and email address from your ~/.vimrc, and fills in the file name and created date automatically. To make life even easier, if you start Vim or gVim with a new file that ends with an .sh extension, it will insert the header automatically.

As you're writing your script, you might want to add comment blocks for your functions as well. To do this, go to Bash -> Comment -> Function Description to insert a block of text like this:

#===  FUNCTION  ================================================================
#          NAME:
#       RETURNS:

Just fill in the relevant information and carry on coding.

The Comments menu also allows you to insert other types of comments, insert the current date and time, and turn selected code into a comment (and vice versa).

Statements and snippets

Let's say you want to add an if-else statement to your script. You could type out the statement, or you could just use Bash Support's handy selection of pre-made statements. Go to Bash -> Statements and you'll see a long list of pre-made statements that you can just plug in and fill in the blanks. For instance, if you want to add a while statement, you can go to Bash -> Statements -> while, and you'll get the following:

while _; do
done

The cursor will be positioned where the underscore (_) is above. All you need to do is add the test statement and the actual code you want to run in the while statement. Sure, it'd be nice if Bash Support could do all that too, but there's only so far an IDE can help you.
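For instance, filling in the skeleton's blanks might look like this (a plain bash sketch, independent of the plugin):

```shell
# The skeleton with a test expression and a body filled in: count to three
i=1
while [ "$i" -le 3 ]; do
    echo "iteration $i"    # prints iteration 1 .. iteration 3
    i=$((i + 1))
done
```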

However, you can help yourself. When you do a lot of bash scripting, you might have functions or code snippets that you reuse in new scripts. Bash Support allows you to add your snippets and functions by highlighting the code you want to save, then going to Bash -> Statements -> write code snippet. When you want to grab a piece of prewritten code, go to Bash -> Statements -> read code snippet. Bash Support ships with a few included code fragments.

Another way to add snippets to the statement collection is to just place a text file with the snippet under the ~/.vim/bash-support/codesnippets directory.
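For example, a hypothetical "die" helper could be saved from the shell like this; the directory path is the one named above, while the snippet body itself is made up for illustration.

```shell
# Drop a plain text file into the snippets directory; bash-support will then
# offer it under "read code snippet". The "die" helper is a made-up example.
mkdir -p ~/.vim/bash-support/codesnippets
cat > ~/.vim/bash-support/codesnippets/die <<'EOF'
die() {
    echo "$0: fatal: $*" >&2
    exit 1
}
EOF
```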

Running and debugging scripts

Once you have a script ready to go, it's testing and debugging time. You could exit Vim, make the script executable, run it to see if it has any bugs, and then go back to Vim to edit it, but that's tedious. Bash Support lets you stay in Vim while doing your testing.

When you're ready to make the script executable, just choose Bash -> Run -> make script executable. To save and run the script, press Ctrl-F9, or go to Bash -> Run -> save + run script.
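Under the hood these menu items amount to the usual shell steps, roughly:

```shell
# A sketch of what "make script executable" and "save + run" boil down to
cat > hello.sh <<'EOF'
#!/bin/bash
echo "Hello from Bash Support"
EOF
chmod +x hello.sh      # Bash -> Run -> make script executable
./hello.sh             # prints: Hello from Bash Support
```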

Bash Support also lets you call the bash debugger (bashdb) directly from within Vim. On Ubuntu, it's not installed by default, but that's easily remedied with apt-get install bashdb. Once it's installed, you can debug the script you're working on with F9 or Bash -> Run -> start debugger.

If you want a "hard copy" -- a PostScript printout -- of your script, you can generate one by going to Bash -> Run -> hardcopy to FILENAME.ps. This is where Bash Support comes in handy for any type of file, not just bash scripts: you can use this function within any file to generate a PostScript printout.

Bash Support has several other functions to help run and test scripts from within Vim. One useful feature is syntax checking, which you can access with Alt-F9. If you have no syntax errors, you'll get a quick OK. If there are problems, you'll see a small window at the bottom of the Vim screen with a list of syntax errors. From that window you can highlight the error and press Enter, and you'll be taken to the line with the error.
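You can reproduce the same check outside Vim with bash's own -n option, which parses a script without executing it (whether the plugin uses exactly this mechanism is an assumption):

```shell
# bash -n parses without running; a missing "fi" is reported with a line number
cat > broken.sh <<'EOF'
#!/bin/bash
if [ -f /etc/passwd ]; then
    echo ok
# "fi" is missing here
EOF
bash -n broken.sh || echo "syntax error detected"
```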

Put away the reference book...

Don't you hate it when you need to include a regular expression or a test in a script, but can't quite remember the syntax? That's no problem when you're using Bash Support, because you have Regex and Tests menus with all you'll need. For example, if you need to verify that a file exists and is owned by the correct user ID (UID), go to Bash -> Tests -> file exists and is owned by the effective UID. Bash Support will insert the appropriate test ([ -O _]) with your cursor in the spot where you have to fill in the file name.
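With the blank filled in, the completed test reads like this; a freshly created temp file is owned by the effective UID, so -O succeeds:

```shell
# [ -O file ] is true when the file exists and is owned by the effective UID
tmp=$(mktemp)
if [ -O "$tmp" ]; then
    echo "owned by effective UID"   # prints, since mktemp created it as us
fi
rm -f "$tmp"
```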

To build regular expressions quickly, go to the Bash menu, select Regex, then pick the appropriate expression from the list. It's fairly useful when you can't remember exactly how to express "zero or one" or other regular expressions.
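"Zero or one", for instance, is the ? quantifier in an extended regular expression; a quick command-line check:

```shell
# colou?r matches both spellings; the ^...$ anchors reject "colors"
printf 'color\ncolour\ncolors\n' | grep -E -c '^colou?r$'   # prints 2
```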

Bash Support also includes menus for environment variables, bash builtins, shell options, and a lot more.

Hotkey support

Vim users can access many of Bash Support's features using hotkeys. While not as simple as clicking the menu, the hotkeys do follow a logical scheme that makes them easy to remember. For example, all of the comment functions are accessed with \c, so if you want to insert a file header, you use \ch; if you want a date inserted, type \cd; and for a line end comment, use \cl.

Statements can be accessed with \a. Use \ac for a case statement, \aie for an "if then else" statement, \af for a "for in..." statement, and so on. Note that the online docs are incorrect here, and indicate that statements begin with \s, but Bash Support ships with a PDF reference card (under .vim/bash-support/doc/bash-hot-keys.pdf) that gets it right.

Run commands are accessed with \r. For example, to save the file and run a script, use \rr; to make a script executable, use \re; and to start the debugger, type \rd. I won't try to detail all of the shortcuts, but you can pull up a reference using :help bashsupport-usage-vim when in Vim, or use the PDF. The full Bash Support reference is available within Vim by running :help bashsupport, or you can read it online.

Of course, we've covered only a small part of Bash Support's functionality. The next time you need to whip up a shell script, try it using Vim with Bash Support. This plugin makes scripting in bash a lot easier.

[Sep 26, 2007] Beware Exotic tools can kill you!

Once in a while you may come across what seems to be a 'killer' piece of software, or maybe a cool new programming language - something that appears to give you some advantage.

That MAY be the case, but many times it isn't really so - think twice before you leap!

Consider these points:

Do you notice a pattern here?

Yes, it's all about time. All this junk (software, programming languages, markup languages etc…) have one purpose in the end: to save you time.

Keep that in mind when you approach things - ask yourself:

'Will using this save me time?'

[Sep 26, 2007] Will Ruby kill PHP


OO is definitely overkill for a lot of web projects. It seems to me that so many people use OO frameworks like Ruby and Zope because "it's enterprise level". But using an 'enterprise' framework for small to medium sized web applications just adds so much overhead and frustration at having to learn the framework that it just doesn't seem worth it to me.

Having said all this I must point out that I'm distrustful of large corporations and hate their dehumanising hierarchical structure. Therefore I am naturally drawn towards open source and away from the whole OO/enterprise/hierarchy paradigm. Maybe people want to push open source to the enterprise level in the hope that they will adopt the technology and therefore they will have more job security. Get over it - go and learn Java and .NET if you want job security, and preserve open source software as an oasis of freedom away from the corporate world. Just my 2c


OOP has its place, but the diversity of frameworks is just as challenging to figure out as a new class you didn't write, if not more. None of them work the same or keep a standard convention between them that makes learning them easier. Frameworks are great, but sometimes I think maybe they don't all have to be OO. I keep a small personal library of functions I've (and others have) written procedurally and include them just like I would a class.

Beyond the overhead issues is complexity. OOP has you chasing declarations over many files to figure out what's happening. If you're trying to learn how that unique class you need works, it can be time-consuming to read through it and see how the class is structured. By the time you're done you may as well have written the class yourself; at least by then you'd have a solid understanding. Encapsulation and polymorphism have their advantages, but the cost is complexity, which can equal time. And for smaller projects that will likely never expand, that time and energy can be a waste.

Not trying to bash OOP, just to defend procedural style. They each have their place.


Sorry, but I don't like your text, because you mix Ruby and Ruby on Rails a lot. Ruby is in my opinion easier to use than PHP, because PHP has no design principle beside "make it work, somehow easy to use". Ruby has some really cool stuff I miss quite often when I have to program in PHP again (blocks, for example), and has a clearer, more logical syntax.

Ruby on Rails is of course not that easy to use, at least when speaking about small-scale projects. This is because it does a lot more than PHP does. Of course, there are other good reasons to prefer PHP over Rails (like the better support by providers, more modules, more documentation), but in my opinion most projects done in PHP with the complexity of a blog could profit from being programmed in Rails, from the purely technical point of view. At least I won't program in PHP again unless a customer asks me.


I have a reasonable level of experience with PHP and Python but unfortunately haven't touched Ruby yet. They both seem to be a good choice for low-complexity projects. I can even say that I like Python a lot. But I would never consider it again for projects where design is an issue. They also say it is for (rapid) prototyping. My experience is that as long as you can't afford a proper IDE, Python is maybe the best place to go. But a properly "equipped" environment can formidably boost your productivity with a statically typed language like Java. In that case Python's advantage shrinks to the benefits of quick tests accessible through its command line.

Another problem of Python is that it wants to be everything: simple and complete, flexible and structured, high-level while allowing for low-level programming. The result is a series of obscure features.

Having said all that I must give Python all the credits of a good language. It's just not perfect. Maybe it's Ruby :-)
My apologies for not sticking too closely to the subject of the article.


The one thing I hate is OOP geeks trying to prove that they can write code that does nothing useful and nobody understands.

"You don't have to use OOP in ruby! You can do it PHP way! So you better do your homework before making such statements!"

Then why use ruby in the first place?

"What is really OVERKILL to me is to know the hundreds of functions PHP provides out of the box, available in ANY scope! So I have to be extra careful whether I can use some name. And the more functions - the bigger the MESS."

On the other hand, in ruby you use only functions available for particular object you use.

I would rather say "some text".length than strlen("some text"), which is much more meaningful! The Ruby language itself is much more descriptive. I remember, from my old PHP days, always having to look up the appropriate function, but now I can just guess!"

Yeah, you must have a weak memory and can't remember whether strlen() is for strings or for numbers….

Doesn't Ruby have the same number of functions, just stored in objects?

Look, if you can't remember strlen, then invent your own classes; you can make a whole useless OOP framework for PHP in a day……

[Sep 26, 2007] Will Ruby on Rails kill .net and Java

www dot james mckay dot net

Ruby on Rails 1.1 has been released.

Dion Hinchcliffe has posted a blog entry at the end of which he asks the question, will it be a nail in the coffin for .net and Java?

Rails certainly looks beautiful. It is fully object oriented, with built in O/R mapping, powerful AJAX support, an elegant syntax, a proper implementation of the Model-View-Controller design pattern, and even a Ruby to Javascript converter which lets you write client side web code in Ruby.

However, I don't think it's the end of the line for C# and Java by a long shot. Even if it does draw a lot of fire, there is a heck of a lot of code knocking around in these languages, and there likely still will be for a very long time to come. Even throwaway code and hacked together interim solutions have a habit of living a lot longer than anyone ever expects. Look at how much code is still out there in Fortran, COBOL and Lisp, for instance.

Like most scripting languages such as Perl, Python, PHP and so on, Ruby is still a dynamically typed language. For this reason it will be slower than statically typed languages such as C#, C++ and Java. So it won't be used so much in places where you need lots of raw power. However, most web applications don't need such raw power in the business layer. The main bottleneck in web development is database access and network latency in communicating with the browser, so using C# rather than Rails would have only a very minor impact on performance. But some of them do, and in such cases the solutions often have different parts of the application written in different languages and even running on different servers. One of the solutions that we have developed, for instance, has a web front end in PHP running on a Linux box, with a back end application server running a combination of Python and C++ on a Windows server.

Rails certainly knocks the spots off PHP though…

[May 7, 2007] The Hundred-Year Language by Paul Graham

April 2003

(Keynote from PyCon2003)

...I have a hunch that the main branches of the evolutionary tree pass through the languages that have the smallest, cleanest cores. The more of a language you can write in itself, the better.

...Languages evolve slowly because they're not really technologies. Languages are notation. A program is a formal description of the problem you want a computer to solve for you. So the rate of evolution in programming languages is more like the rate of evolution in mathematical notation than, say, transportation or communications. Mathematical notation does evolve, but not with the giant leaps you see in technology.

...I learned to program when computer power was scarce. I can remember taking all the spaces out of my Basic programs so they would fit into the memory of a 4K TRS-80. The thought of all this stupendously inefficient software burning up cycles doing the same thing over and over seems kind of gross to me. But I think my intuitions here are wrong. I'm like someone who grew up poor, and can't bear to spend money even for something important, like going to the doctor.

Some kinds of waste really are disgusting. SUVs, for example, would arguably be gross even if they ran on a fuel which would never run out and generated no pollution. SUVs are gross because they're the solution to a gross problem. (How to make minivans look more masculine.) But not all waste is bad. Now that we have the infrastructure to support it, counting the minutes of your long-distance calls starts to seem niggling. If you have the resources, it's more elegant to think of all phone calls as one kind of thing, no matter where the other person is.

There's good waste, and bad waste. I'm interested in good waste-- the kind where, by spending more, we can get simpler designs. How will we take advantage of the opportunities to waste cycles that we'll get from new, faster hardware?

The desire for speed is so deeply engrained in us, with our puny computers, that it will take a conscious effort to overcome it. In language design, we should be consciously seeking out situations where we can trade efficiency for even the smallest increase in convenience.

Most data structures exist because of speed. For example, many languages today have both strings and lists. Semantically, strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? You don't, really. Strings only exist for efficiency. But it's lame to clutter up the semantics of the language with hacks to make programs run faster. Having strings in a language seems to be a case of premature optimization.

... Inefficient software isn't gross. What's gross is a language that makes programmers do needless work. Wasting programmer time is the true inefficiency, not wasting machine time. This will become ever more clear as computers get faster

... Somehow the idea of reusability got attached to object-oriented programming in the 1980s, and no amount of evidence to the contrary seems to be able to shake it free. But although some object-oriented software is reusable, what makes it reusable is its bottom-upness, not its object-orientedness. Consider libraries: they're reusable because they're language, whether they're written in an object-oriented style or not.

I don't predict the demise of object-oriented programming, by the way. Though I don't think it has much to offer good programmers, except in certain specialized domains, it is irresistible to large organizations. Object-oriented programming offers a sustainable way to write spaghetti code. It lets you accrete programs as a series of patches. Large organizations always tend to develop software this way, and I expect this to be as true in a hundred years as it is today.

... As this gap widens, profilers will become increasingly important. Little attention is paid to profiling now. Many people still seem to believe that the way to get fast applications is to write compilers that generate fast code. As the gap between acceptable and maximal performance widens, it will become increasingly clear that the way to get fast applications is to have a good guide from one to the other.

...One of the most exciting trends in the last ten years has been the rise of open-source languages like Perl, Python, and Ruby. Language design is being taken over by hackers. The results so far are messy, but encouraging. There are some stunningly novel ideas in Perl, for example. Many are stunningly bad, but that's always true of ambitious efforts. At its current rate of mutation, God knows what Perl might evolve into in a hundred years.

... One helpful trick here is to use the length of the program as an approximation for how much work it is to write. Not the length in characters, of course, but the length in distinct syntactic elements-- basically, the size of the parse tree. It may not be quite true that the shortest program is the least work to write, but it's close enough that you're better off aiming for the solid target of brevity than the fuzzy, nearby one of least work. Then the algorithm for language design becomes: look at a program and ask, is there any way to write this that's shorter?

[Dec 15, 2006] Ralph Griswold died

Ralph Griswold, the creator of the Snobol and Icon programming languages, died in October 2006 of cancer. Until recently Computer Science was a discipline where the founders were still around. That's changing. Griswold was an important pioneer of programming language design; Snobol's string manipulation facilities are different from, and somewhat faster than, regular expressions.
Lambda the Ultimate

Ralph Griswold died two weeks ago. He created several programming languages, most notably Snobol (in the 60s) and Icon (in the 70s) - both outstandingly innovative, integral, and efficacious in their areas. Despite the abundance of scripting and other languages today, Snobol and Icon are still unsurpassed in many respects, both as elegance of design and as practicality.

[Dec 15, 2006] Ralph Griswold

See also Ralph Griswold 1934-2006 and Griswold Memorial Endowment
Ralph E. Griswold died in Tucson on October 4, 2006, of complications from pancreatic cancer. He was Regents Professor Emeritus in the Department of Computer Science at the University of Arizona.

Griswold was born in Modesto, California, in 1934. He was an award winner in the 1952 Westinghouse National Science Talent Search and went on to attend Stanford University, culminating in a PhD in Electrical Engineering in 1962.

Griswold joined the staff of Bell Telephone Laboratories in Holmdel, New Jersey, and rose to become head of Programming Research and Development. In 1971, he came to the University of Arizona to found the Department of Computer Science, and he served as department head through 1981. His insistence on high standards brought the department recognition and respect. In recognition of his work, the university granted him the title of Regents Professor in 1990.

While at Bell Labs, Griswold led the design and implementation of the groundbreaking SNOBOL4 programming language with its emphasis on string manipulation and high-level data structures. At Arizona, he developed the Icon programming language, a high-level language whose influence can be seen in Python and other recent languages.

Griswold authored numerous books and articles about computer science. After retiring in 1997, his interests turned to weaving. While researching mathematical aspects of weaving design he collected and digitized a large library of weaving documents and maintained a public website. He published technical monographs and weaving designs that inspired the work of others, and he remained active until his final week.

-----Gregg Townsend Staff Scientist The University of Arizona

[Dec 15, 2006] The mythical open source miracle by Neil McAllister

Actually, Spolsky does not understand the role of scripting languages. But he is right on target with his critique of OO. Object-oriented programming is no silver bullet.
Dec 14, 2006 | Computerworld

(InfoWorld) Joel Spolsky is one of our most celebrated pundits on the practice of software development, and he's full of terrific insight. In a recent blog post, he decries the fallacy of "Lego programming" -- the all-too-common assumption that sophisticated new tools will make writing applications as easy as snapping together children's toys. It simply isn't so, he says -- despite the fact that people have been claiming it for decades -- because the most important work in software development happens before a single line of code is written.

By way of support, Spolsky reminds us of a quote from the most celebrated pundit of an earlier generation of developers. In his 1987 essay "No Silver Bullet," Frederick P. Brooks wrote, "The essence of a software entity is a construct of interlocking concepts ... I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation ... If this is true, building software will always be hard. There is inherently no silver bullet."

As Spolsky points out, in the 20 years since Brooks wrote "No Silver Bullet," countless products have reached the market heralded as the silver bullet for effortless software development. Similarly, in the 30 years since Brooks published " The Mythical Man-Month" -- in which, among other things, he debunks the fallacy that if one programmer can do a job in ten months, ten programmers can do the same job in one month -- product managers have continued to buy into various methodologies and tricks that claim to make running software projects as easy as stacking Lego bricks.

Don't you believe it. If, as Brooks wrote, the hard part of software development is the initial design, then no amount of radical workflows or agile development methods will get a struggling project out the door, any more than the latest GUI rapid-development toolkit will.

And neither will open source. Too often, commercial software companies decide to turn over their orphaned software to "the community" -- if such a thing exists -- in the naive belief that open source will be a miracle cure to get a flagging project back on track. This is just another fallacy, as history demonstrates.

In 1998, Netscape released the source code to its Mozilla browser to the public to much fanfare, but only lukewarm response from developers. As it turned out, the Mozilla source was much too complex and of too poor quality for developers outside Netscape to understand it. As Jamie Zawinski recounts, the resulting decision to rewrite the browser's rendering engine from scratch derailed the project anywhere from six to ten months.

This is a classic example of the fallacy of the mythical man-month. The problem with the Mozilla code was poor design, not lack of an able workforce. Throwing more bodies at the project didn't necessarily help; it may have even hindered it. And while implementing a community development process may have allowed Netscape to sidestep its own internal management problems, it was certainly no silver bullet for success.

The key to developing good software the first time around is doing the hard work at the beginning: good design, and rigorous testing of that design. Fail that, and you've got no choice but to take the hard road. As Brooks observed all those years ago, successful software will never be easy. No amount of open source process will change that, and to think otherwise is just more Lego-programming nonsense.

[Oct 26, 2006] Cobol Not Dead Yet

It's interesting that Perl is at 30% in this survey (as unscientific as it is)
What programming languages do you use in your organization? Choose all that apply.
Visual Basic - 67%
Cobol - 62%
Java - 61%
JavaScript - 55%
VB.Net - 47%
C++ - 47%
Perl - 30%
C - 26%
C# - 23%
ColdFusion - 15%
PHP - 13%
Fortran - 7%
PL/1 - 5%
Python - 5%
Pascal - 4%
Ada - 2%
Source: Computerworld survey of 352 readers

[Oct 25, 2006] Sun Gets Rubyfied by Jon Erickson

DDJ Portal Blog

On the heels of last weekend's Ruby Conference in Denver (for a report, see Jack Woehr's blog), Sun Microsystems made a Ruby-related announcement of its own. Led by Charles Nutter and Thomas Enebo, the chief maintainers of JRuby, a 100% pure Java implementation of the Ruby language, Sun has released JRuby 0.9.1. Among the features of this release are:

In related news, Ola Bini has been inducted into JRuby as a core developer during this development cycle.
Details are available at Thomas Enebo's blog and Ola Bini's blog.

[Sep 30, 2006] Ruby Book Sales Surpass Python

O'Reilly Radar

I was just looking at our BookScan data mart to update a reporter on Java vs. C# adoption. (The answer to his query: in the last twelve weeks, Java book sales are off 4% vs. the same period last year, while C# book sales are up 16%.) While I was looking at the data, though, I noticed something perhaps more newsworthy: in the same period, Ruby book sales surpassed Python book sales for the first time. Python is up 20% vs. the same period last year, but Ruby is up 1552%! (Perl is down 3%.) Perl is still the most commonly used of the three languages, at least according to book sales, but Python and now Ruby are narrowing the gap.

[Sep 30, 2006] Industry demands Python, not Ruby or Rails

Andrew L Smith

RoR, AJAX, SOA -- these are hype. The reality is that Java, JSP, PHP, Python, Perl, and Tcl/Tk are what people use these days. And if it's a web app, they use PHP or JSP.

RoR is too time-consuming. It takes too long to learn and provides too few results in that timeframe compared to what a PHP developer can deliver.

AJAX is also time-consuming and assumes too much about the stability of the browser client. And it puts way too much power into ugly Javascript. It's good for only a smidgen of things in only a smidgen of environments.

Java and JSP are around only because of seniority. They were hyped and backed by big companies as the only other option to Microsoft's stuff. Now we're stuck with them. In reality, JSP and Java programmers spend too much time in meetings going over OOP minutiae. PHP programmers may use a little OOP, but they focus most on just getting it done.

Python seems to have taken hold on Linux only as far as a rich client environment. It's not used enough for web apps.

By supermike, at 9:41 PM

[Sep 30, 2006] The departure of the hyper-enthusiasts

A couple of additional feedback posts
Weblogs Forum

Re: The departure of the hyper-enthusiasts Posted: Dec 18, 2005 5:54 PM

The issue at hand is comparable to the "to use or not to use EJB". I, too, had a bad time trying to use EJBs, so maybe you can demonstrate some sympathy for a now Ruby user who can't seem to use any other language.

I claim that Ruby is simple enough for me to concentrate on the problem and not on the language (primary tool). Maybe for you Python is cleaner, but to me Python is harder than Ruby when I try to read the code. Ruby has a nice convention of "CamelCase", "methods_names", "", etc, that make the code easier to read than a similar Python code, because in Python you don't have a good convention for that. Also, when I require 'afile.rb' in Ruby, it's much easier to read than the "import this.that.Something" in Python. Thus, despite the forced indentation, I prefer the way that the Ruby code looks and feels in comparison to the Python code.

On the supported libraries, Python has a very good selection, indeed. I would say that the Python libraries might be very good in comparison to the Ruby libraries. On the other hand, Ruby has very unique libraries which feel good to use. So, even if Python has more libraries, Ruby should have some quality libraries that compensate a lot for the difference. By considering that one should be well served using Ruby or Python in terms of libraries, Python's advantage over Ruby diminishes quite a bit.

Finally, when you are up to the task of creating something new, like a library or program, you may be much more well served by using Ruby if you get to the point of fluid Ruby programming. But, if all you want is to create some web app, maybe Rails already fulfills your requirements.

Even if you consider Python better, for example, because Google uses it and you want to work for Google or something, that won't make us Ruby users give up on improving the language and the available tools. I simply love Ruby and I will keep using it for the foreseeable future -- in the future, if I can make a major contribution to the Ruby community, I will.

[Sep 28, 2006] Rants Get Famous By Not Programming -- scripting language as a framework

The author is really incoherent in this rant (also he cannot be right by definition, as he loves Emacs ;-). The key problem with the arguments presented is that he mixes apples with oranges (greatness of a programmer as an artist and greatness of a programmer as an innovator). A strong point of this rant is the idea that easy extensibility is a huge advantage and that openness of code does not matter much per se. Another good observation (made by many other authors) is that "Any sufficiently flexible and programmable environment - say Emacs, or Ruby on Rails, or Firefox, or even my game Wyvern - begins to take on characteristics of ... operating system as it grows."
Stevey's Blog

Any sufficiently flexible and programmable environment - say Emacs, or Ruby on Rails, or Firefox, or even my game Wyvern - begins to take on characteristics of both language and operating system as it grows. So I'm lumping together a big class of programs that have similar characteristics. I guess you could call them frameworks, or extensible systems.

... ... ...

Not that we'd really know, because how often do we go look at the source code for the frameworks we use? How much time have you spent examining the source code of your favorite programming language's compiler, interpreter or VM? And by the time such systems reach sufficient size and usefulness, how much of that code was actually penned by the original author?

Sure, we might go look at framework code sometimes. But it just looks like, well, code. There's usually nothing particularly famous-looking or even glamorous about it. Go look at the source code for Emacs or Rails or Python or Firefox, and it's just a big ball of code. In fact, often as not it's a big hairy ball, and the original author is focused on refactoring or even rewriting big sections of it.

[Sep 28, 2006] Rants Blogger's Block #4 Ruby and Java and Stuff

Stevey's Blog

I was in Barnes &amp; Noble today, doing my usual weekend stroll through the tech section. Helps me keep up on the latest trends. And wouldn't you know it, I skipped a few weeks there, and suddenly Ruby and Rails have almost as many books out as Python. I counted eleven Ruby/RoR titles tonight, and thirteen for Python (including one Zope book). And Ruby had a big display section at the end of one of the shelves.

Not all the publishers were O'Reilly and Pragmatic Press. I'm pretty sure there were two or three others there, so it's not just a plot by Tim O'Reilly to sell more books. Well, actually that's exactly what it is, but it's based on actual market research that led him to the conclusion that Rails and Ruby are both gathering steam like nobody's business.

... ... ...

I do a lot more programming in Python than in Ruby -- Jython in my game server, and Python at work, since that's what everyone there uses for scripting. I have maybe 3x more experience with Python than with Ruby (and 10x more experience with Perl). But Perl and Python both have more unnecessary conceptual overhead, so I find I have to consult the docs more often with both of them. And when all's said and done, Ruby code generally winds up being the most direct and succinct, whether it's mine or someone else's.

I have a lot of trouble writing about Ruby, because I find there's nothing to say. It's why I almost never post to the O'Reilly Ruby blog. Ruby seems so self-explanatory to me. It makes it almost boring; you try to focus on Ruby and you wind up talking about some problem domain instead of the language. I think that's the goal of all programming languages, but so far Ruby's one of the few to succeed at it so well.

... ... ...

I think next year Ruby's going to be muscling in on Perl in terms of mindshare, or shelf-share, at B&N.

[May 10, 2006] Google Code - Summer of Code

Among the participating organizations are PHP, the Python Software Foundation, and Ruby Central. See also the Student FAQ and Mentor FAQ.
Google Code

Welcome to the Summer of Code 2006 site. We are no longer accepting applications from students or mentoring organizations. Students can view previously submitted applications and respond to mentor comments via the student home page. Accepted student projects will be announced on May 23, 2006. You can talk to us in the Summer-Discuss-2006 group or via IRC in #summer-discuss on SlashNET.

If you're feeling nostalgic, you can still access the Summer of Code 2005 site.

[May 2, 2006] Embeddable scripting with Lua

While interpreted programming languages such as Perl, Python, PHP, and Ruby are increasingly favored for Web applications -- and have long been preferred for automating system administration tasks -- compiled programming languages such as C and C++ are still necessary. The performance of compiled programming languages remains nearly unmatched (exceeded only by the performance of hand-tuned assembly), and certain software -- including operating systems and device drivers -- can only be implemented efficiently using compiled code. Indeed, whenever software and hardware need to mesh seamlessly, programmers instinctively reach for a C compiler: C is primitive enough to get "close to the bare metal" -- that is, to capture the idiosyncrasies of a piece of hardware -- yet expressive enough to offer some high-level programming constructs, such as structures, loops, named variables, and scope.

However, scripting languages have distinct advantages, too. For example, after a language's interpreter is successfully ported to a platform, the vast majority of scripts written in that language run on the new platform unchanged -- free of dependencies such as system-specific function libraries. (Think of the many DLL files of the Microsoft® Windows® operating system or the many libcs of UNIX® and Linux®.) Additionally, scripting languages typically offer higher-level programming constructs and convenience operations, which programmers claim boost productivity and agility. Moreover, programmers working in an interpreted language can work faster, because the compilation and link steps are unnecessary. The "code, build, link, run" cycle of C and its ilk is reduced to a hastened "script, run."

Lua novelties

Like every scripting language, Lua has its own peculiarities:

Find more examples of Lua code in Programming in Lua and in the Lua-users wiki (for links, see the Resources section below).

As in all engineering pursuits, choosing between a compiled language and an interpreted language means measuring the pros and cons of each in context, weighing the trade-offs, and accepting compromises.

[Apr 10, 2006] Digg PHP's Scalability and Performance

O'Reilly ONLamp Blog

Several weeks ago there was a notable bit of controversy over some comments made by James Gosling, father of the Java programming language. He has since addressed the flame war that erupted, but the whole ordeal got me thinking seriously about PHP and its scalability and performance abilities compared to Java. I knew that several hugely popular Web 2.0 applications were written in scripting languages like PHP, so I contacted Owen Byrne, Senior Software Engineer at Digg, to learn how he addressed any problems they encountered during their meteoric growth. This article addresses the all-too-common false assumptions about the cost of scalability and performance in PHP applications.

At the time Gosling's comments were made, I was working on tuning and optimizing the source code and server configuration for the launch of Jobby, a Web 2.0 resume tracking application written using the WASP PHP framework. I really hadn't done any substantial research on how to best optimize PHP applications at the time. My background is heavy in the architecture and development of highly scalable applications in Java, but I realized there were enough substantial differences between Java and PHP to cause me concern. In my experience, it was certainly faster to develop web applications in languages like PHP; but I was curious as to how much of that time savings might be lost to performance tuning and scaling costs. What I found was both encouraging and surprising.

What are Performance and Scalability?

Before I go on, I want to make sure the ideas of performance and scalability are understood. Performance is measured by the output behavior of the application. In other words, performance is whether or not the app is fast. A good performing web application is expected to render a page in around or under 1 second (depending on the complexity of the page, of course). Scalability is the ability of the application to maintain good performance under heavy load with the addition of resources. For example, as the popularity of a web application grows, it can be called scalable if you can maintain good performance metrics by simply making small hardware additions. With that in mind, I wondered how PHP would perform under heavy load, and whether it would scale well compared with Java.

Hardware Cost

My first concern was raw horsepower. Executing scripting language code is more hardware intensive because the code isn't compiled. The hardware we had available for the launch of Jobby was a single hosted Linux server with a 2GHz processor and 1GB of RAM. On this single modest server I was going to have to run both Apache 2 and MySQL. Previous applications I had worked on in Java had been deployed on 10-20 application servers with at least 2 dedicated, massively parallel, ultra expensive database servers. Of course, these applications handled traffic in the millions of hits per month.

To get a better idea of what was in store for a heavily loaded PHP application, I set up an interview with Owen Byrne, cofounder and Senior Software Engineer at Digg. From talking with Owen I learned Digg gets on the order of 200 million page views per month, and they're able to handle it with only 3 web servers and 8 small database servers (I'll discuss the reason for so many database servers in the next section). Even better news was that they were able to handle their first year's worth of growth on a single hosted server like the one I was using. My hardware worries were relieved. The hardware requirements to run high-traffic PHP applications didn't seem to be more costly than for Java.

Database Cost

Next I was worried about database costs. The enterprise Java applications I had worked on were powered by expensive database software like Oracle, Informix, and DB2. I had decided early on to use MySQL for my database, which is of course free. I wondered whether the simplicity of MySQL would be a liability when it came to trying to squeeze the last bit of performance out of the database. MySQL has had a reputation for being slow in the past, but most of that seems to have come from sub-optimal configuration and the overuse of MyISAM tables. Owen confirmed that the use of InnoDB for tables for read/write data makes a massive performance difference.

There are some scalability issues with MySQL, one being the need for large numbers of slave databases. However, these issues are decidedly not PHP related, and are being addressed in future versions of MySQL. It could be argued that even with the large number of slave databases that are needed, the hardware required to support them is less expensive than the 8+ CPU boxes that typically power large Oracle or DB2 databases. The database requirements to run massive PHP applications still weren't more costly than for Java.

PHP Coding Cost

Lastly, and most importantly, I was worried about scalability and performance costs directly attributed to the PHP language itself. During my conversation with Owen I asked him if there were any performance or scalability problems he encountered that were related to having chosen to write the application in PHP. A bit to my surprise, he responded by saying, "none of the scaling challenges we faced had anything to do with PHP," and that "the biggest issues faced were database related." He even added, "in fact, we found that the lightweight nature of PHP allowed us to easily move processing tasks from the database to PHP in order to deal with that problem." Owen mentioned they use the APC PHP accelerator platform as well as MCache to lighten their database load. Still, I was skeptical. I had written Jobby entirely in PHP 5 using a framework which uses a highly object oriented MVC architecture to provide application development scalability. How would this hold up to large amounts of traffic?

My worries were largely related to the PHP engine having to effectively parse and interpret every included class on each page load. I discovered this was just my misunderstanding of the best way to configure a PHP server. After doing some research, I found that by using a combination of Apache 2's worker threads, FastCGI, and a PHP accelerator, this was no longer a problem. Any class or script loading overhead was only encountered on the first page load. Subsequent page loads were comparable in performance to a typical Java application. Making these configuration changes was trivial and generated massive performance gains. With regard to scalability and performance, PHP itself, even PHP 5 with heavy OO, was not more costly than Java.


Jobby was launched successfully on its single modest server and, thanks to links from Ajaxian and TechCrunch, went on to happily survive hundreds of thousands of hits in a single week. Assuming I applied all of my newfound PHP tuning knowledge correctly, the application should be able to handle much more load on its current hardware.

Digg is in the process of preparing to scale to 10 times current load. I asked Owen Byrne if that meant an increase in headcount and he said that wasn't necessary. The only real change they identified was a switch to a different database platform. There doesn't seem to be any additional manpower cost to PHP scalability either.

It turns out that it really is fast and cheap to develop applications in PHP. Scaling and performance challenges are almost always related to the data layer, and are common across all language platforms. Even as a self-proclaimed PHP evangelist, I was very startled to find out that all of the theories I was subscribing to were true. There is simply no truth to the idea that Java is better than scripting languages at writing scalable web applications. I won't go as far as to say that PHP is better than Java, because it is never that simple. However it just isn't true to say that PHP doesn't scale, and with the rise of Web 2.0, sites like Digg, Flickr, and even Jobby are proving that large scale applications can be rapidly built and maintained on-the-cheap, by one or two developers.

Further Reading



[Mar 25, 2006] AutoHotkey - Free Mouse and Keyboard Macro Program with Hotkeys and AutoText

A kind of Expect for Windows. A nice addition to Windows.

AutoHotkey is a free, open-source utility for Windows. With it, you can:

Getting started might be easier than you think. Check out the quick-start tutorial.

More About Hotkeys

AutoHotkey unleashes the full potential of your keyboard, joystick, and mouse. For example, in addition to the typical Control, Alt, and Shift modifiers, you can use the Windows key and the Capslock key as modifiers. In fact, you can make any key or mouse button act as a modifier. For these and other capabilities, see Advanced Hotkeys.

Other Features

License: GNU General Public License

[Feb 14, 2006] OOP Criticism

Object Oriented Programming Oversold, by B. Jacobs: OOP criticism and OOP problems. The emperor has no clothes! ("Reality Check 101", "Snake OOil")

OOP Myths Debunked:

[Jan 31, 2006] A little anti-anti-hype - O'Reilly Ruby The departure of the hyper-enthusiasts by Bruce Eckel

It's kind of funny to see this negative opinion about Ruby, this "Perl with exceptions, co-routines and OO done right," from an OO evangelist who managed to ride the Java wave, chanting standard mantras ("Java is a great OO language," "OO programming in Java is holy and the best thing on Earth since sliced bread") to the fullest extent possible. All those cries were enthusiastically performed despite the fact that Bruce understands C++ well enough to have felt all the deficiencies of Java from day one.
What the author misses here is that the length of the program (an indirect expression of the level of the language) is an extremely important measure of the success of the language design. And here Ruby beats Python and trashes Java.
BTW, by this measure Java, with all its OO holiness, is a miserable failure in comparison with Ruby or Python, as most Java programs are even more verbose than the equivalent programs in C++.
Actually, if you read Thinking in Java attentively, you realize that Bruce Eckel is more a language-feature-collector type of person (a type very well suited to sitting on language standardization committees) than a software/language architect type.
Also, he by and large depends on the correct choice of the "next hype wave" for the success of his consulting business and might be slightly resentful that Ruby has recently gotten a lot of positive press when he bet on Python. But the problem for consultants like Bruce might be not Python vs. Ruby, but that there might be no "next wave" at all.

December 18, 2005

Ruby is to Perl what C++ was to C. Ruby improves and simplifies the Perl language (the name "Ruby" is even a tribute to Perl), and adds workable OO features (If you've ever tried to use classes or references in Perl, you know what I'm talking about. I have no idea whether Perl 6 will rectify this or not, but I stopped paying attention long ago). But it also seems to carry forward some of the Perl warts. For anyone used to, and tired of, Perl, this certainly seems like a huge improvement, but I'm tired of all impositions by a language upon my thinking process, and so arbitrary naming conventions, reversed syntax and begin-end statements all seem like impediments to me.

But it's hard to argue that Ruby hasn't moved things forward, just like C++ and Java have. It has clearly shaken the Python community up a little; for a long time they were on the high ground of pure, clean language design. ... " Ruby, for example has coroutines, as I learned from Tate's book. The expression of coroutines in Ruby (at least, according to Tate's example) is awkward, but they are there, and I suspect that this may be why coroutines -- albeit in a much more elegant form -- are appearing in Python 2.5. Python's coroutines also allow straightforward continuations, and so we may see continuation servers implemented using Python 2.5.

... ... ...

... the resulting code has 20 times the visual bulk of a simpler approach. One of the basic tenets of the Python language has been that code should be simple and clear to express and to read, and Ruby has followed this idea, although not as far as Python has because of the inherited Perlisms. But for someone who has invested Herculean effort to use EJBs just to baby-sit a database, Rails must seem like the essence of simplicity. The understandable reaction for such a person is that everything they did in Java was a waste of time, and that Ruby is the one true path.

... ... ...

So -- sorry, Jim (Fulton, not Kirk) -- I'm going to find something drop-dead simple to solve my drop-dead simple problems. Probably PHP5, which actually includes most of Java and C++ syntax, amazingly enough, and I wonder if that isn't what made IBM adopt it.

... ... ...

However, I can't see Ruby, or anything other than C#, impacting the direction of the Java language, because of the way things have always happened in the Java world. And I think the direction of C# 3.0 may be too forward-thinking for Java to catch up to.

But here's something interesting. I was on the C++ standards committee from the initial meeting and for about 8 years. When Java burst on the scene with its onslaught of Sun marketing, a number of people on the standards committee told me they were moving over to Java, and stopped coming to meetings. And although some users of Python like Martin Fowler (who, it could be argued, was actually a Smalltalk programmer looking for a substitute, because the Smalltalk relationship never really worked out in the real world) have moved to Ruby, I have not heard of any of the rather significant core of Python language and library developers saying "hey, this Ruby thing really solves a lot of problems we've been having in Python, I'm going over there." Instead, they write PEPs (Python Enhancement Proposals) and morph the language to incorporate the good features.

Dick Ford Re: The departure of the hyper-enthusiasts Posted: Dec 20, 2005 9:05 PM
Reply to this message Reply
Remember, both Tate and Eckel make a living writing and talking about programming technologies. So both have to be looking down the road to when Java goes into COBOL-like legacy status. It takes a lot of investment in time to learn a programming language and its libraries well enough to write and lecture about them. So if Ruby "is the one" to make it into the enterprise eventually and Python never makes the leap, then that's a huge amount of re-tooling that Eckel has to do. It looks like he's trying to protect his Python investment.
Lars Stitz Re: The departure of the hyper-enthusiasts Posted: Dec 22, 2005 6:10 AM
Reply to this message Reply
The "hyper-enthusiasts", as they are euphemistically called by the article, are no more (and no less) than a bunch of consultants who want to advertise their expertise in order to gain more consulting gigs. They are not that outspoken because they know more or are brighter than other developers, but because their blogs and websites like Artima or TheServerSide grant them the benefit of free publicity.

Now, as mainstream development has moved to Javaland for good, these consultants are not needed anymore. Everybody and their dog can write a good Java application that uses decent frameworks and OR mappers and thus performs well in most tasks. So, more enthusiastic propaganda for Java does not pay the bill for these folks anymore -- they have to discover a new niche market where their service is still needed. In this case, what could be better than a language that only few people know yet? Instantly, the consultant's services appear valuable again!

My advice: Don't follow the hype unless you have good reasons to do so. "A cobbler should stick to his last," as we say in Germany. Sure, PHP, Perl, Ruby, and C# all have their place in software development. But they are not going to replace Java -- for now, their benefits over Java are still too small.

Cheers, Lars

Kyrill Alyoshin Re: The departure of the hyper-enthusiasts Posted: Dec 18, 2005 1:00 PM
Reply to this message Reply
Very glad that you touched on the Tate's book. How about "I've never written a single EJB in my life" from an author of "Bitter EJB"?..

I am sensing quite a bit of commercial pressure from Bruce and his camp. They are simply not making enough margin teaching Java anymore. To make that margin, you have to work hard, maybe not as hard as B. Eckel but still really hard: play with intricacies of the language, dig ever deeper and deeper, invest the time to write a book... But that's hard to do, and kayaking is way more interesting.

So, it seems like Ruby has potential, why not throw a book or two at it, run a few $1000 a day courses... If you read "Beyond Java", this is exactly what Jason Hunter says in his interview.

I think mercantilism of "hyper-enthusiasts" is yet to be analyzed.

That said, I am not buying another of Tate's books ever again, no matter how "pragmatic" it is.

Jakub Pawlowicz Re: The departure of the hyper-enthusiasts Posted: Dec 18, 2005 5:01 PM
Reply to this message Reply
I think you are mainly right about reasons for people moving to Ruby, and its influence on Python and Java languages.
But by saying that "Java-on-rails might actually tempt me into creating a web app using Java again," and by comparing development in Ruby to development in EJB 1/2 (or even EJB 3), you are missing the fact that part of the server-side Java community has already moved to lightweight approaches such as the Spring Framework.

From one and a half years of experience working as a Spring web developer, I must admit that server-side Java development can be much simpler with lightweight approaches than it was in the EJB 1/2 times.

Steven E. Newton Re: The departure of the hyper-enthusiasts Posted: Dec 19, 2005 9:37 PM
Reply to this message Reply
> Does anyone have an Open Source Ruby application they can
> point me to besides "ROR" (preferably a desktop app)? I
> wouldn't mind analysing some of its code, if one exists,
> so I can get a better sense of why its going to be a great
> desktop application programming language for me to use.

How about a Ruby/Cocoa application? I'm speaking of the graphical TestRunner for Ruby's Test::Unit I wrote:
It provides an interface similar to jUnit's for running ruby unit tests.

Also check out Rake and RubyGems. Actually any of the top projects on RubyForge are worth digging into.

DougHolton Re: The departure of the hyper-enthusiasts Posted: Dec 20, 2005 12:27 AM
Reply to this message Reply
I highly recommend skimming thru these short resources to get a much more in depth feel for what ruby is like, if you are already familiar with java or python like myself. I did recently and I am finally "getting" ruby much better (the perl-isms turned me off from seriously looking at it earlier, just like it took me a year to get over python's indenting):

Ruby user's guide:
10 things every java programmer should know about ruby:
Coming to ruby from java:

Things I like:
-you can be more expressive in ruby and essentially twist it into different domain-specific languages, see:
-I like how standalone functions essentially become protected extensions of the object class (like C# 3.0 extension methods):
-using "end" instead of curly braces (easier for beginners and more readable)

Things I don't like and never will:
-awkward syntax for some things like symbols and properties
-awful perlisms like $_,$$,$0,$1,?,<<,=begin
-80's style meaningless and over-abbreviated keywords and method names like "def", "to_s", "puts", etc.
-:: (double colon) vs. . (period).

Ruby is definitely better than python, but still not perfect, and still an order of magnitude slower than statically typed languages.
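Two of the "likes" in the post above are easy to check at an irb prompt. A minimal sketch (the method names are invented for illustration; note that top-level methods actually become private instance methods of Object, which is close to, though slightly stricter than, the "protected extensions" the poster describes):

```ruby
# A top-level method definition becomes a private instance method
# of Object, so every object can call it as if it were built in.
def shout(text)
  text.upcase + "!"
end

class Greeter
  def greet(name)
    shout("hello, #{name}")   # calls the top-level method
  end
end

puts Greeter.new.greet("world")              # HELLO, WORLD!
puts Object.private_method_defined?(:shout)  # true

# Blocks make it cheap to twist an ordinary method into a tiny
# configuration DSL, the kind of expressiveness praised above.
def configure
  settings = {}
  yield settings
  settings
end

conf = configure { |s| s[:language] = "ruby" }
puts conf[:language]                         # ruby
```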

Re: The departure of the hyper-enthusiasts Posted: Dec 20, 2005 5:42 PM
Reply to this message Reply
> 'For example, the beautiful little feature where you can
> ask an array for its size as well as for its length
> (beautiful because it doesn't terrorize you into having to
> remember the exact precise syntax; it approximates it,
> which is the way most humans actually work),'
> you've intrigued me, which means I might be one of those
> programmers who lacks the imagination to see the
> difference between an arrays size and its length. :D What
> exactly is the difference?

There is no difference. The terms are synonymous. Ruby, being a common-sense oriented language, allows for synonymous terms without throwing a fit. It accommodates the way humans tend to think.

Java is the exact opposite. It is very stern, very non-commonsense oriented. It will throw a fit if you send the message 'length()' to an ArrayList. Although in the commonsense world, we all know what the meaning of the question: "what is your length?" should be for an ArrayList. Still, Java bureaucratically insists that our question is dead wrong, and that we should be asking it for its 'size()'. Java is absolutely non lenient.

Now, if you ask me, such boneheaded bureaucratic mindset is very dumb, very stupid. This is why anyone who develops in such bureaucratic languages feels their debilitating effects. And that's why switching to Ruby feels like a full-blown liberation!
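The size/length point above is directly verifiable: Ruby's Array and String define size and length as exact aliases, so either spelling works, whereas java.util.ArrayList accepts only size(). A quick check:

```ruby
a = [1, 2, 3]
puts a.size     # 3
puts a.length   # 3 -- an exact alias of size

# The same leniency applies to String:
s = "hello"
puts s.size == s.length   # true
```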

James Watson Re: The departure of the hyper-enthusiasts Posted: Dec 20, 2005 5:56 PM
Reply to this message Reply
> Java is the exact opposite. It is very stern, very
> non-commonsense oriented. It will throw a fit if you send
> the message 'length()' to an ArrayList. Although in the
> commonsense world, we all know what the meaning of the
> question: "what is your length?" should be for an
> ArrayList. Still, Java bureaucratically insists that our
> question is dead wrong, and that we should be asking it
> for its 'size()'. Java is absolutely non lenient.

And then what? You give up? You go and cry? The world explodes? I don't get it. What's the big problem. You try to compile, the compiler says, "sorry, I don't get your meaning" and you correct the mistake. Is that really a soul-crushing experience? And that's in the stone-age when we didn't have IDEs for Java. Now you type '.' a list comes up and you select the appropriate method. Not that difficult.

Re: The departure of the hyper-enthusiasts Posted: Dec 20, 2005 6:19 PM
Reply to this message Reply
How many times has your project enabled you to create reusable components?

Without blackboxes, you will always be starting over and creating as many parts from scratch as needed.

Take Rails, for example. It's a framework built from components. One person was responsible for creating the main components, like HTTP Interface, ORM, general framework, etc. One person only! And the components were so good that people were able to use them with extreme ease (now known as "hype").

How many Java projects could have enjoyed a way to create good components, instead of poor frameworks and libraries that barely work together? I would say most Java projects could enjoy a componentized approach because they generally involve lots of very skilled people and lots of resources. :-)

What's a component compared to a library or a module? A component is code that has a published interface and works like a blackbox -- you don't need to know how it works, only that it works. Even a single object can be a component, as Anders Hejlsberg (of C# and Delphi) put it:

"Anders Hejlsberg: The great thing about the word component is that you can just sling it about, and it sounds great, but we all think about it differently. In the simplest form, by component I just mean a class plus stuff. A component is a self-contained unit of software that isn't just code and data. It is a class that exposes itself through properties, methods, and events. It is a class that has additional attributes associated with it, in the form of metadata or naming patterns or whatever. The attributes provide dynamic additional information about how the component slots into a particular hosting environment, how it persists itself - all these additional things you want to say with metadata. The metadata enables IDEs to intelligently reason about what a component does and show you its documentation. A component wraps all of that up."

So, to me, components are truly the fine-grained units of code reuse. With Ruby, I not only can create my own components in a succinct way, but also can use its Domain Specific Language capabilities to create easy interfaces to use and exercise the components. All this happens in Rails. All this happens in my own libraries. And all this happens in the libraries of people who use Ruby. We are not starting our projects from scratch and hoping for the best. We are enjoying some powerful programmability!
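A hedged sketch of the component idea described above, in Ruby (the class and method names are invented for illustration): a blackbox that exposes only a small published interface and keeps its implementation private.

```ruby
# A toy "component": callers see only the published interface
# (store/fetch); everything else is hidden behind it.
class Cache
  def initialize
    @data = {}
  end

  def store(key, value)   # published interface
    @data[key] = value
    self
  end

  def fetch(key, default = nil)   # published interface
    @data.fetch(key, default)
  end

  private

  def evict_all   # implementation detail, not published
    @data.clear
  end
end

cache = Cache.new
cache.store(:lang, "ruby")
puts cache.fetch(:lang)            # ruby
puts cache.fetch(:missing, "n/a")  # n/a

begin
  cache.evict_all                  # reaching behind the interface fails
rescue NoMethodError
  puts "evict_all is hidden"
end
```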

Alex Bunardzic Re: The departure of the hyper-enthusiasts Posted: Dec 21, 2005 1:57 AM
Reply to this message Reply
> The length/size inconsistency has nothing to do with Java
> and everything to do with poor API design decisions made
> in 1995, probably by some very inexperienced programmer
> who had no idea that Java would become so successful.

This is akin to saying that the Inquisition had nothing to do with the fanaticism of the Catholic church, and everything to do with poor decisions some clergy made at that time. In reality, however, the Inquisition was inspired by the broader climate of the Catholic church fanaticism.

In the same way, poor API design that Java is infested with was/is directly inspired by the bureaucratic nature of the language itself.

Re: The departure of the hyper-enthusiasts Posted: Dec 22, 2005 12:58 AM
Reply to this message Reply
Ruby is good for Python

Because it offers more proof that dynamically typed, loosely coupled languages can be more productive in creating robust solutions than statically typed, stricter languages with deeply nested class hierarchies. Java and C# essentially lead us down the same path for tackling problems. One may be a better version of the other (I like C# more) but the methodology is very similar. In fact the release of C# only validated the Java-style methodology by emulating it (albeit offering a more productive way to follow it).

Enter Python or Ruby, both different from the Java/C# style. Both are producing 'enlightening' experiences in an ever-growing list of seasoned, fairly well known static-style developers (Bruce Eckel, Bruce Tate, Martin Fowler...). As the knowledge spreads, it pokes holes in the strong Java/C# meme in people's minds. Then people start to explore and experiment, and discover the Python (or Ruby) productivity gain. Some may prefer one, some the other. Ruby, in the end, validates the fact that Java/C# style methods may not be the best for everything, something the Python advocates have been saying for quite some time.

(copy from my posting at

Re: The departure of the hyper-enthusiasts Posted: Dec 22, 2005 5:21 AM
Reply to this message Reply
> The person I want to hear from is the core Python expert,
> someone who knows that language incredibly well ...

Ok Bruce(s). I'm not sure I qualify as your "core Python expert". I'm a core Python developer, though:

I've taken you up (well kinda), here are my rambling thoughts.

Notes on Postmodern Programming, CS-TR-02-9. Authors: James Noble, Robert Biddle, Elvis Software Design Research Group. Source: GZipped PostScript (1700kb); Adobe PDF (1798kb)

These notes have the status of letters written to ourselves: we wrote them down because, without doing so, we found ourselves making up new arguments over and over again. When reading what we had written, we were always too satisfied. For one thing, we felt they suffered from a marked silence as to what postmodernism actually is. Yet, we will not try to define postmodernism, first because a complete description of postmodernism in general would be too large for the paper, but secondly (and more importantly) because an understanding of postmodern programming is precisely what we are working towards. Very few programmers tend to see their (sometimes rather general) difficulties as the core of the subject and as a result there is a widely held consensus as to what programming is really about. If these notes prove to be a source of recognition or to give you the appreciation that we have simply written down what you already know about the programmer's trade, some of our goals will have been reached.

[Jan 31, 2006] A little anti-anti-hype - O'Reilly Ruby

Everyone's buzzing about Bruce Eckel's "anti-hype" article. I hope the irony isn't lost on him.

... ... ...

First, inferior languages and technologies are just as likely to win. Maybe even more likely, since it takes less time to get them right. Java beat Smalltalk; C++ beat Objective-C; Perl beat Python; VHS beat Beta; the list goes on. Technologies, especially programming languages, do not win on merit. They win on marketing. Begging for fair, unbiased debate is going to get your language left in the dust.

You can market a language by pumping money into a hype machine, the way Sun and IBM did with Java, or Borland did back with Turbo Pascal. It's pretty effective, but prohibitively expensive for most. More commonly, languages are marketed by a small group of influential writers, and the word-of-mouth hyping extends hierarchically down into the workplace, where a bunch of downtrodden programmers wishing they were having more fun stage a coup and start using a new "forbidden" language on the job. Before long, hiring managers start looking for this new language on resumes, which drives book sales, and the reactor suddenly goes supercritical.

Perl's a good example: how did it beat Python? They were around at more or less the same time. Perl might predate Python by a few years, but not enough for it to matter much. Perl captured roughly ten times as many users as Python, and has kept that lead for a decade. How? Perl's success is the result of Larry Wall's brilliant marketing, combined with the backing of a strong publisher in O'Reilly.

"Programming Perl" was a landmark language book: it was chatty, it made you feel welcome, it was funny, and you felt as if Perl had been around forever when you read it; you were just looking at the latest incarnation. Double marketing points there: Perl was hyped as a trustworthy, mature brand name (like Barnes and Noble showing up overnight and claiming they'd been around since 1897 or whatever), combined with that feeling of being new and special. Larry continued his campaigning for years. Perl's ugly deficiencies and confusing complexities were marketed as charming quirks. Perl surrounded you with slogans, jargon, hip stories, big personalities, and most of all, fun. Perl was marketed as fun.

What about Python? Is Python hip, funny, and fun? Not really. The community is serious, earnest, mature, and professional, but they're about as fun as a bunch of tax collectors.

... ... ...

Pedantry: it's just how things work in the Python world. The status quo is always correct by definition. If you don't like something, you are incorrect. If you want to suggest a change, put in a PEP, Python's equivalent of Java's equally glacial JSR process. The Python FAQ goes to great lengths to rationalize a bunch of broken language features. They're obviously broken if they're frequently asked questions, but rather than 'fessing up and saying "we're planning on fixing this", they rationalize that the rest of the world just isn't thinking about the problem correctly. Every once in a while some broken feature is actually fixed (e.g. lexical scoping), and they say they changed it because people were "confused". Note that Python is never to blame.

In contrast, Matz is possibly Ruby's harshest critic; his presentation "How Ruby Sucks" exposes so many problems with his language that it made my blood run a bit cold. But let's face it: all languages have problems. I much prefer the Ruby crowd's honesty to Python's blaming, hedging and overt rationalization.

As for features, Perl had a very different philosophy from Python: Larry would add in just about any feature anyone asked for. Over time, the Perl language has evolved from a mere kitchen sink into a vast landfill of flotsam and jetsam from other languages. But they never told anyone: "Sorry, you can't do that in Perl." That would have been bad for marketing.

Today, sure, Perl's ugly; it's got generations of cruft, and they've admitted defeat by turning their focus to Perl 6, a complete rewrite. If Perl had started off with a foundation as clean as Ruby's, it wouldn't have had to mutate so horribly to accommodate all its marketing promises, and it'd still be a strong contender today. But now it's finally running out of steam. Larry's magical marketing vapor is wearing off, and people are realizing that Perl's useless toys (references, contexts, typeglobs, ties, etc.) were only fun back when Perl was the fastest way to get things done. In retrospect, the fun part was getting the job done and showing your friends your cool software; only half of Perl's wacky features were helping with that.

So now we have a void. Perl's running out of steam for having too many features; Java's running out of steam for being too bureaucratic. Both are widely beginning to be perceived as offering too much resistance to getting cool software built. This void will be filled by... you guessed it: marketing. Pretty soon everyone (including hiring managers) will see which way the wind is blowing, and one of Malcolm Gladwell's tipping points will happen.

We're in the middle of this tipping-point situation right now. In fact it may have already tipped, with Ruby headed to become the winner, a programming-language force as prominent on resumes and bookshelves as Java is today. This was the entire point of Bruce Tate's book. You can choose to quibble over the details, as Eckel has done, or you can go figure out which language you think is going to be the winner, and get behind marketing it, rather than complaining that other language enthusiasts aren't being fair.

Could Python be the next mega-language? Maybe. It's a pretty good language (not that this really matters much). To succeed, they'd have to get their act together today. Not in a year, or a few months, but today -- and they'd have to realize they're behind already. Ruby's a fine language, sure, but now it has a killer app. Rails has been a huge driving and rallying force behind Ruby adoption. The battleground is the web framework space, and Python's screwing it up badly. There are at least five major Python frameworks that claim to be competing with Rails: Pylons, Django, TurboGears, Zope, and Subway. That's at least three (maybe four) too many. From a marketing perspective, it doesn't actually matter which one is the best, as long as the Python community gets behind one of them and starts hyping it exclusively. If they don't, each one will get 20% of the developers, and none will be able to keep pace with the innovation in Rails.

The current battle may be over web frameworks, but the war is broader than that. Python will have to get serious about marketing, which means finding some influential writers to crank out some hype books in a hurry. Needless to say, they also have to abandon their anti-hype position, or it's a lost cause. Sorry, Bruce. Academic discussions won't get you a million new users. You need faith-based arguments. People have to watch you having fun, and envy you.

My guess is that the Python and Java loyalists will once again miss the forest for the trees. They'll debate my points one by one, and declare victory when they've proven beyond a doubt that I'm mistaken: that marketing doesn't really matter. Or they'll say "gosh, it's not really a war; there's room for all of us", and they'll continue to wonder why the bookshelves at Barnes are filling up with Ruby books.

I won't be paying much attention though, 'cuz Ruby is soooo cool. Did I mention that "quit" exits the shell in Ruby? It does, and so does Ctrl-D. Ruby's da bomb. And Rails? Seriously, you don't know what you're missing. It's awesome. Ruby's dad could totally beat up Python's dad. Check out Why's Poignant Guide if you don't believe me. Ruby's WAY fun -- it's like the only language I want to use these days. It's so easy to learn, too. Not that I'm hyping it or anything. You just can't fake being cool.

[Jan 28, 2006] Draft of the paper "In Praise of Scripting: Real Programming Pragmatism" by Ronald P. Loui, Associate Professor of CSE, Washington University in St. Louis.

This article's main purpose is to review the changes in programming practices known collectively as the "rise of scripting," as predicted in 1998 in IEEE Computer by Ousterhout. It attempts to be both brief and definitive, drawing on many of the essays that have appeared in online forums. The main new idea is that programming language theory needs to move beyond semantics and take language pragmatics more seriously.

... ... ...

Part of the problem is that scripting has risen in the shadow of object-oriented programming and highly publicized corporate battles between Sun, Netscape, and Microsoft with their competing software practices. Scripting has been maturing language by language, and now includes object-oriented scripting languages. Another part of the problem is that scripting is only now mature enough to stand up against its legitimate detractors. Today, there are answers to many of the persistent questions about scripting:

Embed Lua for scriptable apps

The Lua programming language is a small scripting language specifically designed to be embedded in other programs.

Lua's C API allows exceptionally clean and simple code both to call Lua from C, and to call C from Lua.

This allows developers who want a convenient runtime scripting language to easily implement the basic API elements needed by the scripting language, then use Lua code from their applications.

This article introduces the Lua language as a possible tool for simplifying common development tasks, and discusses some of the reasons to embed a scripting language in the first place.
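As a sketch of what the article describes, here is a minimal C host embedding Lua. The function name c_double and the build line are made up for illustration; the API names (luaL_newstate, luaL_openlibs, luaL_dostring, lua_register) are those of Lua 5.1 and later, and the program must be linked against the Lua library.

```c
/* embed.c -- minimal sketch of embedding Lua in a C host.
 * Build (assumption, paths vary):  cc embed.c -llua -lm
 */
#include <stdio.h>
#include <lua.h>
#include <lualib.h>
#include <lauxlib.h>

/* A C function exposed to Lua: doubles its numeric argument. */
static int c_double(lua_State *L)
{
    lua_Number n = luaL_checknumber(L, 1);
    lua_pushnumber(L, 2 * n);
    return 1;               /* one result left on the Lua stack */
}

int main(void)
{
    lua_State *L = luaL_newstate();   /* create an interpreter state */
    luaL_openlibs(L);                 /* load the standard libraries */

    lua_register(L, "c_double", c_double);   /* the C-to-Lua direction */

    /* The Lua-to-C direction: run a chunk that calls back into C. */
    if (luaL_dostring(L, "print('from Lua:', c_double(21))") != 0)
        fprintf(stderr, "Lua error: %s\n", lua_tostring(L, -1));

    lua_close(L);
    return 0;
}
```

The whole round trip, registering a C function and then running a Lua chunk that calls it, is a dozen lines, which is the "exceptionally clean and simple" quality the blurb above is pointing at.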

Hope For Multi-Language Programming

chthonicdaemon is pretty naive and does not understand that the combination of a scripting language with a compiled language like C (or a semi-compiled language like Java) is a more productive environment than almost any other known... You need a common runtime, as on Windows, to make it a smooth approach (IronPython). Scripting helps to avoid the OO trap that is pushed by "a horde of practically illiterate researchers publishing crap papers in junk conferences."
"I have been using Linux as my primary environment for more than ten years. In this time, I have absorbed all the lore surrounding the Unix Way - small programs doing one thing well, communicating via text and all that. I have found the command line a productive environment for doing many of the things I often do, and I find myself writing lots of small scripts that do one thing, then piping them together to do other things. While I was spending the time learning grep, sed, awk, python and many other more esoteric languages, the world moved on to application-based programming, where the paradigm seems to be to add features to one program written in one language. I have traditionally associated this with Windows or MacOS, but it is happening with Linux as well. Environments have little or no support for multi-language projects - you choose a language, open a project and get it done. Recent trends in more targeted build environments like cmake or ant are understandably focusing on automatic dependency generation and cross-platform support, unfortunately making it more difficult to grow a custom build process for a multi-language project organically. All this is a bit painful for me, as I know how much is gained by using a targeted language for a particular problem. Now the question: Should I suck it up and learn to do all my programming in C++/Java/(insert other well-supported, popular language here) and unlearn ten years of philosophy, or is there hope for the multi-language development process?"

[Feb 16, 2006] Introducing Lua by Keith Fieldhouse

What if you could provide a seamlessly integrated, fully dynamic language with a conventional syntax while increasing your application's size by less than 200K on an x86? You can do it with Lua!

There's no reason that web developers should have all the fun. Web 2.0 APIs enable fascinating collaborations between developers and an extended community of developer-users. Extension and configuration APIs added to traditional applications can generate the same benefits.

Of course, extensibility isn't a particularly new idea. Many applications have a plugin framework (think Photoshop) or an extension language (think Emacs).

Lua Basics

Roberto Ierusalimschy of the Pontifical Catholic University of Rio de Janeiro in Brazil leads the development of Lua. The most recent version (5.0.2; version 5.1 should be out soon) is made available under the MIT license. Lua is written in 99 percent ANSI C. Its main design goals are to be compact, efficient, and easy to integrate with other C/C++ programs. Game developers (such as World of Warcraft developer Blizzard Entertainment) are increasingly using Lua as an extension and configuration language.

Virtually anyone with any kind of programming experience should find Lua's syntax concise and easy to read. Two dashes introduce comments. An end statement delimits control structures (if, for, while). All variables are global unless explicitly declared local. Lua's fundamental data types include numbers (typically represented as double-precision floating-point values), strings, and Booleans. Lua has true and false as keywords; any expression that does not evaluate to nil or false counts as true. Note that 0 and arithmetic expressions that evaluate to 0 do not evaluate to nil, so Lua treats them as true in a conditional statement.
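A few lines of Lua make the truth-value rule concrete (a sketch; any Lua 5.x interpreter should behave this way):

```lua
-- Only nil and false are false in a condition; 0 and the
-- empty string are both true, unlike in C or Perl.
if 0 then
  print("0 is true in Lua")
end

local s = ""
if s then
  print("the empty string is true as well")
end

local unset            -- declared but never assigned, so its value is nil
if not unset then
  print("nil is false")
end
```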

Finally, Lua supports userdata as one of its fundamental data types. By definition, a userdata value can hold an ANSI C pointer and thus is useful for passing data references back and forth across the C-Lua boundary.

Despite the small size of the Lua interpreter, the language itself is quite rich. Lua uses subtle but powerful forms of syntactic sugar to allow the language to be used in a natural way in a variety of problem domains, without adding complexity (or size) to the underlying virtual machine. The carefully chosen sugar results in very clean-looking Lua programs that effectively convey the nature of the problem being solved.

The only built-in data structure in Lua is the table. Perl programmers will recognize this as a hash; Python programmers will no doubt see a dictionary. Here are some examples of table usage in Lua:

a      = {}       -- Initializes an empty table
a[1]   = "Fred"   -- Assigns "Fred" to the entry indexed by the number 1
a["1"] = 7        -- Assigns the number 7 to the entry indexed by the string "1"

Any Lua data type can serve as a table index, making tables a very powerful construct in and of themselves. Lua extends the capabilities of the table by providing different syntactic styles for referencing table data. The standard table constructor looks like this:

t = { Name = "Keith", Address = "Ballston Lake, New York" }

A table constructor written like

t2 = { "First", "Second","Third"}

is the equivalent of

t3 = { [1] = "First", [2] = "Second", [3] = "Third" }

This last form essentially initializes a table that for all practical purposes behaves as an array. Arrays created in this way have the integer 1 as their first index, rather than 0 as in most other languages.

The following two forms of accessing the table are equivalent when the table keys are strings:

t["Name"] = "Keith"
t.Name    = "Keith"

Tables behave like a standard struct or record when accessed in this fashion.
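Putting the pieces together, a short sketch (the variable names are invented) shows one table used record-style and another used as an array; ipairs walks the integer keys starting at 1:

```lua
-- One table as a record...
local person = { Name = "Keith", Address = "Ballston Lake, New York" }
print(person.Name)         -- struct-style access
print(person["Name"])      -- equivalent string-key access

-- ...and another as an array.
local list = { "First", "Second", "Third" }
for i, v in ipairs(list) do   -- indices run 1, 2, 3
  print(i, v)
end
```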

SymbianOne Feature

Simkin started life in 1995. At that time Lateral Arts' Simon Whiteside was involved in the development of "Animals of Farthing Wood", an adventure game being produced by the BBC. Simon was asked to produce the game code. "When I started the project it became clear that while the game designers had clear objectives for what they were trying to achieve, much of the detail of the game was not defined," says Simon. "Faced with the prospect of rewriting sections of the game, which was written in C running on Windows 3.0, as the design progressed, I realized this was going to be time-consuming, so I looked for alternative solutions." Simon's initial solution was to allow the game to be manipulated using configuration files, but as time progressed the need for an expression evaluator was identified; later, loops were added to give greater control and flexibility, and so the scripting language emerged.

From the Farthing Wood project Simon took this technology with him to a project for Sibelius, the best-selling music notation application. The developers of Sibelius wanted to add a macro capability similar to the facilities available in a word processor. Simon created this feature using Simkin to provide the Sibelius plug-in.

When Simon left Sibelius in 1997 he decided to make Simkin available as a product, and after productizing it he spent about six months working on licensing it. In that period he sold a couple of licenses but eventually realized that his core interest was in bespoke application development. Rather than let the product die, Simon decided to release it as an open source project, and in 1999 it was released through SourceForge. "Simkin certainly gained interest as an open source product," says Simon. "I received a lot of feedback and several bug fixes, so I was happy that open source was the right way to go with Simkin."

Since Simon open-sourced Simkin he has developed Java and XML versions, as well as a pilot J2ME version.

The Symbian version started with an inquiry from Hewlett-Packard in early 2002. Hewlett-Packard Research Laboratories, Europe were running the Bristol Wearable Computing Project in partnership with Bristol University. The project looked at various applications for wearable computing devices, from games to guides. One application provided a guide to the works in the city art gallery, fed with information from wireless access points that had been set up around Bristol. As part of the project HP wanted to build an interactive game to run on the HP iPAQ. To give the game a simple customization mechanism, they approached Simon to port Simkin to the iPAQ and so provide the ability to use XML schemas to describe elements of the game.

"Once we had done that HP wanted to extend the project to use phones," says Simon. "They had identified Symbian OS phones as the emerging technology in this arena and they asked me to do a port." Through contacts Simon approached Symbian, who provided comprehensive support in porting Simkin. However, HP did not proceed with the use of Symbian phones in the wearables project, although Simon notes that there has been a fair amount of interest from Symbian developers since the port was made available through SourceForge.

In porting to Symbian, Simon wanted to retain source code compatibility with the other versions of Simkin. "Maintaining compatibility created two main challenges, due to the fact that Symbian C++ does not include the ability to process C++ exceptions and you cannot use the Symbian leave process in a C++ constructor," says Simon. "I managed to overcome most of these problems by using C++ macros, part of which I had started for the HP port, as Windows CE also lacked support for exceptions. For the most part this approach worked, but there were still some places that needed particular code for Symbian."

Simkin is not a language that can be used to develop applications from scratch. As Simon describes it: "Simkin is a language that can be used to configure application behavior; I call it an embeddable scripting language. It bolts onto an application to allow a script to make the final decisions about the application's behavior, or to allow users to control aspects of what the application does, but the real functionality is still in the host application." Simon believes Simkin is well suited to games, where performance is an issue, as the intrinsic game functions can be developed in C or C++ and then controlled by the lightweight Simkin. "Using a conventional scripting language would simply not be possible for that type of application," says Simon.


Recommended Links

Softpanorama hot topic of the month

Softpanorama Recommended

Top articles


Classic Papers Icon Rebol S-Lang History Humor Etc.



Major sites and directories


Python Programming Language

Ruby Home Page

Dave Winer's Scripting News Weblog

OOP Oversold


Icon is a classic language from the creator of SNOBOL. It introduced many interesting constructs (generators); moreover, Icon's constructs were done right, unlike similar attempts in Perl and Python.

The Icon Programming Language -- the main page

FAQ - Icon Programming Language


Random Findings

A remark about the danger of mixing arbitrary languages in large projects (in this case Java and Perl). It is true that Perl and Java are an odd couple for a large project (everybody probably would have been better off using Jython for this particular project ;-)

Scripting In Large Projects (Score:2)
by Jboy_24 (88864) on Monday February 24, @04:02PM (#5373095)

is like a bacterial infestation. I worked on a large Perl-based ecommerce project and a large Java-based ecommerce project. In the end, to ensure quality code we had to make 100% sure that use strict was used, and we had to forbid many things Perl programmers pride themselves on in order to get 8 developers to stop duplicating work and stepping on each other's code, and to make our code malleable to changes in specs.

In the Java project it was sooo much easier. Sure, it took a little longer to start up (creating the Beans, the database layer, etc.), but once we were going everyone used the code we created, and adding features and dealing with changing specs was SOO much easier.

Now comes the point of the title: we were on a tight deadline, so the bosses got a team from another part of the company to write a PDF generator. That piece came in Perl. The piece was written by good, skilled programmers, but dealing with different error log locations, creating processes for the Perl interpreter to live in, etc., was a nightmare. If we had paid the $$ for a third-party Java PDF writer, or developed our own, we could have saved a good 2-3 man-months of work. I learned pretty quickly, as the only 'Perl' guy on the Java side of the project: you should NEVER, EVER mix languages in a project.

Scripting languages are fine for small one- or two-page CGI programs, but unless you can crack a whip and get the programmers to fall in line, you'd better let the language and environment do that.

btw, J2EE is frustrating to script programmers because it was DESIGNED to be. But if you were ever in charge of divvying out tasks in a large project you'll realize how J2EE was designed for you.


FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.

ABUSE: IPs or network segments from which we detect a stream of probes might be blocked for no less than 90 days. Multiple types of probes increase this period.



Copyright © 1996-2016 by Dr. Nikolai Bezroukov. The site was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.

The site uses AdSense, so you need to be aware of the Google privacy policy. If you do not want to be tracked by Google, please disable Javascript for this site. This site is perfectly usable without Javascript.

Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.


This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops, like a living tree...

You can use PayPal to make a contribution, supporting development of this site and speeding up access.


The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author's present and former employers, SDNP or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.

Last modified: September 18, 2017