Revolt against overcomplexity -- "back to basics" movement in programming and Unix system administration

Countermovement for the preservation of Unix philosophy and command line methods of managing the system.
See KISS Principle and The tar pit of Red Hat overcomplexity for more information


"Those who don't understand UNIX are doomed to reinvent it, poorly." --Henry Spencer

software bloat

   <jargon, abuse> The result of adding new features to a program or system to the point where the benefit of the new features is outweighed by the extra resources consumed (RAM, disk space or performance) and complexity of use.

Software bloat is an instance of Parkinson's Law: resource requirements expand to consume the resources available. Causes of software bloat include second-system effect and creeping featuritis.

Commonly cited examples include Unix's "ls(1)" command, the X Window System, BSD, Missed'em-five, OS/2 and any Microsoft product.

creeping featurism,

with its own spoonerization: `feeping creaturitis'. Some people like to reserve this form for the disease as it actually manifests in software or hardware, as opposed to the lurking general tendency in designers' minds. (After all, -ism means `condition' or `pursuit of', whereas -itis usually means `inflammation of'.)

The original Jargon file


Introduction

The back-to-basics movement initially emerged as an educational movement stressing the necessity of a return of American public schools to a fundamental core curriculum based on English, mathematics, science and history, as well as the elimination of so-called educational "frills," such as home economics and other personal improvement courses. Launched in the early 1970s, its roots lie in the early 20th-century "essentialism" movement.

Advocates of the back-to-basics movement in education would reinstitute strict classroom discipline and give primary and secondary school students little or no choice over what courses they study. Although there are some differences regarding which courses constitute the required basics, they generally include those that provide "classic" academic skills, such as algebra and geometry in math, as well as separate courses in chemistry and physics. Proponents condemn "cafeteria-style" curricula that offer students a wide choice of personal-improvement courses and a wide selection of "soft" academic courses such as media studies, instead of fundamental courses in history, including the history of ancient Greece and the Roman empire (I would add to this the history of World War II).

"Back-to-basics" movement also exists in IT . Elements of this revolt are visible in several domains:

Although the "back to basics" movement has been the subject of comments by several IT journalists and writers (especially in the context of the systemd controversy), little systematic attention has been given to its study as an important social protest movement within modern IT.

Back to basics movement in software

In software development, proponents of the "back to basics" movement claim that excessive software complexity, which is the hallmark of the Windows environment and is now creeping into the Linux environment, is counterproductive. For Linux the side effect is that it destroys the "Unix philosophy" and the Unix Component Model, which is based on the integration via the shell of small, well defined utilities (some off the shelf, some written specifically for the task at hand). See for example the discussion of problems with Unix configuration management systems in Unix Configuration Management Tools.
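
To illustrate the component model, here is a minimal sketch (the log file path is hypothetical) that chains several classic off-the-shelf utilities into a one-line report generator:

    # Report the ten most frequent client IPs in a web server log.
    # awk extracts the first field, sort groups identical addresses,
    # uniq -c counts them, sort -rn ranks by count, head trims the output.
    # None of these small tools knows anything about web logs.
    awk '{print $1}' /var/log/httpd/access_log | sort | uniq -c | sort -rn | head -10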

The success of Golang has shown that simpler languages without too many OO frills can compete with complex monsters like Python ("Perl for housewives" ;-) even in such a complex environment as modern Linux. This is especially true for software developers (Why Golang is so Popular Among Developers - GeeksforGeeks):

Golang, also known as "Go," is a compiled, fast and high-performance language intended to be simple and designed to be easy to read and understand. Go was created at Google by Rob Pike, Robert Griesemer, and Ken Thompson, and it first appeared in Nov 2009. The syntax of Golang is designed to be highly clean and accessible.

Several large development organizations recently switched from Python to Golang (Salesforce, Stream). Docker was written in Golang. The same is true for a large part of Kubernetes.

The way coroutines are used in the Unix conceptual model simplifies the development and debugging of complex systems that can be conceptually decomposed into several stages. In this case the stages can be implemented as filters connected with pipes. For debugging, pipes can be replaced with stored files, which increases both the visibility and the convenience of debugging and streamlines the development.
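
A sketch of this debugging technique, with hypothetical filter names:

    # Production form: three conceptual stages connected by pipes.
    ./extract.sh < input.txt | ./transform.sh | ./load.sh > result.txt

    # Debugging form: each pipe is replaced with a stored file, so every
    # intermediate result can be inspected (and diffed) independently.
    ./extract.sh   < input.txt  > stage1.out
    ./transform.sh < stage1.out > stage2.out
    ./load.sh      < stage2.out > result.txt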

The situation is not black and white, though. The complexity of the environment (for example, of a modern operating system) means that complex non-orthogonal programming languages such as Perl and Python have their place and for some tasks are a reasonable compromise. That's why previously Perl and now Python enjoy considerable popularity (although it still does not match the popularity of C in the TIOBE Index for May 2021 ;-)

The LAMP stack emerged as the dominant way to develop web applications, and at its core it has nothing to do with the Unix component model. Similarly, Kubernetes emerged as a reasonable way to create a fault tolerant environment for running applications ("high availability cluster"), and despite its tremendous complexity it enjoys considerable success in this role. Still, despite all the limitations, the Unix component model continues to provide a viable environment for Software Prototyping.

Proponents of "back to basic" movement does not deny the necessity of creating complex software application.  Complexity of software after all reflects the complexity of the environment in which it operates. what is important is to avoid "excesses" such as overgeneralization and attempt to kill two birds with one stone.  Adding components that contradict the guising principle of the original architecture (such as replacing initd with systemd in Linux) does not help and eventually may lead to problems although with enough trusts pigs can fly. And the results can be very disappointing:

Patrick Armstrong says: June 1, 2021 at 7:32 pm

Can the MIC make anything other than cost over-runs these days?

d74 says: June 1, 2021 at 11:38 pm

The answer is too easy: no. Not only are the costs insane, but the functionality is insufficient. Simply put, it doesn’t work or seem unfit for fighting. Stacking technologies is a dream that does not stand up to warfare realities. ‘Keep it simple’ seems out of reach.

I followed the adoption of the 120mm mortar by USMC. They started with a good weapon, with confirmed potential. The end point was tactical paralysis. This is (was) a very small issue, and an old one. It is significant.

The claim that managing complexity is the second most important responsibility of software developers (after reliability), and that it is intrinsically connected with the maintainability of the software system, originated in US aerospace engineering in the form of the so-called KISS Principle ("Keep It Simple, Stupid").

The word "stupid" in this acronym means that the equipment/software produced needs to be serviced at the field by people far less sophisticated than the designers of the equipment/software and if they can't do it that limits its usefulness. For military aircraft it determined it efficiency at war and the same or higher importance as the metrics that position it against enemy aircraft flying characteristics.  The same is true for tanks which was clearly demonstrated with German Tigers and Panther tanks, where low reliability hampered their  battlefield efficiency despite clear technical superiority over Russian T34 and allied tanks.  So for software systems that need to be maintained locally interests of maintenance personnel need to be accommodated during  the design and if possible to the fullest degree. Any such large software package that is in production for a number of years needs to be maintained and adapted to the new generations of hardware. That's given.

In this sense I think that the concept of Defensive programming should be viewed as a part of the "back to basics" movement in software. It puts stress on checking all possible error conditions and especially on generating meaningful diagnostic messages (areas by and large neglected in modern software development in its eternal search for expanded functionality and new features).
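
A minimal sketch of this style in a shell script (file paths and names are hypothetical): every failure mode is checked, and the diagnostic names both the object and the operation that failed:

    #!/bin/bash
    # Defensive sketch: check every error condition, fail loudly and early.
    set -u   # make references to unset variables fatal

    die() { echo "$(basename "$0"): FATAL: $*" >&2; exit 1; }

    CONFIG=/etc/myapp.conf          # hypothetical config file
    BACKUP_DIR=/var/backups/myapp   # hypothetical backup directory

    [ -r "$CONFIG" ]     || die "config file $CONFIG is missing or unreadable"
    [ -d "$BACKUP_DIR" ] || die "backup directory $BACKUP_DIR does not exist"

    cp "$CONFIG" "$BACKUP_DIR/myapp.conf.$(date +%F)" \
        || die "backup of $CONFIG to $BACKUP_DIR failed"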

From this perspective, the proponents of the "back to basics" movement object to the object-oriented "lasagna" programming style that is a typical side effect of abusing the OO programming paradigm. Excessive attempts to make classes reusable and generic inevitably lead to way too many layers of classes, and to code that can be hard to understand, extremely slow, and hard to maintain, completely defeating the original goal of introducing OO (see Object-Oriented Cult: A Slightly Skeptical View on the Object-Oriented Programming). The sheer horror when you need to port or renovate such software and look at the impenetrable maze of classes and modules, some of which are no longer supported, is well known to anybody who has participated in such projects. Often the only way to deal with this problem is to view the old system as a prototype and re-implement it from scratch, discarding the old codebase.

Jamie Zawinski's crusade against C++ overcomplexity

NOTE: based on Jamie Zawinski - Wikipedia

Jamie Zawinski is a well known programmer who used to work at Netscape. He was a founder of Mozilla.org, personally registering its domain name. Most of his projects are written in Perl and C. While still working for Netscape, Zawinski was known for his dislike of C++. There have been reports about him expressing his anger by throwing a chair across a conference room. In his post-Netscape life, he continued to proselytize against C++. In Peter Seibel's book "Coders at Work: Reflections on the Craft of Programming", Zawinski calls C++ an abomination. Furthermore, he believes C++ to be responsible for bloat and compatibility problems in Netscape 4.0, because when programming in C++ all project members have to agree on a subset and "no one can ever agree on which ten percent of the language is safe to use". According to Zawinski, his dislike of C++ stems from the fact that the language is too complex:

When you’re programming C++ no one can ever agree on which ten percent of the language is safe to use. There’s going to be one guy who decides, “I have to use templates.” And then you discover that there are no two compilers that implement templates the same way.

Zawinski also criticizes several language and library deficiencies he encountered while programming in Java, specifically the overhead of certain classes, but also the lack of features such as C-like assertions and typedefs. Ultimately Zawinski returned to programming in C, "since it's still the only way to ship portable programs."

Zawinski's Law of Software Envelopment (also known as Zawinski's Law) relates the pressure of popularity to the phenomenon of software bloat:

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

Examples of the law in action include Emacs, MATLAB, Mozilla and Opera. It may have been inspired by the humorous Law of Software Development and Envelopment at MIT, which was posted on Usenet in 1989 by Greg Kuperberg, who wrote: "Every program in development at MIT expands until it can read mail."

Back to basics movement in system administration

The back to basics movement in system administration emerged in the Unix world as a reaction to excessive complexity and the "Windows-style" bastardization of Linux, such as the introduction of systemd in RHEL7 (see The tar pit of Red Hat overcomplexity). It has a stronger footing in this environment, as the classic style of performing sysadmin tasks is still competitive in many areas and did not lose its appeal. As Brian Kernighan noted (Nov 01, 2008 | IEEE Software, pp.18-19):

As I write this column, I'm in the middle of two summer projects; with luck, they'll both be finished by the time you read it.

... ... ...

There has surely been much progress in tools over the 25 years that IEEE Software has been around, and I wouldn't want to go back in time. But the tools I use today are mostly the same old ones -- grep, diff, sort, awk, and friends. This might well mean that I'm a dinosaur stuck in the past. On the other hand, when it comes to doing simple things quickly, I can often have the job done while experts are still waiting for their IDE to start up. Sometimes the old ways are best, and they're certainly worth knowing well.

With the introduction of systemd, major Linux distributions such as RHEL and Debian by and large betrayed the Unix philosophy in favor of getting more "mindshare" (and profits for Red Hat) -- a decision which resulted in overcomplexity and in badly engineered subsystems that are "foreign" in spirit.

Unix pioneered the usage of text files for system administration. In classic Unix all configuration files and all logs are text files that can be manipulated by text processing utilities. That was a great idea and still is. Deviation from the idea that configuration files should be text files is, for Unix, a variation of the classic theme "the road to hell is paved with good intentions" -- the intent might be noble but the results are not. Systemd's complexity and vulnerabilities are one interesting example of Linux's movement in this false direction.
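
The practical payoff is that one small set of text tools covers everything. A hedged sketch (paths vary by distribution; the auth log is /var/log/secure on RHEL and /var/log/auth.log on Debian):

    # Plain text configuration and logs yield to the standard filters:
    grep -vE '^(#|$)' /etc/ssh/sshd_config      # show only effective settings
    grep 'Failed password' /var/log/secure | awk '{print $(NF-3)}' | sort | uniq -c

    # systemd's binary journal, by contrast, requires its own dedicated reader:
    journalctl -u sshd --since today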

As long as you stay with text files for configuration, you can claim to stay within the realm of classic Unix administration. But, of course, there is more -- classic Unix administration can be performed using the command line, although a GUI can provide, and does provide, help in certain circumstances (few people now use a plain text editor for developing programs -- GUI-based editors are clearly preferable). BTW, a GUI is possible in text mode too, and in MS DOS this was a pretty common mode of operation, which outlived MS DOS. The popularity of orthodox file managers (such as Midnight Commander) is a vivid demonstration of the fact that the idea of such an interface, and specifically the idea of generating command line strings using GUI tools (the idea of the orthodox interface), is viable.

Overreliance on GUI tools leads to the "click-click-click" style of administration and is anathema to professional Unix system administrators.

In the beginning ... was command line

The Unix shell environment was a unique development, revolutionary at the time of Unix's creation. Unix introduced the concept of pipes into shell scripting, and even 50 years after the event it remains a tremendous achievement which forever changed the way the command line is used. Also, the shell interpreter in Unix was not part of the kernel, as in other OSes of that time, but a separate program, which created the opportunity to develop multiple shells and their enhancements independently of the development of the kernel. And it solidified the view of the shell as a member of a special class of programming languages -- a scripting language, or VHL (very high level) language. See Scripting Languages as a Step in Evolution of Very high Level Languages.

The Unix command line provides ample opportunities for creativity for qualified users and sysadmins, and they continue to use it alongside, and often instead of, the more complex "click-click-click" GUI interfaces that are more attractive to novices. The power of the command line is enhanced by modern shells such as ksh93, bash and zsh, which exceed their predecessors in functionality.
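
Two examples of such enhancements, given as a sketch (the host names are hypothetical; associative arrays require bash 4 or later):

    # Process substitution: diff the sorted package lists of two servers
    # without creating temporary files.
    diff <(ssh server1 rpm -qa | sort) <(ssh server2 rpm -qa | sort)

    # Associative arrays: count the login shells in use on this system.
    declare -A count
    while IFS=: read -r user _ _ _ _ _ shell; do
        count[$shell]=$(( ${count[$shell]:-0} + 1 ))
    done < /etc/passwd
    for shell in "${!count[@]}"; do echo "${count[$shell]} $shell"; done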

One early manifesto of the resilience of adherents of command line tools among power users and system administrators was the essay "In the Beginning... Was the Command Line" by Neal Stephenson, which was originally published online in 1999 and later made available in book form (November 1999, ISBN 978-0380815937). A recurring theme of this essay is that the full power of the command line exceeds the power of GUI interfaces. An open question remains whether the command line's shortcomings are compensated by the fact that a GUI is much easier to learn. Probably this is true for accidental users. But for users who use this functionality on a daily basis (system administrators) the advantages of the command line are difficult to beat. It makes sense for them to learn the complexities of the Unix command line and classic utilities and get in return more power and flexibility in their daily operations.

Another early contribution to the investigation of this theme was my essay GUI vs. Command line interface. It was written in the context of the enduring popularity of Orthodox File Managers. The latter probably represent one of the oldest surviving interfaces. In the page Orthodox File Managers I stated:

In a world obsessed with fancy GUI widgets, and where the look-and-feel of OSes and applications changes every three to five years, it's refreshing to see a minimalist interface that has kept the same look and feel for more than 30 years. And there are users of this product with more than 25 years of experience (I am one of them; I have been using it since 1989 ;-)

Anybody involved in IT knows all too well that a quarter of a century in software is equal to eternity. Among system and application programs there are very few survivors which in some form preserved the world of unique 1980s-style character based interfaces. Among them we can mention vi, THE editor, and a couple of other programs.

Several programs belonging to this type are descendants of Norton Commander, a file manager first released in 1986 by Norton Computing (in 1990 Norton Computing became part of Symantec). But not only file managers can have this type of interface. There is a distinct, but very similar trend in editors such as vi and THE, terminal multiplexers (GNU screen), and minimalist window managers (ratpoison). We can talk about the orthodox interface as a distinct type of interface, different in concept from the traditional GUI interface used in Microsoft Windows and Apple operating systems, and simultaneously different from (and richer than) the plain vanilla command line interface. See my article Less is More: A rich functionality behind Spartan interface of Orthodox File Managers for more information on the topic.

Orthodox file managers survived because, behind their Spartan appearance, they provide a very flexible interface as well as far richer functionality than the alternatives (and while it's just accidental that one of the popular OFMs is called FAR, we can claim that it was God's hand which guided the author to choose this particular name :-). In a way, OFMs extend the traditional Unix shell functionality in a new way, creating a hybrid of shell and file manager -- a graphical shell, if you wish.

In Linux administration, the usage of orthodox file managers such as Midnight Commander can be viewed as part of the "back to basics" movement -- the protest against the overcomplexity and bloat that have dominated RHEL since RHEL 7, as well as other Linux distributions, where sysadmins are pushed into Windows-style management of the system using "click-click-click" GUI interfaces.

Another attraction is that, due to the stability of their interface, they belong to the unique class of programs usually called "learn once, use forever." That includes the ability to jump from one OFM to another with minimal pain. And they have an unmatched, completely unique portability in the world of idiosyncratic file managers (there is probably no platform for which at least one OFM does not exist; they are available on smartphones too :-). While they originated in DOS and are still more widely used in the Windows world, OFMs really belong to Unix, sharing with Unix the simplicity of design that hides extremely rich functionality, the elegance of the key idea (the idea of a graphical shell, no more no less) and the prominent role that shells (such as ksh and bash) play in this environment (in an OFM the shell is exposed via the command line, as well as in the user menu and the extension menu).

Recently Microsoft caught up in the shell area with the introduction of PowerShell, but the Windows world still does not have the shell culture that exists in the Unix world and used to exist in the DOS world. That's probably why Midnight Commander has the best implementation of the user menu and extension menu among all prominent OFMs.

Now let's return to the article GUI vs Command line interface. It is clear that the ability to compose sequences of commands using a command line interface can, to a certain extent, be combined with a GUI, as was done in orthodox file managers, VIM, and several system programs such as the AIX System Management Interface Tool (SMIT). Here is what I wrote in this respect:

A GUI, with its ability to display text using different fonts as well as to display graphics, provides more capabilities than a restrictive character-based interface. Nevertheless the character interface is really important and considered classic from another point of view: it is more programmable and more powerful than any GUI can ever be. Also, we need to distinguish the idea of a GUI interface from its most common implementation -- the mouse-based check-drag-drop type of interface that is most often associated with this term. It is just one possible implementation of a GUI interface. Other implementations are possible, as will be discussed below.

While impressive for simple tasks like browsing and media consumption, mouse-based check-drag-drop interfaces are simply not efficient for more complex and repetitive tasks. In such cases they involve lots of tedious repetitive clicking, dragging and dropping. In other words, they are not programmable. They are almost optimal when you need to copy a single file, or a selection of files, from one directory to another, but as operations become more complex they quickly lose their luster.

At the same time the GUI interface has an important advantage: it provides much better "situational awareness" than a pure (say, Unix shell style) command line interface. Many classes of terrible errors committed by system administrators are related to what is called "loss of situational awareness" -- the ability to identify, process, and comprehend the critical elements of information about what is happening. Sysadmins should be alert to any clues which might indicate that they have lost situational awareness, and it is easier to lose it when working with a command line interface than with a GUI. Performing an operation in the wrong directory, deleting the wrong files (or, God forbid, a system directory like /etc in Unix), or making configuration changes on the wrong server are classic examples of loss of situational awareness. Many more can be listed...


But there is also another fundamental problem with any rich, visually attractive GUI interface, other than its inability to accommodate complex tasks like moving files from different directories in one operation. A primitive character based interface with fixed width fonts (often called a "console interface") has an interesting property, common to all minimalist interfaces: it stimulates creativity. Artists know well the saying "form liberates". In the same way, the severe restrictions of a puritan character interface liberate the programmer from spending too much time and effort on unimportant things ("cosmetics") and thus provide the possibility to spend most of the time implementing a richer set of operations, more complex capabilities (regular expressions, etc.), or both.

Most CLIs allow you to chain commands together (in Unix, via pipes). Chaining is a powerful capability that is missing in the check-drag-drop type of interface; it allows a system administrator to extend the functionality of commands, scripts and entire applications in ways never intended by their designers. There are whole categories of tasks that are child's play with the proper tool chain in a CLI but would be tedious (if not impossible) in a GUI.
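
One hedged example of such a tool chain: none of the tools below was designed for this task, yet together they answer a question that would take endless clicking in a GUI file manager:

    # Find the ten largest files under /home modified in the last week
    # (GNU find assumed), with sizes converted to megabytes.
    find /home -type f -mtime -7 -printf '%s\t%p\n' 2>/dev/null |
        sort -rn | head -10 |
        awk -F'\t' '{printf "%8.1f MB  %s\n", $1/1024/1024, $2}'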

Still, there is no question that the CLI interface is weaker than a GUI in providing situational awareness about the Unix filesystem. The constant typing of ls commands by sysadmins who work with the command line is a powerful confirmation of this fact ;-). While it is possible to improve this situation (see below), it has by and large remained the same since the 70s of the previous century, when filesystems were much smaller.
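
One common partial remedy, given here as a sketch to adapt rather than a recipe, is to put the critical context into the shell prompt itself:

    # Show user, host and current directory in the prompt; color root's
    # prompt red so that "wrong server / wrong directory / wrong user"
    # mistakes become harder to make.
    if [ "$(id -u)" -eq 0 ]; then
        PS1='\[\e[1;31m\]\u@\h:\w#\[\e[0m\] '
    else
        PS1='\u@\h:\w\$ '
    fi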

Case studies

There are several instances in which we can observe the creation of groups of power users and/or system administrators who adhere to the "back to basics" movement principles. Among them:

Devuan and "back to basics" movement

In 2014, a strong "back to basics" movement emerged among Debian users and developers after the introduction of systemd, which resulted in the creation of a fork called Devuan (UNIX greybeards threaten Debian fork over systemd plan • The Register):

A group of “Veteran Unix Admins” reckons too much input from GNOME devs is dumbing down Debian, and in response, is floating the idea of a fork.

As the rebel greybeards put it, “... current leadership of the project is heavily influenced by GNOME developers and too much inclined to consider desktop needs as crucial to the project, despite the fact that the majority of Debian users are tech-savvy system administrators.”

The anonymous rebels say: "Some of us are upstream developers, some professional sysadmins: we are all concerned peers interacting with Debian and derivatives on a daily basis." Their beef is that "We don't want to be forced to use systemd in substitution to the traditional UNIX sysvinit init, because systemd betrays the UNIX philosophy."

“Debian today is haunted by the tendency to betray its own mandate, a base principle of the Free Software movement: put the user's rights first,” they write at debianfork.org. “What is happening now instead is that through a so called 'do-ocracy' developers and package maintainers are imposing their choices on users.”

Users of Midnight Commander and other orthodox file managers

An interesting subculture within the "back to basics" movement in system administration is represented by users of orthodox file managers, such as Midnight Commander.

They try to adapt and enhance the capabilities of the Unix command line. In this sense Midnight Commander and similar systems can be viewed as a graphical shell.

The community of users of the command line version of the vim editor

Another community of users who defy the lure of modern GUI interfaces is the community of VIM users. They view VIM as the quintessential Unix text editor. The vi command system isn't just popular; it's also a POSIX standard. It's an application every seasoned sysadmin knows, even if they don't intend to use it on an everyday basis. It's also a fast and simple editor, so once you get good at it, many users decide that this is the editor they have long been searching for.

Vim belongs to the class of so-called orthodox editors and contains a sophisticated internal system of commands and an internal command line (which can also be used for the execution of programs and shell scripts by prefixing them with '!'). So this class of users naturally gravitates toward "back to basics" views on how to perform system administration tasks.
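
A small illustration of this shell integration (the file name is hypothetical):

    # Inside vim, ':%!sort -u' pipes the whole buffer through an external
    # filter; the same mechanism can be driven from the shell prompt:
    vim -c '%!sort -u' -c 'wq' /tmp/hosts.txt

    # Related forms: ':r !date' reads a command's output into the buffer,
    # and ':w !wc -l' feeds the buffer to a command without saving the file.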

Mike Gancarz's book The UNIX Philosophy as the manifesto of the "back to basics" movement in IT

In 1994, Mike Gancarz (a member of the team that designed the X Window System) described the lessons he drew from his own experience with Unix, as well as from discussions with fellow programmers, in his book The UNIX Philosophy (the second edition has a different title, Linux and the Unix Philosophy). It sums up the Unix philosophy in nine simple precepts:

  1. Small is beautiful.
  2. Make each program do one thing well.
  3. Build a prototype as soon as possible.
  4. Choose portability over efficiency.
  5. Store data in flat text files.
  6. Use software leverage to your advantage.
  7. Use shell scripts to increase leverage and portability.
  8. Avoid captive user interfaces.
  9. Make every program a filter.
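
As a tiny sketch combining precepts 2, 5 and 9, here is a hypothetical script that does one thing, works on flat text, and acts as a filter, so it composes with everything else:

    #!/bin/bash
    # stripcomments (hypothetical name): remove comment and blank lines
    # from any flat text config file. Reads stdin, writes stdout.
    grep -vE '^[[:space:]]*(#|$)'

    # Usage: stripcomments < /etc/ssh/sshd_config | wc -l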

Here is one Amazon review:

Yong Zhi 4.0 out of 5 stars February 3, 2009

Everyone is on a learning curve

The author was a programmer before, so in writing this book, he drew both from his personal experience and his observations to depict the software world.
I think this is more of a practice and opinion book rather than a "Philosophy" book; however, I have to agree with him in most cases. For example, here is Mike Gancarz's line of thinking:

1. Hard to get the s/w design right at the first place, no matter who.
2. So it's better to write a short spec without considering all factors first.
3. Build a prototype to test the assumptions
4. Use an iterative test/rewrite process until you get it right
5. Conclusion: Unix evolved from a prototype.

In case you are curious, here are the 9 tenets of Unix/Linux:

1. Small is beautiful.
2. Make each program do one thing well.
3. Build a prototype as soon as possible.
4. Choose portability over efficiency.
5. Store data in flat text files.
6. Use software leverage to your advantage.
7. Use shell scripts to increase leverage and portability.
8. Avoid captive user interfaces.
9. Make every program a filter.

Mike Gancarz told a story like this when he argues "Good programmers write good code; great programmers borrow good code".

"I recall a less-than-top-notch software engineer who couldn't program his way out of a paper bag. He had a knack, however, for knitting lots of little modules together. He hardly ever wrote any of them himself, though. He would just fish around in the system's directories and source code repositories all day long, sniffing for routines he could string together to make a complete program. Heaven forbid that he should have to write any code.

Oddly enough, it wasn't long before management recognized him as an outstanding software engineer, someone who could deliver projects on time and within budget. Most of his peers never realized that he had difficulty writing even a rudimentary sort routine. Nevertheless, he became enormously successful by simply using whatever resources were available to him."

If this is not clear enough, Mike also drew analogies between Mick Jagger and Keith Richards and Elvis. The book is full of inspiring stories to reveal software engineers' tendencies and to correct their mindsets.

The tenets of Unix philosophy in sysadmin domain

The Unix philosophy is described in Wikipedia as follows:

The Unix philosophy, originated by Ken Thompson, is a set of cultural norms and philosophical approaches to minimalist, modular software development. It is based on the experience of leading developers of the Unix operating system. Early Unix developers were important in bringing the concepts of modularity and reusability into software engineering practice, spawning a "software tools" movement. Over time, the leading developers of Unix (and programs that ran on it) established a set of cultural norms for developing software; these norms became as important and influential as the technology of Unix itself; this has been termed the "Unix philosophy."

The Unix philosophy emphasizes building simple, short, clear, modular, and extensible code that can be easily maintained and repurposed by developers other than its creators. The Unix philosophy favors composability as opposed to monolithic design.

... ... ...

The Unix philosophy is documented by Doug McIlroy[1] in the Bell System Technical Journal from 1978:[2]

  1. Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
  2. Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
  3. Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
  4. Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.

It was later summarized by Peter H. Salus in A Quarter-Century of Unix (1994):[1]

  1. Write programs that do one thing and do it well.
  2. Write programs to work together.
  3. Write programs to handle text streams, because that is a universal interface.

In their award-winning[citation needed] Unix paper of 1974, Ritchie and Thompson quote the following design considerations:[3]

  1. Make it easy to write, test, and run programs.
  2. Interactive use instead of batch processing.
  3. Economy and elegance of design due to size constraints ("salvation through suffering").
  4. Self-supporting system: all Unix software is maintained under Unix.

 



Old News ;-)

[Jun 07, 2021] The overall overcomplexity in our civilization

Notable quotes:
"... Main drivers of this overcomplexity are bloated states and economy dominated by corporations. Both states and corporations have IT systems today "and the complexity of those IT systems has to reflect the complexity of organisms and processes they try to cover. " ..."
pragmaticleader.net

by andy, under Uncategorized

Someone has sent me a link to a quite emotional but interesting article by Tim Bray on why the world of enterprise systems delivers so many failed projects and sucky software while the world of web startups excels at producing great software fast.

Tim makes some very valid points about technology, culture and approach to running projects. It is true that huge upfront specs, fixed bid contracts and overall waterfall approach are indeed culprits behind most failed IT projects, and that agile, XP and other key trends of recent years can help.

However, I don't think they can really cure the problem, because we are facing a deeper issue here: the overall overcomplexity in our civilization.

Main drivers of this overcomplexity are bloated states and economy dominated by corporations. Both states and corporations have IT systems today -- and the complexity of those IT systems has to reflect the complexity of organisms and processes they try to cover.

The IT system for a national health care system or a state run compulsory social security "insurance" is a very good example. It must be a complex mess because what it is trying to model and run is a complex, overbloated mess -- in most cases a constantly changing mess. And it can't be launched early because it is useless unless it covers the whole scope of what it is supposed to do: because most of what it covers is regulations and laws, you can't deliver a system that meets half of the regulations or 10% -- it can't be used. By the very nature of the domain the system has to be launched as a finished whole.

Plus, on top of all that, comes the scale. If you can imagine a completely privatized health care, no system will ever cover all citizens -- each doctor, hospital, insurer etc. will cover just its clients, a subset of the population. A system like the NHS has to handle all of the UK's population by design.

Same problem with corporations, especially those that have been around for long (by long I mean decades, not years): scale and mentality. You just can't manage 75 thousand people easily, especially if they are spread around the globe, in a simple and agile way.

Just think of all accounting requirements global corporations have to handle with their IT systems -- but this is just the tip of the iceberg. The whole world economy floats in a sea of legislation -- the legislative diarrhea of the last decades produced a legal swamp which is a nightmare to understand, let alone model a system to comply with it. For a global corporation, multiply that by all the countries it is in and stick some international regulations on top of this. This is something corporate systems have to cope with.

What is also important -- much of that overcomplexity is computer driven: it would not have been possible if not for the existence of IT systems and the computers that run them.

Take VAT tax -- it is so complex I always wonder what idiots gave the Nobel prize to the moron who invented it (well, I used to wonder about that when the Nobel prize had any credibility). Clearly, implementing it is completely impossible without computers & systems everywhere.

Same about the legal diarrhea I mentioned -- I think it can be largely attributed to Microsoft Word. Ever wondered why the EU Constitution (now disguised as the "Lisbon Treaty") has hundreds of pages while the US Constitution is simple and elegant? Well, they couldn't have possibly written a couple hundred page document with a quill pen, which forced them to produce something concise.

But going back to the key issue of whether the corporate IT systems can be better: they can, but a deeper shift in thinking is needed. Instead of creating huge, complex systems, corporate IT should rather be a cloud of simple, small systems built and maintained to provide just one simple service (exactly what web startups are doing -- each of them provides a simple service; together they create a complex ecosystem). However, this shift would have to occur on the organizational level too -- large organizations with complex rules should be replaced with small, focused entities with simple rules for interaction between them.

But to get there we would need a world-wide "agile adoption" reaching well beyond IT. But that means a huge political change, that is nowhere on the horizon. Unless, of course, one other enabler of our civilization's overcomplexity fades: cheap, abundant energy.

[Jun 06, 2021] 5 Types of Over-Complexity by John Downey

Aug 18, 2018 | naimonet.com

Over-complexity describes a tangible or intangible entity that is more complex than it needs to be relative to its use and purpose. Complexity can be measured as the amount of information that is required to fully document an entity. A technology that can be fully described in 500 words is far less complex than a technology that requires at least 5 million words to fully specify. The following are common types of over-complexity.

Accidental Complexity

Accidental complexity is any complexity beyond the minimum required to meet a need. This can be compared to essential complexity, which describes the simplest solution possible for a given need and level of quality. For example, the essential complexity for a bridge that is earthquake resistant and inexpensive to maintain might be contained in an architectural design of 15 pages. If a competing design were to be 100 pages with the same level of quality and functionality, this design can be considered overly complex.

Overthinking

A decision making process that is overly complex, such that it is an inefficient use of time and other resources. Overthinking can also result in missed opportunities. For example, a student who spends three years thinking about what afterschool activity they would like to join instead of just trying a few things to see how they work out. By the time the student finally makes a decision to join a soccer team, they find the other players are far more advanced than themselves.

Gold Plating

Adding additional functions, features and quality to something that adds little or no value. For example, a designer of an air conditioning unit who adds personalized settings for up to six individuals to the user interface. This requires people to install an app to use the air conditioner, such that users typically view the feature as an annoyance. The feature is seldom used and some customers actively avoid the product based on reviews that criticise the feature. The feature also adds to the development cost and unit cost of the product, making it less competitive in the market.

Big Ball of Mud

A big ball of mud is a design that is the product of many incremental changes that aren't coordinated within a common architecture and design. A common example is a city that emerges without any building regulations or urban planning. Big ball of mud is also common in software, where developers reinvent the same services such that code becomes extremely complex relative to its use.

Incomprehensible Communication

Communication complexity is measured by how long it takes you to achieve your communication objectives with an audience. It is common for communication to be overly indirect, with language that is unfamiliar to an audience, such that little gets communicated. Communication complexity is also influenced by how interesting the audience find your speech, text or visualization. For example, an academic who uses needlessly complex speech out of a sense of elitism, or fear of being criticized, may transfer little knowledge to students with a lecture, such that it can be viewed as overly complex.

Notes

Over-complexity can have value to quality of life and culture. If the world was nothing but minimized, plain functionality it would be less interesting.

[Jun 06, 2021] Lasagna Code by lispian

Notable quotes:
"... Lasagna Code is layer upon layer of abstractions, objects and other meaningless misdirections that result in bloated, hard to maintain code all in the name of "clarity". ..."
"... Turbo Pascal v3 was less than 40k. That's right, 40 thousand bytes. Try to get anything useful today in that small a footprint. Most people can't even compile "Hello World" in less than a few megabytes courtesy of our object-oriented obsessed programming styles which seem to demand "lines of code" over clarity and "abstractions and objects" over simplicity and elegance. ..."
Jan 01, 2011 | www.pixelstech.net

Anyone who claims to be even remotely versed in computer science knows what "spaghetti code" is. That type of code still sadly exists. But today we also have, for lack of a better term -- and sticking to the pasta metaphor -- "lasagna code".

Lasagna Code is layer upon layer of abstractions, objects and other meaningless misdirections that result in bloated, hard to maintain code all in the name of "clarity". It drives me nuts to see how bad some code today is. And then you come across how small Turbo Pascal v3 was, and after comprehending it was a full-blown Pascal compiler, one wonders why applications and compilers today are all so massive.

Turbo Pascal v3 was less than 40k. That's right, 40 thousand bytes. Try to get anything useful today in that small a footprint. Most people can't even compile "Hello World" in less than a few megabytes courtesy of our object-oriented obsessed programming styles which seem to demand "lines of code" over clarity and "abstractions and objects" over simplicity and elegance.

Back when I was starting out in computer science I thought by today we'd be writing a few lines of code to accomplish much. Instead, we write hundreds of thousands of lines of code to accomplish little. It's so sad it's enough to make one cry, or just throw your hands in the air in disgust and walk away.

There are bright spots. There are people out there that code small and beautifully. But they're becoming rarer, especially when someone who seemed to have thrived on writing elegant, small, beautiful code recently passed away. Dennis Ritchie understood you could write small programs that did a lot. He comprehended that the algorithm is at the core of what you're trying to accomplish. Create something beautiful and well thought out and people will examine it forever, such as Thompson's version of Regular Expressions!

... ... ...

Source: http://lispian.net/2011/11/01/lasagna-code/

[Jun 06, 2021] Software and the war against complexity

Notable quotes:
"... Stephen Hawking predicted this would be " the century of complexity ." He was talking about theoretical physics, but he was dead right about technology... ..."
"... Any human mind can only encompass so much complexity before it gives up and starts making slashing oversimplifications with an accompanying risk of terrible mistakes. ..."
Jun 05, 2021 | techcrunch.com

...Stephen Hawking predicted this would be "the century of complexity." He was talking about theoretical physics, but he was dead right about technology...

Let's try to define terms. How can we measure complexity? Seth Lloyd of MIT, in a paper which drily begins "The world has grown more complex recently, and the number of ways of measuring complexity has grown even faster," proposed three key categories: difficulty of description, difficulty of creation, and degree of organization. Using those three criteria, it seems apparent at a glance that both our societies and our technologies are far more complex than they ever have been, and rapidly growing even moreso.

The thing is, complexity is the enemy. Ask any engineer... especially a security engineer. Ask the ghost of Steve Jobs. Adding complexity to solve a problem may bring a short-term benefit, but it invariably comes with an ever-accumulating long-term cost. Any human mind can only encompass so much complexity before it gives up and starts making slashing oversimplifications with an accompanying risk of terrible mistakes.

You may have noted that those human minds empowered to make major decisions are often those least suited to grappling with nuanced complexity. This itself is arguably a lingering effect of growing complexity. Even the simple concept of democracy has grown highly complex -- party registration, primaries, fundraising, misinformation, gerrymandering, voter rolls, hanging chads, voting machines -- and mapping a single vote for a representative to dozens if not hundreds of complex issues is impossible, even if you're willing to consider all those issues in depth, which most people aren't.

Complexity theory is a rich field, but it's unclear how it can help ordinary people trying to make sense of their world. In practice, people deal with complexity by coming up with simplified models close enough to the complex reality to be workable. These models can be dangerous -- "everyone just needs to learn to code," "software does the same thing every time it is run," "democracies are benevolent" -- but they were useful enough to make fitful progress.

In software, we at least recognize this as a problem. We pay lip service to the glories of erasing code, of simplifying functions, of eliminating side effects and state, of deprecating complex APIs, of attempting to scythe back the growing thickets of complexity. We call complexity "technical debt" and realize that at least in principle it needs to be paid down someday.

"Globalization should be conceptualized as a series of adapting and co-evolving global systems, each characterized by unpredictability, irreversibility and co-evolution. Such systems lack finalized "˜equilibrium' or "˜order'; and the many pools of order heighten overall disorder," to quote the late John Urry. Interestingly, software could be viewed that way as well, interpreting, say, "the Internet" and "browsers" and "operating systems" and "machine learning" as global software systems.

Software is also something of a best possible case for making complex things simpler. It is rapidly distributed worldwide. It is relatively devoid of emotional or political axegrinding. (I know, I know. I said "relatively.") There are reasonably objective measures of performance and simplicity. And we're all at least theoretically incentivized to simplify it.

So if we can make software simpler -- both its tools and dependencies, and its actual end products -- then that suggests we have at least some hope of keeping the world simple enough such that crude mental models will continue to be vaguely useful. Conversely, if we can't, then it seems likely that our reality will just keep growing more complex and unpredictable, and we will increasingly live in a world of whole flocks of black swans. I'm not sure whether to be optimistic or not. My mental model, it seems, is failing me.

[Jun 06, 2021] Software Complexity Is Killing Us by Justin Etheredge

Jan 29, 2018 | www.simplethread.com


Since the dawn of time (before software, there was only darkness), there has been one constant: businesses want to build software cheaper and faster.

It is certainly an understandable and laudable goal especially if you've spent any time around software developers. It is a goal that every engineer should support wholeheartedly, and we should always strive to create things as efficiently as possible, given the constraints of our situation.

However, the truth is we often don't. It's not intentional, but over time, we get waylaid by unforeseen complexities in building software and train ourselves to seek out edge cases, analysis gaps, all of the hidden repercussions that can result from a single bullet point of requirements.

We get enthralled by the maelstrom of complexity and the mental puzzle of engineering elegant solutions: Another layer of abstraction! DRY it up! Separate the concerns! Composition over inheritance! This too is understandable, but in the process, we often lose sight of the business problems being solved and forget that managing complexity is the second most important responsibility of software developers.

So how did we get here?

Software has become easier in certain ways.

Over the last few decades, our industry has been very successful at reducing the amount of custom code it takes to write most software.

Much of this reduction has been accomplished by making programming languages more expressive. Languages such as Python, Ruby, or JavaScript can take as little as one third as much code as C in order to implement similar functionality. C gave us similar advantages over writing in assembler. Looking forward to the future, it is unlikely that language design will give us the same kinds of improvements we have seen over the last few decades.

But reducing the amount of code it takes to build software involves many other avenues that don't require making languages more expressive. By far the biggest gain we have made in this over the last two decades is open source software (OSS). Without individuals and companies pouring money into software that they give freely to the community, much of what we build today wouldn't be possible without an order of magnitude more cost and effort.

These projects have allowed us to tackle problems by standing on the shoulders of giants, leveraging tools to allow us to focus more of our energy on actually solving business problems, rather than spending time building infrastructure.

That said, businesses are complex. Ridiculously complex and only getting moreso. OSS is great for producing frameworks and tools that we can use to build systems on top of, but for the most part, OSS has to tackle problems shared by a large number of people in order to gain traction. Because of that, most open source projects have to either be relatively generic or be in a very popular niche. Therefore, most of these tools are great platforms on which to build out systems, but at the end of the day, we are still left to build all of the business logic and interfaces in our increasingly complex and demanding systems.

So what we are left with is a stack that looks something like this (for a web application)...

<Our Code>
<Libraries>
<Web Framework>
<Web Server>
<Data Stores>
<Operating System>

That "Our Code" part ends up being enormously complex, since it mirrors the business and its processes. If we have custom business logic, and custom processes, then we are left to build the interfaces, workflow, and logic that make up our applications. Sure, we can try to find different ways of recording that logic (remember business rules engines?), but at the end of the day, no one else is going to write the business logic for your business. There really doesn't seem to be a way around that"¦ at least not until the robots come and save us all from having to do any work.

Don't like code, well how about Low-Code?

So if we have to develop the interfaces, workflow, and logic that make up our applications, then it sounds like we are stuck, right? To a certain extent, yes, but we have a few options.

To most developers, software equals code, but that isn't reality. There are many ways to build software, and one of those ways is through using visual tools. Before the web, visual development and RAD tools had a much bigger place in the market. Tools like PowerBuilder, Visual Foxpro, Delphi, VB, and Access all had visual design capabilities that allowed developers to create interfaces without typing out any code.

These tools spanned the spectrum in terms of the amount of code you needed to write, but in general, you designed your app visually and then ended up writing a ton of code to implement the logic of your app. In many cases you still ended up programmatically manipulating the interface, since interfaces built using these tools often ended up being very static. However, for a huge class of applications, these tools allowed enormous productivity gains over the alternatives, mostly at the cost of flexibility.

The prevalence of these tools might have waned since the web took over, but companies' desire for them has not, especially since the inexorable march of software demand continues. The latest trend that is blowing across the industry is "low code" systems. Low code development tools are a modern term put on the latest generation of drag and drop software development tools. The biggest difference between these tools and their brethren from years past is that they are now mostly web (and mobile) based and are often hosted platforms in the cloud.

And many companies are jumping all over these platforms. Vendors like Salesforce (App Cloud), Outsystems, Mendix, or Kony are promising the ability to create applications many times faster than "traditional" application development. While many of their claims are probably hyperbole, there likely is a bit of truth to them as well. For all of the downsides of depending on platforms like these, they probably do result in certain types of applications being built faster than traditional enterprise projects using .NET or Java.

So, what is the problem?

Well, a few things. First is that experienced developers often hate these tools. Most Serious Developers™ like to write Real Software™ with Real Code™. I know that might sound like I'm pandering to a bunch of whiney babies (and maybe I am a bit), but if the core value you deliver is technology, it is rarely a good idea to adopt tools that your best developers don't want to work with.

Second is that folks like me look at these walled platforms and say "nope, not building my application in there." That is a legitimate concern and the one that bothers me the most.

If you built an application a decade ago with PHP, then that application might be showing its age, but it could still be humming along right now just fine. The language and ecosystem are open source, and maintained by the community. You'll need to keep your application up to date, but you won't have to worry about a vendor deciding it isn't worth their time to support you anymore.

"¦folks like me look at these walled platforms and say "nope, not building my application in there." That is a legitimate concern and the one that bothers me the most.

If you picked a vendor 10 years ago who had a locked down platform, then you might be forced into a rewrite if they shut down or change their tooling too much (remember Parse?). Or even worse, your system gets stuck on a platform that freezes and no longer serves your needs.

There are many reasons to be wary of these types of platforms, but for many businesses, the allure of creating software with less effort is just too strong to pass up. The complexity of software marches on, and we software engineers unfortunately aren't doing ourselves any favors here.

What needs to change?

There are productive platforms out there that allow us to build Real Software™ with Real Code™, but unfortunately our industry right now is far too preoccupied with following the lead of the big tech giants to realize that sometimes their tools don't add much value to our projects.

I can't tell you the number of times I've had a developer tell me that building something as a single page application (SPA) adds no overhead versus just rendering HTML. I've heard developers say that every application should be written on top of a NoSQL datastore, and that relational databases are dead. I've heard developers question why every application isn't written using CQRS and Event Sourcing.

It is that kind of thought process and default overhead that is leading companies to conclude that software development is just too expensive. You might say, "But event sourcing is so elegant! Having a SPA on top of microservices is so clean!" Sure, it can be, but not when you're the person writing all ten microservices. It is that kind of additional complexity that is often so unnecessary.

We, as an industry, need to find ways to simplify the process of building software, without ignoring the legitimate complexities of businesses. We need to admit that not every application out there needs the same level of interface sophistication and operational scalability as Gmail. There is a whole world of apps out there that need well-thought-out interfaces, complicated logic, solid architectures, smooth workflows, etc. -- but don't need microservices or AI or chatbots or NoSQL or Redux or Kafka or Containers or whatever the tool du jour is.

A lot of developers right now seem to be so obsessed with the technical wizardry of it all that they can't step back and ask themselves if any of this is really needed.

It is like the person on MasterChef who comes in and sells themselves as the molecular gastronomist. They separate ingredients into their constituent parts, use scientific methods of pairing flavors, and then apply copious amounts of CO2 and liquid nitrogen to produce the most creative foods you've ever seen. And then they get kicked off after an episode or two because they forget the core tenet of most cooking, that food needs to taste good. They seem genuinely surprised that no one liked their fermented fennel and mango-essence pearls served over cod with anchovy foam.

Our obsession with flexibility, composability, and cleverness is causing us a lot of pain and pushing companies away from the platforms and tools that we love. I'm not saying those tools I listed above don't add value somewhere; they arose in response to real pain points, albeit typically problems encountered by large companies operating systems at enormous scale.

What I'm saying is that we need to head back in the direction of simplicity and start actually creating things in a simpler way, instead of just constantly talking about simplicity. Maybe we can lean on more integrated tech stacks to provide out-of-the-box patterns and tools that allow software developers to create software more efficiently.

"¦we are going to push more and more businesses into the arms of "low code" platforms and other tools that promise to reduce the cost of software by dumbing it down and removing the parts that brought us to it in the first place.

We need to stop pretending that our 20th line-of-business application is some unique tapestry that needs to be carefully hand-sewn.

Staying Focused on Simplicity

After writing that, I can already hear a million developers sharpening their pitchforks, but I believe that if we keep pushing in the direction of wanting to write everything, configure everything, compose everything, use the same stack for every scale of problem, then we are going to push more and more businesses into the arms of "low code" platforms and other tools that promise to reduce the cost of software by dumbing it down and removing the parts that brought us to it in the first place.

Our answer to the growing complexity of doing business cannot be adding complexity to the development process -- no matter how elegant it may seem.

We must find ways to manage complexity by simplifying the development process. Because even though managing complexity is our second most important responsibility, we must always remember the most important responsibility of software developers: delivering value through working software.


[Jun 06, 2021] Software Engineering: the war against complexity by Jurgen J. Vinju

Feb 24, 2015 | homepages.cwi.nl

Common situations are:
- lack of control, leading to unbounded growth
- lack of predictability, leading to unbounded cost
- lack of long-term perspective, leading to ill-informed decisions

complex software is the enemy of quality

Complicated = many interrelated parts:
- linear: small change = small impact
- predictable: straight flow, local failure
- decomposable: manageable

Complex = unpredictable & hard to manage:
- emergent: whole is more than sum
- non-linear: small change = big impact?
- cascading failure
- hysteresis: you must understand its history
- indivisible

" Refactoring is improving internal quality " reducing complexity " without changing functionality.

[Jun 06, 2021] Reducing Complexity

Notable quotes:
"... Overcomplexity is when a system, organization, structure or process is unnecessarily difficult to analyze, solve or make sense of. ..."
Jun 02, 2021 | www.tipt.com

In the pharmaceutical industry, accuracy and attention to detail are important. Focusing on these things is easier with simplicity, yet in the pharmaceutical industry overcomplexity is common, which can lead to important details getting overlooked. However, many companies are trying to address this issue.

In fact, 76% of pharmaceutical execs believe that reducing complexity leads to sustainable cost reductions. Read on for some of the ways that overcomplexity harms pharmaceutical companies and what is being done to remedy it.

1. What Students in Pharmaceutical Manufacturing Training Should Know About Overcomplexity's Origins

Overcomplexity is when a system, organization, structure or process is unnecessarily difficult to analyze, solve or make sense of. In pharmaceutical companies, this is a major issue and hindrance to the industry as a whole. Often, overcomplexity is the byproduct of innovation and progress, which, despite their obvious advantages, can lead to an organization developing too many moving parts.

For example, new forms of collaboration as well as scientific innovation can cause overcomplexity because any time something is added to a process, it becomes more complex. Increasing regulatory scrutiny can also add complexity, as this feedback can focus on symptoms rather than the root of an issue.

2. Organizational Overhead Can Lead to Too Much Complexity

Organizational complexity occurs when too many personnel are added, in particular department heads. After pharmaceutical manufacturing training you will work on teams that can benefit from being lean. Increasing overhead is often done to improve data integrity. For example, if a company notices an issue with data integrity, they often create new roles for overseeing data governance.

Any time personnel are added for oversight, there is a risk of increased complexity at shop floor level. Fortunately, some companies are realizing that the best way to deal with issues of data integrity is by improving data handling within departments themselves, rather than adding new layers of overhead -- and complexity.

3. Quality Systems Can Create a Backlog

A number of pharmaceutical sites suffer from a backlog of Corrective and Preventive Actions (CAPAs). CAPAs are in place to improve conformities and quality, and they follow the Good Manufacturing Practices you know about from pharmaceutical manufacturing courses. However, many of these sit open until there are too many of them to catch up on.

Backlog that is close to 10 percent of the total number of investigations per year points to a serious issue with the company's system. Some companies are dealing with this backlog by introducing a risk-based, triaged approach. Triaging allows companies to focus on the most urgent deviations and CAPAs, thus reducing this key issue of overcomplexity in the pharmaceutical industry.

4. Pharmaceutical Manufacturing Diploma Grads Should Know What Can Help

Some strategies are being adopted to address the root problems of overcomplexity. Radical simplification, for example, is a way to target what is fundamentally wrong with overly complex organizations and structures. This is a method of continuously improving data and performance that focuses on improving processes.

Cognitive load reduction is another way to reduce complexity; it looks at forms and documents and attempts to reduce the effort used when working with them. In reducing the effort required to perform tasks and fill out forms, more can be accomplished by a team.

Finally, auditors can help reduce complexity by assessing the health of a company's quality systems, such as assessing how many open CAPAs exist. Understanding these different solutions to overcomplexity could help you excel in your career after your courses.

Are you interested in getting your pharmaceutical manufacturing diploma?

[Jun 02, 2021] Linux and the Unix Philosophy by Gancarz, Mike

Jun 02, 2021 | www.amazon.com


Yong Zhi

Everyone is on a learning curve

4.0 out of 5 stars. Reviewed in the United States on February 3, 2009. The author was a programmer before, so in writing this book he drew both from his personal experience and his observations to depict the software world.

I think this is more of a practice-and-opinion book than a "philosophy" book; however, I have to agree with him in most cases.

For example, here is Mike Gancarz's line of thinking:

1. It is hard to get the s/w design right the first time, no matter who does it.
2. So it's better to first write a short spec without considering all factors.
3. Build a prototype to test the assumptions
4. Use an iterative test/rewrite process until you get it right
5. Conclusion: Unix evolved from a prototype.

In case you are curious, here are the 9 tenets of Unix/Linux:

1. Small is beautiful.
2. Make each program do one thing well.
3. Build a prototype as soon as possible.
4. Choose portability over efficiency.
5. Store data in flat text files.
6. Use software leverage to your advantage.
7. Use shell scripts to increase leverage and portability.
8. Avoid captive user interfaces.
9. Make every program a filter.
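As a minimal illustration of tenets 2, 8 and 9 in practice (my example, not Gancarz's; the log file name is a stand-in), here is the kind of pipeline these rules describe: several small programs, each doing one thing, composed as filters over flat text.

    # Report the ten most frequent client IPs in a web server log.
    # Each stage is a small filter reading stdin and writing stdout.
    awk '{print $1}' access.log |  # extract the client IP field
        sort |                     # group identical IPs together
        uniq -c |                  # count occurrences of each IP
        sort -rn |                 # order by count, descending
        head -10                   # keep the top ten

None of the five programs knows anything about the others; the pipe is their only interface, which is exactly the kind of leverage tenet 6 is talking about.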

Mike Gancarz tells a story like this when he argues that "good programmers write good code; great programmers borrow good code":

"I recall a less-than-top-notch software engineer who couldn't program his way out of a paper bag. He had a knack, however, for knitting lots of little modules together. He hardly ever wrote any of them himself, though. He would just fish around in the system's directories and source code repositories all day long, sniffing for routines he could string together to make a complete program. Heaven forbid that he should have to write any code. Oddly enough, it wasn't long before management recognized him as an outstanding software engineer, someone who could deliver projects on time and within budget. Most of his peers never realized that he had difficulty writing even a rudimentary sort routine. Nevertheless, he became enormously successful by simply using whatever resources were available to him."

If this is not clear enough, Mike also draws analogies between Mick Jagger and Keith Richards and Elvis. The book is full of inspiring stories that reveal software engineers' tendencies and correct their mindsets.

[Jun 02, 2021] The Poetterisation of GNU-Linux

10, 2013 | www.slated.org

I've found a disturbing trend in GNU/Linux, where largely unaccountable cliques of developers unilaterally decide to make fundamental changes to the way it works, based on highly subjective and arrogant assumptions, then forge ahead with little regard to those who actually use the software, much less the well-established principles upon which that OS was originally built. The long litany of examples includes Ubuntu Unity, Gnome Shell, KDE 4, the /usr partition, SELinux, PolicyKit, Systemd, udev and PulseAudio, to name a few.

I hereby dub this phenomenon the "Poetterisation of GNU/Linux".

The broken features, creeping bloat, and in particular the unhealthy tendency toward more monolithic, less modular code in certain Free Software projects are a very serious problem, and one I am very seriously opposed to. I abandoned Windows to get away from that sort of nonsense; I didn't expect to have to deal with it in GNU/Linux.

Clearly this situation is untenable.

The motivation for these arbitrary changes mostly seems to be rooted in the misguided concept of "popularity", which makes no sense at all for something that's purely academic and non-commercial in nature. More users does not equal more developers. Indeed more developers does not even necessarily equal more or faster progress. What's needed is more of the right sort of developers, or at least more of the existing developers to adopt the right methods.

This is the problem with distros like Ubuntu, as the most archetypal example. Shuttleworth pushed hard to attract more users, with heavy marketing and by making Ubuntu easy at all costs, but in so doing all he did was amass a huge burden, in the form of a large influx of users who were, by and large, purely consumers, not contributors.

As a result, many of those now using GNU/Linux are really just typical Microsoft or Apple consumers, with all the baggage that entails. They're certainly not assets of any kind. They have expectations forged in a world of proprietary licensing and commercially-motivated, consumer-oriented, Hollywood-style indoctrination, not academia. This is clearly evidenced by their belligerently hostile attitudes toward the GPL, FSF, GNU and Stallman himself, along with their utter contempt for security and other well-established UNIX paradigms, and their unhealthy predilection for proprietary software, meaningless aesthetics and hype.

Reading the Ubuntu forums is an exercise in courting abject despair, as one witnesses an ignorant horde demand GNU/Linux be mutated into the bastard son of Windows and Mac OS X. And Shuttleworth, it seems, is only too happy to oblige, eagerly assisted by his counterparts on other distros and upstream projects, such as Lennart Poettering and Richard Hughes, the former of whom has somehow convinced every distro to mutate the Linux startup process into a hideous monolithic blob, and the latter of whom successfully managed to undermine 40 years of UNIX security in a single stroke, by obliterating the principle that unprivileged users should not be allowed to install software system-wide.

GNU/Linux does not need such people; indeed it needs to get rid of them as a matter of extreme urgency. This is especially true when those people are former (or even current) Windows programmers, because they not only bring with them their indoctrinated expectations, misguided ideologies and flawed methods, but worse still they actually implement them, thus destroying GNU/Linux from within.

Perhaps the most startling example of this was the Mono and Moonlight projects, which not only burdened GNU/Linux with all sorts of "IP" baggage, but instigated a sort of invasion of Microsoft "evangelists" and programmers, like a Trojan horse, who subsequently set about stuffing GNU/Linux with as much bloated, patent encumbered garbage as they could muster.

I was part of a group who campaigned relentlessly for years to oust these vermin and undermine support for Mono and Moonlight, and we were largely successful. Some have even suggested that my diatribes, articles and debates (with Miguel de Icaza and others) were instrumental in securing this victory, so clearly my efforts were not in vain.

Amassing a large user-base is a highly misguided aspiration for a purely academic field like Free Software. It really only makes sense if you're a commercial enterprise trying to make as much money as possible. The concept of "market share" is meaningless for something that's free (in the commercial sense).

Of course Canonical is also a commercial enterprise, but it has yet to break even, and all its income is derived through support contracts and affiliate deals, none of which depends on having a large number of Ubuntu users (the Ubuntu One service is cross-platform, for example).

What GNU/Linux needs is a small number of competent developers producing software to a high technical standard, who respect the well-established UNIX principles of security, efficiency, code correctness, logical semantics, structured programming, modularity, flexibility and engineering simplicity (a.k.a. the KISS Principle), just as any scientist or engineer in the field of computer science and software engineering should.

What it doesn't need is people who shrug their shoulders and bleat "disks are cheap".

[Jun 02, 2021] The Linux Philosophy for SysAdmins- And Everyone Who Wants To Be One, by Both, David

Notable quotes:
"... The author instincts on sysadmin related issues are mostly right: he is suspicious about systemd and another perversions in modern Linuxes, he argues for simplicity in software, and he warns us about PHBs problem in IT departments, points out for the importance of documentation. etc. ..."
"... maybe it is the set of topics that the author discusses is the main value of the book. ..."
"... in many cases, the right solution is to avoid those subsystems or software packages like the plague and use something simpler. Recently, avoiding Linux flavors with systemd also can qualify as a solution ;-) ..."
"... For example, among others, the author references a rare and underappreciated, but a very important book "Putt's Law and the Successful Technocrat: How to Win in the Information Age by Archibald Putt (2006-04-28)". From which famous Putt's Law "Technology is dominated by two types of people, those who understand what they do not manage and those who manage what they do not understand," and Putt's Corollary: "Every technical hierarchy, in time, develop a competence inversion" were originated. This reference alone is probably worth half-price of the book for sysadmins, who never heard about Putt's Law. ..."
"... Linux (as of monstrous RHEL 7 with systemd, network manager and other perversions, which raised the complexity of the OS at least twice) became a way to complex for a human brain. It is impossible to remember all the important details and lessons learned from Internet browsing, your SNAFU and important tickets. Unless converted into private knowledgebase, most of such valuable knowledge disappears, say, in six months or so. And the idea of using corporate helpdesk as a knowledge database is in most cases a joke. ..."
Nov 02, 2018 | www.amazon.com

skeptic. Reviewed in the United States on November 2, 2018. 5.0 out of 5 stars.

Some valuable tips. Can serve as fuel for your own thoughts.

This book is probably most interesting for people who could do well without it -- seasoned sysadmins and educators.

Please ignore the word "philosophy" in the title. Most sysadmins do not want to deal with "philosophy" ;-). And this book does not rise to the level of philosophy in any case. It is just a collection of valuable (and not so valuable) tips from the author's career as a sysadmin of a small lab, thinly dispersed over 500 pages. Each chapter can serve as fuel for your own thoughts. The author's instincts on sysadmin-related issues are mostly right: he is suspicious of systemd and other perversions in modern Linuxes, he argues for simplicity in software, he warns us about the PHB problem in IT departments, and he points out the importance of documentation, etc.

In some cases I disagreed with the author, or viewed his treatment of the topic as somewhat superficial, but still, his points created the kind of "virtual discussion" that has a value of its own. And maybe the set of topics that the author discusses is the main value of the book.

I would classify this book as a "tips" book, in which the author shares his approach to this or that problem (sometimes IMHO wrong, but still interesting ;-), as distinct from the more numerous and often boring, but much better-selling, class of "how-to" books. The latter explain in gory detail how to deal with a particular complex Unix/Linux subsystem, or a particular role (for example, system administrator of Linux servers). But in many cases the right solution is to avoid those subsystems or software packages like the plague and use something simpler. Recently, avoiding Linux flavors with systemd can also qualify as a solution ;-)

This book is different. It is mostly about how to approach some typical system tasks that arise at the level of a small lab (that the lab is small is clear from the coverage of backups). The author advances the important idea of experimentation as a way of solving problems and optimizing your existing setup and work habits.

The book contains an overview of good practices for using some essential sysadmin tools such as screen and sudo. In the last chapter, the author even briefly mentions (just mentions) a very important social problem -- the problem of micromanagers. The latter is a real cancer in Unix departments of large corporations (and not only in Unix departments).

All chapters contain a "webliography" at the end, adding to the value of the book. While the Kindle version of the book is badly formatted for PC (it is OK on a Samsung 10" tablet, which I would recommend for reading instead), the references in the Kindle version are clickable. And reading them along with the book, including the author's articles at opensource.com, enhances the book's value greatly.

For example, among others, the author references a rare and underappreciated, but very important book, "Putt's Law and the Successful Technocrat: How to Win in the Information Age" by Archibald Putt (2006), from which the famous Putt's Law -- "Technology is dominated by two types of people, those who understand what they do not manage and those who manage what they do not understand" -- and Putt's Corollary -- "Every technical hierarchy, in time, develops a competence inversion" -- originated. This reference alone is probably worth half the price of the book for sysadmins who have never heard of Putt's Law.

Seasoned sysadmins can probably just skim Parts I-III (IMHO those chapters are somewhat simplistic). For example, you can skip the introduction to the author's Linux philosophy, his views on contributing to open source, and similar chapters that contain trivial information. I would start reading the book from Part IV (Becoming Zen), which consists of almost a dozen interesting topics. Each of them is covered very briefly (which is a drawback), but they can serve as starters for your own thought process and your own research. The selection of topics is very good and IMHO constitutes the main value of the book.

For example, the author raises a very important issue in Chapter 20: Document Everything, but unfortunately this chapter is too brief, and he does not address the most important thing: a sysadmin should work out some way to organize his or her personal knowledge, for example as a private website. Maintenance of such a private knowledge base is a crucial instrument of any Linux sysadmin worth his/her salary, and a part of daily tasks probably worth 10% of a sysadmin's time. The quote "Those who cannot learn from history are doomed to repeat it" has a very menacing meaning in the sysadmin world.

Linux (as of monstrous RHEL 7 with systemd, network manager and other perversions, which raised the complexity of the OS at least twice) became way too complex for a human brain. It is impossible to remember all the important details and lessons learned from Internet browsing, your SNAFUs and important tickets. Unless converted into a private knowledge base, most of such valuable knowledge disappears in, say, six months or so. And the idea of using the corporate helpdesk as a knowledge database is in most cases a joke.

The negative part of the book is that the author spreads himself too thin and tries to cover too much ground, which means that the treatment of most topics becomes superficial. Also, the provided shell script examples are written in classic shell style, not Bash 4.x-style code. That helps portability (if you need it) but does not help you learn the new features of bash 4.x. Bash is now available on most Unixes, such as AIX, Solaris and HP-UX, and that solves portability issues in a different, and more productive, way. Portability was killed by systemd anyway, unless you want to write wrappers for systemctl-related functions ;-)
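To make that contrast concrete, here is a hedged sketch (mine, not the book's; logins.txt, one login name per line, is assumed): both fragments count distinct login names, the first in classic portable style that any POSIX shell can run, the second with a bash 4.x associative array, which keeps the state inside the shell but ties the script to bash.

    # Classic portable style: external filters do the work.
    sort -u logins.txt | wc -l

    # bash 4.x style: an associative array holds the state in-process.
    declare -A seen
    while read -r user; do
        seen[$user]=1            # remember each distinct login name
    done < logins.txt
    echo "${#seen[@]}"           # print the number of distinct keys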

For an example of the author's writing, please search for his recent (Oct 30, 2018) article "Working with data streams on the Linux command line". That might give you a better idea of what to expect.

In my view, the book contains enough wisdom to be worth $32 (the Kindle edition price), especially if you can buy it at company expense :-). The book is also valuable for educators. Again, the most interesting part is Part IV:

Part IV: Becoming Zen 325

Chapter 17: Strive for Elegance 327

Hardware Elegance 327
The PC8 328
Motherboards 328
Computers 329
Data Centers 329
Power and Grounding 330
Software Elegance 331
Fixing My Web Site 336
Removing Cruft 338
Old or Unused Programs 338
Old Code In Scripts 342
Old Files 343
A Final Word 350

Chapter 18: Find the Simplicity 353

Complexity in Numbers 353
Simplicity In Basics 355
The Never-Ending Process of Simplification 356
Simple Programs Do One Thing 356
Simple Programs Are Small 359
Simplicity and the Philosophy 361
Simplifying My Own Programs 361
Simplifying Others' Programs 362
Uncommented Code 362
Hardware 367
Linux and Hardware 368
The Quandary. 369
The Last Word

Chapter 19: Use Your Favorite Editor 371

More Than Editors 372
Linux Startup 372
Why I Prefer SystemV 373
Why I Prefer systemd 373
The Real Issue 374
Desktop 374
sudo or Not sudo 375
Bypass sudo 376
Valid Uses for sudo 378
A Few Closing Words 379

Chapter 20: Document Everything 381

The Red Baron 382
My Documentation Philosophy 383
The Help Option 383
Comment Code Liberally 384
My Code Documentation Process 387
Man Pages 388
Systems Documentation 388
System Documentation Template 389
Document Existing Code 392
Keep Docs Updated 393
File Compatibility 393
A Few Thoughts 394

Chapter 21: Back Up Everything - Frequently 395

Data Loss 395
Backups to the Rescue 397
The Problem 397
Recovery 404
Doing It My Way 405
Backup Options 405
Off-Site Backups 413
Disaster Recovery Services 414
Other Options 415
What About the "Frequently" Part? 415
Summary 415

Chapter 22: Follow Your Curiosity 417

Charlie 417
Curiosity Led Me to Linux 418
Curiosity Solves Problems 423
Securiosity 423
Follow Your Own Curiosity 440
Be an Author 441
Failure Is an Option 441
Just Do It 442
Summary 443

Chapter 23: There Is No Should 445

There Are Always Possibilities 445
Unleashing the Power 446
Problem Solving 447
Critical Thinking 449
Reasoning to Solve Problems 450
Integrated Reason 453
Self-Knowledge 455
Finding Your Center 455
The Implications of Diversity 456
Measurement Mania 457
The Good Manager 458
Working Together 458
Silo City 460
The Easy Way 461
Thoughts 462

Chapter 24: Mentor the Young SysAdmins 463

Hiring the Right People 464
Mentoring 465
Bruce the Mentor 466
The Art of Problem Solving 467
The Five Steps of Problem Solving 467
Knowledge 469
Observation 469
Reasoning 472
Action 473
Test 473
Example 474
Iteration 475
Concluding Thoughts 475

Chapter 25: Support Your Favorite Open Source Project 477

Project Selection 477
Code 478
Test 479
Submit Bug Reports 479
Documentation 480
Assist 481
Teach 482
Write 482
Donate 483
Thoughts 484
Chapter 26: Reality Bytes 485
People 485
The Micromanager 486
More Is Less 487
Tech Support Terror 488
You Should Do It My Way 489
It's OK to Say No 490
The Scientific Method 490
Understanding the Past 491
Final Thoughts 492

[Jun 02, 2021] Simplicity is the core of a good infrastructure by Steve Webb

Dec 04, 2011 | www.badcheese.com

I've seen many infrastructures in my day. I work for a company with a very complicated infrastructure now. They've got a dev/stage/prod environment for every product (and they've got many of them). Trust is not a word spoken lightly here. There is no 'trust' even for sysadmins (I've been working here for 7 months now and still don't have production sudo access). Developers constantly complain about not having the access they need to do their jobs, and there are multiple failures a week that can only be fixed by the small handful of people who know the (very complex) systems in place. Not only that, but in order to save work, they've used every cutting-edge piece of software they can get their hands on (mainly to learn it so they can put it on their resumes, I assume), but this causes more complexity that only a handful of people can manage. As a result, the site uptime is (in a good month) three nines at best.

In my last position (pronto.com) I put together an infrastructure that any idiot could maintain. I used unmanaged switches behind a load-balancer/firewall and a few VPNs around to the different sites. It was simple. It had very little complexity, and a new sysadmin could take over in a very short time if I were to be hit by a bus. A single person could run the network and servers and if the documentation was lost, a new sysadmin could figure it out without much trouble.

Over time, I handed off my ownership of many of the Infrastructure components to other people in the operations group and of course, complexity took over. We ended up with a multi-tier network with bunches of VLANs and complexity that could only be understood with charts, documentation and a CCNA. Now the team is 4+ people and if something happens, people run around like chickens with their heads cut off not knowing what to do or who to contact when something goes wrong.

Complexity kills productivity. Security is inversely proportional to usability. Keep it simple, stupid. These are all rules to live by in my book.

Downtimes: Beatport: 1-2 hours of downtime for the main site per month is not unusual.

Pronto: several 10-15 minute outages a year.
Pronto (under my supervision): a few seconds a month (mostly human error though, no mechanical failure).

[Jun 02, 2021] The System Standards Stockholm Syndrome

John Waclawsky (from Cisco's mobile solutions group), coined the term S4 for "Systems Standards Stockholm Syndrome" - like hostages becoming attached to their captors, systems standard participants become wedded to the process of setting standards for the sake of standards.
It looks like the paper has disappeared, but there is a book by the same author: "QoS: Myths and Hype" by John G. Waclawsky (eBook, ISBN 9781452463964, Rakuten Kobo).
Notable quotes:
"... The "Stockholm Syndrome" describes the behavior of some hostages. The "System Standards Stockholm Syndrome" (S4) describes the behavior of system standards participants who, over time, become addicted to technology complexity and hostages of group thinking. ..."
"... What causes S4? Captives identify with their captors initially as a defensive mechanism, out of fear of intellectual challenges. Small acts of kindness by the captors, such as granting a secretarial role (often called a "chair") to a captive in a working group are magnified, since finding perspective in a systems standards meeting, just like a hostage situation, is by definition impossible. Rescue attempts are problematic, since the captive could become mentally incapacitated by suddenly being removed from a codependent environment. ..."
Jul 22, 2005 | hxr.us

grumpOps

Fri Jul 22 13:56:52 EDT 2005
Category [ Internet Politics ]

This was sent to me by a colleague. From "S4 -- The System Standards Stockholm Syndrome" by John G. Waclawsky, Ph.D.:

The "Stockholm Syndrome" describes the behavior of some hostages. The "System Standards Stockholm Syndrome" (S4) describes the behavior of system standards participants who, over time, become addicted to technology complexity and hostages of group thinking.

Read the whole thing over at BCR.

And while this particularly picks on the ITU types, it should hit close to home to a whole host of other "endeavors".

IMS & Stockholm Syndrome - Light Reading

12:45 PM -- While we flood you with IMS-related content this week, perhaps it's sensible to share some airtime with a clever warning about being held "captive" to the hype.

This warning comes from John G. Waclawsky, PhD, senior technical staff, Wireless Group, Cisco Systems Inc. (Nasdaq: CSCO). Waclawsky, writing in the July issue of Business Communications Review, compares the fervor over IMS to the "Stockholm Syndrome," a term that comes from a 1973 hostage event in which hostages became sympathetic to their captors.

Waclawsky says a form of the Stockholm Syndrome has taken root in technical standards groups, which he calls "System Standards Stockholm Syndrome," or S4.

Here's a snippet from Waclawsky's column:

What causes S4? Captives identify with their captors initially as a defensive mechanism, out of fear of intellectual challenges. Small acts of kindness by the captors, such as granting a secretarial role (often called a "chair") to a captive in a working group are magnified, since finding perspective in a systems standards meeting, just like a hostage situation, is by definition impossible. Rescue attempts are problematic, since the captive could become mentally incapacitated by suddenly being removed from a codependent environment.

The full article can be found here -- R. Scott Raynovich, US Editor, Light Reading

VoIP and ENUM

Sunday, August 07, 2005. S4 - The Systems Standards Stockholm Syndrome. John Waclawsky, part of the Mobile Wireless Group at Cisco Systems, features an interesting article in the July 2005 issue of the Business Communications Review on The Systems Standards Stockholm Syndrome. Since his responsibilities include standards activities (WiMAX, IETF, OMA, 3GPP and TISPAN), identification of product requirements and the definition of mobile wireless and broadband architectures, he seems to know very well what he is talking about, namely the IP Multimedia Subsystem (IMS). See also his article in the June 2005 issue, IMS 101 - What You Need To Know Now.

See also the Wikedpedia glossary from Martin below:

IMS. Internet Monetisation System. A minor adjustment to Internet Protocol to add a "price" field to packet headers. Earlier versions referred to Innovation Minimisation System. This usage is now deprecated. (Expected release Q2 2012, not available in all markets, check with your service provider in case of sudden loss of unmediated connectivity.)
It is so true that I have to cite it completely (bold emphasis added):

The "Stockholm Syndrome" describes the behavior of some hostages. The "System Standards Stockholm Syndrome" (S 4 ) describes the behavior of system standards participants who, over time, become addicted to technology complexity and hostages of group thinking.

Although the original name derives from a 1973 hostage incident in Stockholm, Sweden, the expanded name and its acronym, S4, applies specifically to systems standards participants who suffer repeated exposure to cult dogma contained in working group documents and plenary presentations. By the end of a week in captivity, Stockholm Syndrome victims may resist rescue attempts, and afterwards refuse to testify against their captors. In system standards settings, S4 victims have been known to resist innovation and even refuse to compete against their competitors.

Recent incidents involving too much system standards attendance have resulted in people being captured by radical ITU-like factions known as the 3GPP or 3GPP2.

I have to add, of course, ETSI TISPAN, and it seems that the syndrome is also spreading into the IETF, especially to SIP and SIPPING.

The victims evolve to unwitting accomplices of the group as they become immune to the frustration of slow plodding progress, thrive on complexity and slowly turn a blind eye to innovative ideas. When released, they continue to support their captors in filtering out disruptive innovation, and have been known to even assist in the creation and perpetuation of bureaucracy.

Years after intervention and detoxification, they often regret their system standards involvement. Today, I am afraid that S4 cases occur regularly at system standards organizations.

What causes S4? Captives identify with their captors initially as a defensive mechanism, out of fear of intellectual challenges. Small acts of kindness by the captors, such as granting a secretarial role (often called a "chair") to a captive in a working group, are magnified, since finding perspective in a systems standards meeting, just like a hostage situation, is by definition impossible. Rescue attempts are problematic, since the captive could become mentally incapacitated by suddenly being removed from a codependent environment.

It's important to note that these symptoms occur under tremendous emotional and/or physical duress due to lack of sleep and abusive travel schedules. Victims of S4 often report the application of other classic "cult programming" techniques, including:

  1. The encouraged ingestion of mind-altering substances. Under the influence of alcohol, complex systems standards can seem simpler and almost rational.
  2. "Love-fests" in which victims are surrounded by cultists who feign an interest in them and their ideas. For example, "We'd love you to tell us how the Internet would solve this problem!"
  3. Peer pressure. Professional, well-dressed individuals with standing in the systems standards bureaucracy often become more attractive to the captive than the casual sorts commonly seen at IETF meetings.

Back in their home environments, S4 victims may justify continuing their bureaucratic behavior, often rationalizing and defending their system standard tormentors, even to the extent of projecting undesirable system standard attributes onto component standards bodies. For example, some have been heard murmuring, "The IETF is no picnic and even more bureaucratic than 3GPP or the ITU," or, "The IEEE is hugely political." (For more serious discussion of component and system standards models, see "Closed Architectures, Closed Systems And Closed Minds," BCR, October 2004.)

On a serious note, the ITU's IMS (IP Multimedia Subsystem) shows every sign of becoming the latest example of systems standards groupthink. Its concepts are more than seven years old and still not deployed, while its release train lengthens with functional expansions and change requests. Even a cursory inspection of the IMS architecture reveals the complexity that results from:

  1. decomposing every device into its most granular functions and linkages; and
  2. tracking and controlling every user's behavior and related billing.

The proliferation of boxes and protocols, and the state management required for data tracking and control, lead to cognitive overload but little end user value.

It is remarkable that engineers who attend system standards bodies and use modern Internet- and Ethernet-based tools don't apply to their work some of the simplicity learned from years of Internet and Ethernet success: to build only what is good enough, and as simply as possible.

Now here I have to break in: I think the syndrome is also spreading to the IETF, because the IETF is starting to leave these principles behind -- especially in SIP and SIPPING, not to mention the Session Border Confuser (SBC).

The lengthy and detailed effort that characterizes systems standards sometimes produces a bit of success, as the 18 years of GSM development (1980 to 1998) demonstrate. Yet such successes are highly optimized, very complex and thus difficult to upgrade, modify and extend.

Email is a great example. More than 15 years of popular email usage have passed, and today email on wireless is just beginning to approach significant usage by ordinary people.

The IMS is being hyped as a way to reduce the difficulty of integrating new services, when in fact it may do just the opposite. IMS could well inhibit new services integration due to its complexity and related impacts on cost, scalability, reliability, OAM, etc.

Not to mention the sad S4 effects on all those engineers participating in IMS-related standards efforts.

Here the Wikedpedia glossary from Martin Geddes (Telepocalypse), quoted above, fits in very well.

[Jun 02, 2021] The Basics of the Unix Philosophy - programming

Jun 02, 2021 | www.reddit.com

Gotebe 3 years ago

Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.

By now, and to be frank for the last 30 years too, this is complete and utter bollocks. Feature creep is everywhere; typical shell tools are chock-full of spurious additions, from formatting to "side" features, all half-assed and barely, if at all, consistent.

Nothing can resist feature creep.

not_perfect_yet 3 years ago

It's still a good idea. It's become very rare though. Many problems we have today are a result of not following it.

name_censored_ 3 years ago

By now, and to be frank in the last 30 years too, this is complete and utter bollocks.

There is not one single other idea in computing that is as unbastardised as the unix philosophy - given that it's been around fifty years. Heck, Microsoft only just developed PowerShell - and if that's not Microsoft's take on the Unix philosophy, I don't know what is.

In that same time, we've vacillated between thick and thin computing (mainframes, thin clients, PCs, cloud). We've rebelled against at least four major schools of program design thought (structured, procedural, symbolic, dynamic). We've had three different database revolutions (RDBMS, NoSQL, NewSQL). We've gone from grassroots movements to corporate dominance on countless occasions (notably - the internet, IBM PCs/Wintel, Linux/FOSS, video gaming). In public perception, we've run the gamut from clerks ('60s-'70s) to boffins ('80s) to hackers ('90s) to professionals ('00s post-dotcom) to entrepreneurs/hipsters/bros ('10s "startup culture").

It's a small miracle that iproute2 only has formatting options and grep only has --color. If they feature-crept anywhere near the same pace as the rest of the computing world, they would probably be a RESTful SaaS microservice with ML-powered autosuggestions.

badsectoracula 3 years ago

This is because adding a new features is actually easier than trying to figure out how to do it the Unix way - often you already have the data structures in memory and the functions to manipulate them at hand, so adding a --frob parameter that does something special with that feels trivial.

GNU and their stance to ignore the Unix philosophy (AFAIK Stallman said at some point he didn't care about it) while becoming the most available set of tools for Unix systems didn't help either.


ILikeBumblebees 3 years ago

Feature creep is everywhere

No, it certainly isn't. There are tons of well-designed, single-purpose tools available for all sorts of purposes. If you live in the world of heavy, bloated GUI apps, well, that's your prerogative, and I don't begrudge you it, but just because you're not aware of alternatives doesn't mean they don't exist.

typical shell tools are chock-full of spurious additions,

What does "feature creep" even mean with respect to shell tools? If they have lots of features, but each function is well-defined and invoked separately, and still conforms to conventional syntax, uses stdio in the expected way, etc., does that make it un-Unixy? Is BusyBox bloatware because it has lots of discrete shell tools bundled into a single binary? nirreskeya 3 years ago

Zawinski's Law :)

waivek 3 years ago

The (anti) foreword by Dennis Ritchie -

I have succumbed to the temptation you offered in your preface: I do write you off as envious malcontents and romantic keepers of memories. The systems you remember so fondly (TOPS-20, ITS, Multics, Lisp Machine, Cedar/Mesa, the Dorado) are not just out to pasture, they are fertilizing it from below.

Your judgments are not keen, they are intoxicated by metaphor. In the Preface you suffer first from heat, lice, and malnourishment, then become prisoners in a Gulag. In Chapter 1 you are in turn infected by a virus, racked by drug addiction, and addled by puffiness of the genome.

Yet your prison without coherent design continues to imprison you. How can this be, if it has no strong places? The rational prisoner exploits the weak places, creates order from chaos: instead, collectives like the FSF vindicate their jailers by building cells almost compatible with the existing ones, albeit with more features. The journalist with three undergraduate degrees from MIT, the researcher at Microsoft, and the senior scientist at Apple might volunteer a few words about the regulations of the prisons to which they have been transferred.

Your sense of the possible is in no sense pure: sometimes you want the same thing you have, but wish you had done it yourselves; other times you want something different, but can't seem to get people to use it; sometimes one wonders why you just don't shut up and tell people to buy a PC with Windows or a Mac. No Gulag or lice, just a future whose intellectual tone and interaction style is set by Sonic the Hedgehog. You claim to seek progress, but you succeed mainly in whining.

Here is my metaphor: your book is a pudding stuffed with apposite observations, many well-conceived. Like excrement, it contains enough undigested nuggets of nutrition to sustain life for some. But it is not a tasty pie: it reeks too much of contempt and of envy.

Bon appetit!

[Jun 02, 2021] New technology is not always a sign of progress by Maija Palme

Sep 7, 2016 | FT.com

Is the computer the least efficient machine humans have ever built? Technology journalists often unthinkingly pick up a narrative of progress in which each generation of technology is an improvement on the last, from abacus to iPhone. We marvel that we carry more computing power in our pockets than was used to put a man on the moon in 1969.

What we have at our fingertips is smaller, faster and more complicated than before. But is it necessarily better?

In his new book The Bleeding Edge, Bob Hughes, an activist and former academic, takes a refreshingly critical look at assumptions about technology - the subtitle is "Why technology turns toxic in an unequal world".

... ... ...

In the computer age, we are similarly spun into cycles of obsolescence and upgrades that benefit us little but which are difficult to opt out of. Anyone still mourning the loss of their BlackBerry to an iPhone may feel a stab of sympathy when they read Mr Hughes.

The economics of microchip production - where factories must operate at enormous scale and only the very latest products make a profit - dictates a relentless pace of device upgrades, regardless of what consumers really need.

Understanding this helps to explain the mysterious "productivity paradox" - the fact that all the new computer and mobile technology of the past 20 years has not led to an increase in productivity. Employees must constantly learn new ways to perform the same task over and over again as technology changes. However, this does not necessarily increase the speed at which jobs are done.

... ... ...

The Bleeding Edge, by Bob Hughes, New Internationalist Publications, RRP £10.99, 336 pages

reader 7 days ago

Social media is an example of technology creating new ways to waste time.

sourcex 8 days ago

The only interesting thing in this article worth contemplating is the fact that even as technology progresses, the amount of time and productivity of work remain more or less constant.

Spaven 13 days ago

My current work computer is 8000 times as powerful as the one I had twenty years ago, but both took 6 minutes to start up in the morning.

ZmeiGorynych Sep 7, 2016

Sounds rather incoherent - firstly, who says only one process can happen at a time? Every phone I've owned for years has been multi-core, and servers can have dozens of processors.

Secondly, the reason analogue computers aren't widely used is they're very difficult to re-program, so can be very good at specific tasks but terrible at general-purpose computing.

Overall sounds like a book written by someone who doesn't really understand half of the technology he's writing about, but doesn't let that stand in the way of the points he wants to make.

A critique of the social impacts of technological progress over the last 20 years might have been more interesting.

[Nov 28, 2017] Sometimes the Old Ways Are Best by Brian Kernighan

Notable quotes:
"... Sometimes the old ways are best, and they're certainly worth knowing well ..."
Nov 01, 2008 | IEEE Software, pp.18-19

As I write this column, I'm in the middle of two summer projects; with luck, they'll both be finished by the time you read it.

... ... ...

There has surely been much progress in tools over the 25 years that IEEE Software has been around, and I wouldn't want to go back in time.

But the tools I use today are mostly the same old ones -- grep, diff, sort, awk, and friends. This might well mean that I'm a dinosaur stuck in the past.

On the other hand, when it comes to doing simple things quickly, I can often have the job done while experts are still waiting for their IDE to start up. Sometimes the old ways are best, and they're certainly worth knowing well.
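To give one concrete instance of what Kernighan means, here is the sort of throwaway job the old tools dispatch in a line or two; the file name and layout (a name,score CSV) are assumed for the sake of the example.

    # Five highest scores from a name,score CSV, then the total.
    sort -t, -k2,2 -rn scores.csv | head -5
    awk -F, '{ total += $2 } END { print total }' scores.csv

By the time an IDE has finished indexing the project, both answers are already on the screen.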
