
AWK one liners


Introduction

One great blunder of the Unix developers was that they never integrated AWK into the shell. The team that developed AWK was technically head and shoulders above the shell developers before David Korn; Korn was a talented developer, but he came too late to change the situation. Later, the integration of shell and AWK capabilities was attempted in Perl, which can be used instead of AWK in some cases (see Perl One Liners). But while AWK is simple and elegant, Perl suffers from overcomplexity.

Simple AWK programs enclosed in single quotes can be passed directly on the command line. For example:

awk 'BEGIN { FS = ":" } { print $1 | "sort" }' /etc/passwd
This program
 BEGIN { FS = ":" } { print $1 | "sort" }

prints a sorted list of the login names of all users from /etc/passwd.

If no input file is specified, AWK reads from stdin; if no output file is specified, it writes to stdout.

AWK pioneered the introduction of regular expressions to Unix (see AWK Regular expressions). But the pattern-matching capabilities of AWK are not limited to regular expressions. Patterns can be

  1. regular expressions enclosed by slashes, e.g.: /[a-z]+/
  2. relational expressions, e.g.: $3!=$4
  3. pattern-matching expressions, e.g.: $1 !~ /string/
  4. or any combination of these, e.g.:
    (substr($0,5,2)=="xx" && $3 ~ /nasty/ ) || /^The/ || /mean$/ || $4>2

(This last example selects lines where the two characters starting in the fifth column are xx and the third field matches nasty, plus lines beginning with The, plus lines ending with mean, plus lines in which the fourth field is greater than two.)
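The combined pattern above can be tried directly on a few made-up input lines (a sketch; the sample data is hypothetical, and the "ends with mean" test is written /mean$/):

```shell
# Lines 1-3 each satisfy one branch of the combined pattern; line 4 satisfies none.
printf '%s\n' 'abcdxx 1 nasty 5' 'The quick fox' 'so mean' 'plain line 1 2' |
awk '(substr($0,5,2)=="xx" && $3 ~ /nasty/) || /^The/ || /mean$/ || $4>2'
```

The first input line is selected because columns 5-6 are "xx" and its third field matches nasty; the fourth line matches no branch ($4 is 2, which is not greater than 2) and is not printed.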

AWK procedures (actions) are enclosed in { curly brackets }. A procedure can assign variables or arrays, print output, and perform flow control.
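A minimal sketch of variable and array assignment inside an action (the input lines here are hypothetical):

```shell
# Tally occurrences of the first field in an array and keep a running sum of the second.
printf '%s\n' 'apple 3' 'pear 5' 'apple 2' |
awk '{ count[$1]++; total += $2 }
     END { print "apple", count["apple"]; print "pear", count["pear"]; print "total", total }'
```

Uninitialized variables and array elements start out as 0 (or the empty string), so `count[$1]++` and `total += $2` need no explicit initialization.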

AWK operators by order of (decreasing) precedence

When creating complex expressions in AWK you need to understand the precedence of operators, or use brackets (the safer and simpler way ;-). Don't try to save on brackets!

Here is the precedence of operators in AWK:

 Field reference:			$
  Increment or decrement:		++ --
  Exponentiate:				^
  Multiply, divide, modulus:		* / %
  Add, subtract:			+ -
  Concatenation:			(blank space)
  Relational:				< <= > >= != ==
  Match regular expression:		~ !~
  Logical:				&& || 
  C-style assignment:			= += -= *= /= %= ^=
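One consequence of this table worth knowing: concatenation binds tighter than the relational operators but looser than arithmetic, so an un-bracketed mixed expression can surprise you. A small sketch:

```shell
awk 'BEGIN {
  print 1 " " 2 + 3               # arithmetic first, then concatenation: "1 5"
  x = 10
  print x " items" == "10 items"  # concatenation first, then ==: prints 1 (true)
}'
```

When in doubt, bracket the sub-expressions explicitly; the result is the same and the intent is unambiguous.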

AWK arithmetic functions

AWK has most of the arithmetic functions available in C, for example exp, int, log, and sqrt.
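A quick sketch of these functions in a BEGIN block (numeric output is rendered with the default OFMT, %.6g):

```shell
awk 'BEGIN {
  print sqrt(2)      # square root: 1.41421
  print int(3.9)     # truncates toward zero: 3
  print exp(1)       # e: 2.71828
  print log(exp(1))  # natural logarithm: 1
}'
```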

AWK string functions

Print output

 Output can be unformatted (print) or formatted (printf):
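The difference in brief: print inserts OFS between its arguments and appends ORS, while printf gives explicit C-style control over the format. A sketch:

```shell
awk 'BEGIN {
  print "pi is", 3.14159           # OFS (a space) between items, ORS (newline) at the end
  printf "pi is %.2f\n", 3.14159   # explicit format; no automatic newline
}'
```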

Perform flow-control

Do-loops:

for   ( [initial expression]; 
		[test expression]; 
		[increment counter expression] ) 
	{ commands } 

example:

for (i = 1; i <= 20; i++)
does 20 iterations
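Made runnable, the loop above can for instance sum the integers 1 through 20:

```shell
awk 'BEGIN {
  sum = 0
  for (i = 1; i <= 20; i++)
    sum += i
  print sum   # 20*21/2 = 210
}'
```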

If-Then-Else:

if (condition) 
		{ commands1 } 
	[ else 
		{ commands2 } ] 

does commands1 if condition is true; commands2 (or nothing) if false; condition is any expression with relational or pattern-match operators.
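A runnable sketch of the construct (the input lines and the 10-character threshold are arbitrary):

```shell
# Classify each input line by its length.
printf '%s\n' 'hi' 'a much longer line' |
awk '{ if (length($0) > 10)
         print NR ": long"
       else
         print NR ": short" }'
```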

Other flow-control commands include while, do ... while, break, continue, next, and exit.
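A sketch combining while, next, and exit (the input lines are made up): skip comment lines, stop at a line reading "END", and otherwise print each field on its own line.

```shell
printf '%s\n' 'a b' '# note' 'c d' 'END' 'ignored' |
awk '/^#/    { next }     # skip this line, read the next one
     /^END$/ { exit }     # stop processing input entirely
     { i = 1
       while (i <= NF) { print $i; i++ } }'
```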

The following command runs a simple awk program that searches the input file BBS-list for the character string `foo' (a grouping of characters is usually called a string; the term string is based on similar usage in English, such as "a string of pearls," or "a string of cars in a train"):

awk '/foo/ { print $0 }' BBS-list

When lines containing `foo' are found, they are printed because `print $0' means print the current line. (Just `print' by itself means the same thing, so we could have written that instead.)

You will notice that slashes (`/') surround the string `foo' in the awk  program. The slashes indicate that `foo' is the pattern to search for. This type of pattern is called a regular expression, which is covered in more detail later (see Regexp). The pattern is allowed to match parts of words. There are single quotes around the awk  program so that the shell won't interpret any of it as special shell characters.

Here is what this program prints:

 $ awk '/foo/ { print $0 }' BBS-list
 fooey        555-1234     2400/1200/300     B
 foot         555-6699     1200/300          B
 macfoo       555-6480     1200/300          A
 sabafoo      555-2127     1200/300          C

In an awk  rule, either the pattern or the action can be omitted, but not both. If the pattern is omitted, then the action is performed for every input line. If the action is omitted, the default action is to print all lines that match the pattern.

Thus, we could leave out the action (the print  statement and the curly braces) in the previous example and the result would be the same: all lines matching the pattern `foo' are printed. By comparison, omitting the print  statement but retaining the curly braces makes an empty action that does nothing (i.e., no lines are printed).
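Both halves of the rule can be seen side by side on hypothetical input:

```shell
# Pattern with no action: the default action prints each matching line.
printf '%s\n' 'foo one' 'bar two' | awk '/foo/'
# Action with no pattern: the action runs for every input line.
printf '%s\n' 'foo one' 'bar two' | awk '{ print $2 }'
```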

Many practical awk programs are just a line or two. Following is a collection of useful, short programs to get you started. Some of these programs contain constructs that haven't been covered yet. (The description of each program will give you a good idea of what is going on, but please read the rest of the page to become an awk expert!) Most of the examples use a data file named `data'. This is just a placeholder; if you use these programs yourself, substitute your own file names for `data'. For future reference, note that there is often more than one way to do things in awk. At some point, you may want to look back at these examples and see if you can come up with different ways to do the same things shown here:



The AWK Manual - Useful "One-liners"

Useful awk programs are often short, just a line or two. Here is a collection of useful, short programs to get you started. Some of these programs contain constructs that haven't been covered yet. The description of the program will give you a good idea of what is going on, but please read the rest of the manual to become an awk expert!

awk '{ if (NF > max) max = NF }
END { print max }'
This program prints the maximum number of fields on any input line.
awk 'length($0) > 80'
This program prints every line longer than 80 characters. The sole rule has a relational expression as its pattern, and has no action (so the default action, printing the record, is used).
awk 'NF > 0'
This program prints every line that has at least one field. This is an easy way to delete blank lines from a file (or rather, to create a new file similar to the old file but from which the blank lines have been deleted).
awk '{ if (NF > 0) print }'
This program also prints every line that has at least one field. Here we allow the rule to match every line, then decide in the action whether to print.
awk 'BEGIN { for (i = 1; i <= 7; i++)
print int(101 * rand()) }'
This program prints 7 random numbers from 0 to 100, inclusive.
ls -l files | awk '{ x += $4 } ; END { print "total bytes: " x }'
This program prints the total number of bytes used by files.
expand file | awk '{ if (x < length()) x = length() }
END { print "maximum line length is " x }'
This program prints the maximum line length of file. The input is piped through the expand program to change tabs into spaces, so the widths compared are actually the right-margin columns.
awk 'BEGIN { FS = ":" } { print $1 | "sort" }' /etc/passwd
This program prints a sorted list of the login names of all users.
awk '{ nlines++ }
END { print nlines }'
This program counts lines in a file.
awk 'END { print NR }'
This program also counts lines in a file, but lets awk do the work.
awk '{ print NR, $0 }'
This program adds line numbers to all its input files, similar to `cat -n'.

HANDY ONE-LINERS FOR AWK, 22 July 2003

compiled by Eric Pement <pemente@northpark.edu>, version 0.22. The latest version of this file is usually at:

http://www.student.northpark.edu/pemente/awk/awk1line.txt

USAGE:

Most of my experience comes from versions of GNU awk (gawk) compiled for Win32. Note in particular that DJGPP compilations permit the awk script to follow Unix quoting syntax '/like/ {"this"}'. However, the user must know that single quotes under DOS/Windows do not protect the redirection arrows (<, >) nor do they protect pipes (|). Both are special symbols for the DOS/CMD command shell and their special meaning is ignored only if they are placed within "double quotes." Likewise, DOS/Win users must remember that the percent sign (%) is used to mark DOS/Win environment variables, so it must be doubled (%%) to yield a single percent sign visible to awk.

If I am sure that a script will NOT need to be quoted in Unix, DOS, or CMD, then I normally omit the quote marks. If an example is peculiar to GNU awk, the command 'gawk' will be used. Please notify me if you find errors or new commands to add to this list (total length under 65 characters). I usually try to put the shortest script first.

FILE SPACING:

 # double space a file
 awk '1;{print ""}'
 awk 'BEGIN{ORS="\n\n"};1'

 # double space a file which already has blank lines in it. Output file
 # should contain no more than one blank line between lines of text.
 # NOTE: On Unix systems, DOS lines which have only CRLF (\r\n) are
 # often treated as non-blank, and thus 'NF' alone will return TRUE.
 awk 'NF{print $0 "\n"}'

 # triple space a file
 awk '1;{print "\n"}'

NUMBERING AND CALCULATIONS:

 # precede each line by its line number FOR THAT FILE (left alignment).
 # Using a tab (\t) instead of space will preserve margins.
 awk '{print FNR "\t" $0}' files*

 # precede each line by its line number FOR ALL FILES TOGETHER, with tab.
 awk '{print NR "\t" $0}' files*

 # number each line of a file (number on left, right-aligned)
 # Double the percent signs if typing from the DOS command prompt.
 awk '{printf("%5d : %s\n", NR,$0)}'

 # number each line of file, but only print numbers if line is not blank
 # Remember caveats about Unix treatment of \r (mentioned above)
 awk 'NF{$0=++a " :" $0};{print}'
 awk '{print (NF? ++a " :" :"") $0}'

 # count lines (emulates "wc -l")
 awk 'END{print NR}'

 # print the sums of the fields of every line
 awk '{s=0; for (i=1; i<=NF; i++) s=s+$i; print s}'

 # add all fields in all lines and print the sum
 awk '{for (i=1; i<=NF; i++) s=s+$i}; END{print s}'

 # print every line after replacing each field with its absolute value
 awk '{for (i=1; i<=NF; i++) if ($i < 0) $i = -$i; print }'
 awk '{for (i=1; i<=NF; i++) $i = ($i < 0) ? -$i : $i; print }'

 # print the total number of fields ("words") in all lines
 awk '{ total = total + NF }; END {print total}' file

 # print the total number of lines that contain "Beth"
 awk '/Beth/{n++}; END {print n+0}' file

 # print the largest first field and the line that contains it
 # Intended for finding the longest string in field #1
 awk '$1 > max {max=$1; maxline=$0}; END{ print max, maxline}'

 # print the number of fields in each line, followed by the line
 awk '{ print NF ":" $0 } '

 # print the last field of each line
 awk '{ print $NF }'

 # print the last field of the last line
 awk '{ field = $NF }; END{ print field }'

 # print every line with more than 4 fields
 awk 'NF > 4'

 # print every line where the value of the last field is > 4
 awk '$NF > 4'

TEXT CONVERSION AND SUBSTITUTION:

 # IN UNIX ENVIRONMENT: convert DOS newlines (CR/LF) to Unix format
 awk '{sub(/\r$/,"");print}'   # assumes EACH line ends with Ctrl-M

 # IN UNIX ENVIRONMENT: convert Unix newlines (LF) to DOS format
 awk '{sub(/$/,"\r");print}'

 # IN DOS ENVIRONMENT: convert Unix newlines (LF) to DOS format
 awk 1

 # IN DOS ENVIRONMENT: convert DOS newlines (CR/LF) to Unix format
 # Cannot be done with DOS versions of awk, other than gawk:
 gawk -v BINMODE="w" '1' infile >outfile

 # Use "tr" instead.
 tr -d \r <infile >outfile            # GNU tr version 1.22 or higher

 # delete leading whitespace (spaces, tabs) from front of each line
 # aligns all text flush left
 awk '{sub(/^[ \t]+/, ""); print}'

 # delete trailing whitespace (spaces, tabs) from end of each line
 awk '{sub(/[ \t]+$/, "");print}'

 # delete BOTH leading and trailing whitespace from each line
 awk '{gsub(/^[ \t]+|[ \t]+$/,"");print}'
 awk '{$1=$1;print}'           # also removes extra space between fields

 # insert 5 blank spaces at beginning of each line (make page offset)
 awk '{sub(/^/, "     ");print}'

 # align all text flush right on a 79-column width
 awk '{printf "%79s\n", $0}' file*

 # center all text on a 79-character width
 awk '{l=length();s=int((79-l)/2); printf "%"(s+l)"s\n",$0}' file*

 # substitute (find and replace) "foo" with "bar" on each line
 awk '{sub(/foo/,"bar");print}'           # replaces only 1st instance
 gawk '{$0=gensub(/foo/,"bar",4);print}'  # replaces only 4th instance
 awk '{gsub(/foo/,"bar");print}'          # replaces ALL instances in a line

 # substitute "foo" with "bar" ONLY for lines which contain "baz"
 awk '/baz/{gsub(/foo/, "bar")};{print}'

 # substitute "foo" with "bar" EXCEPT for lines which contain "baz"
 awk '!/baz/{gsub(/foo/, "bar")};{print}'

 # change "scarlet" or "ruby" or "puce" to "red"
 awk '{gsub(/scarlet|ruby|puce/, "red"); print}'

 # reverse order of lines (emulates "tac")
 awk '{a[i++]=$0} END {for (j=i-1; j>=0;) print a[j--] }' file*

 # if a line ends with a backslash, append the next line to it
 # (fails if there are multiple lines ending with backslash...)
 awk '/\\$/ {sub(/\\$/,""); getline t; print $0 t; next}; 1' file*

 # print and sort the login names of all users
 awk -F ":" '{ print $1 | "sort" }' /etc/passwd

 # print the first 2 fields, in opposite order, of every line
 awk '{print $2, $1}' file

 # switch the first 2 fields of every line
 awk '{temp = $1; $1 = $2; $2 = temp}' file

 # print every line, deleting the second field of that line
 awk '{ $2 = ""; print }'

 # print in reverse order the fields of every line
 awk '{for (i=NF; i>0; i--) printf("%s ",$i);printf ("\n")}' file

 # remove duplicate, consecutive lines (emulates "uniq")
 awk 'a !~ $0; {a=$0}'

 # remove duplicate, nonconsecutive lines
 awk '! a[$0]++'                     # most concise script
 awk '!($0 in a) {a[$0];print}'      # most efficient script

 # concatenate every 5 lines of input, using a comma separator
 # between fields
 awk 'ORS=NR%5?",":"\n"' file

SELECTIVE PRINTING OF CERTAIN LINES:

 # print first 10 lines of file (emulates behavior of "head")
 awk 'NR < 11'

 # print first line of file (emulates "head -1")
 awk 'NR>1{exit};1'

  # print the last 2 lines of a file (emulates "tail -2")
 awk '{y=x "\n" $0; x=$0};END{print y}'

 # print the last line of a file (emulates "tail -1")
 awk 'END{print}'

 # print only lines which match regular expression (emulates "grep")
 awk '/regex/'

 # print only lines which do NOT match regex (emulates "grep -v")
 awk '!/regex/'

 # print the line immediately before a regex, but not the line
 # containing the regex
 awk '/regex/{print x};{x=$0}'
 awk '/regex/{print (x=="" ? "match on line 1" : x)};{x=$0}'

 # print the line immediately after a regex, but not the line
 # containing the regex
 awk '/regex/{getline;print}'

 # print lines which contain AAA and BBB and CCC (in any order)
 awk '/AAA/ && /BBB/ && /CCC/'

 # grep for AAA and BBB and CCC (in that order)
 awk '/AAA.*BBB.*CCC/'

 # print only lines of 65 characters or longer
 awk 'length > 64'

 # print only lines of less than 65 characters
 awk 'length < 65'

 # print section of file from regular expression to end of file
 awk '/regex/,0'
 awk '/regex/,EOF'

 # print section of file based on line numbers (lines 8-12, inclusive)
 awk 'NR==8,NR==12'

 # print line number 52
 awk 'NR==52'
 awk 'NR==52 {print;exit}'          # more efficient on large files

 # print section of file between two regular expressions (inclusive)
 awk '/Iowa/,/Montana/'             # case sensitive

SELECTIVE DELETION OF CERTAIN LINES:

 # delete ALL blank lines from a file (same as "grep '.' ")
 awk NF
 awk '/./'

CREDITS AND THANKS:

Special thanks to Peter S. Tillier for helping me with the first release of this FAQ file.

For additional syntax instructions, including the way to apply editing commands from a disk file instead of the command line, consult:

"sed & awk, 2nd Edition," by Dale Dougherty and Arnold Robbins O'Reilly, 1997

"UNIX Text Processing," by Dale Dougherty and Tim O'Reilly Hayden Books, 1987

"Effective awk Programming, 3rd Edition." by Arnold Robbins O'Reilly, 2001

To fully exploit the power of awk, one must understand "regular expressions." For detailed discussion of regular expressions, see "Mastering Regular Expressions, 2d edition" by Jeffrey Friedl (O'Reilly, 2002).

The manual ("man") pages on Unix systems may be helpful (try "man awk", "man nawk", "man regexp", or the section on regular expressions in "man ed"), but man pages are notoriously difficult. They are not written to teach awk use or regexps to first-time users, but as a reference text for those already acquainted with these tools.

USE OF '\t' IN awk SCRIPTS: For clarity in documentation, we have used the expression '\t' to indicate a tab character (0x09) in the scripts. All versions of awk, even the UNIX System 7 version, should recognize the '\t' abbreviation.

AWK One-Liners

Although awk can be used to write programs of some complexity, many useful programs are not complicated. Here is a collection of short programs that you might find handy and/or instructive:
  1. Print the total number of input lines:
    END { print NR }
  2. Print the tenth input line:
    NR == 10
  3. Print the last field of every input line:
    { print $NF }
  4. Print the last field of the last input line:
    { field = $NF}
    END { print field } 
  5. Print every input line with more than 4 fields:
    NF > 4
  6. Print every input line in which the last field is more than 4:
    $NF > 4
  7. Print the total number of fields in all input lines:
    { nf = nf + NF }
    END { print nf }      
  8. Print the total number of lines that contain Beth:
    /Beth/ { nlines = nlines + 1 }
    END { print nlines }      
  9. Print the largest first field and the line that contains it (assumes some $1 is positive):
    $1 > max { max = $1 ; maxline = $0 }
    END { print max, maxline }
           
  10. Print every line that has at least one field:
    NF > 0
  11. Print every line longer than 80 characters:
    length($0) > 80
  12. Print the number of fields in every line, followed by the line itself:
    { print NF, $0 }
  13. Print the first two fields, in opposite order, of every line:
    { print $2, $1 }
  14. Exchange the first two fields of every line and then print the line:
    { temp = $1 ; $1 = $2 ; $2 = temp ; print }
  15. Print every line with the first field replaced by the line number:
    { $1 = NR ; print }
  16. Print every line after erasing the second field:
    { $2 = ""; print }
  17. Print in reverse order the fields of every line:
    { for (i=NF ; i>0 ; i=i-1) printf( "%s ", $i)
           printf("\n")
    }       
  18. Print the sums of the fields of every line:
    { sum = 0
           for ( i=1 ; i<=NF ; i=i+1) sum = sum + $i
           print sum
    }       
  19. Add up all fields in all lines and print the sum:
    { for ( i=1 ; i<=NF ; i=i+1 ) sum = sum + $i}
    END { print sum }       
           
  20. Print every line after replacing each field by its absolute value:
    { for (i=1 ; i<=NF ; i=i+1) if ($i<0) $i=-$i
           print
    }       
Source: The AWK Programming Language






Copyright © 1996-2018 by Dr. Nikolai Bezroukov. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.

 


Last modified: September 12, 2017