May the source be with you, but remember the KISS principle ;-)


Bash as an Enterprise-level Shell


See also

Best Shell Books
Recommended Links
Unix shell history
Annotated List of Bash Enhancements
Bash on HP-UX
Bash on AIX
Arithmetic expressions
if statements in shell
Case statement in shell
String Operations
Bash select statement
Readline and inputrc
Pushd, popd and dirs
Comparison operators in shell
Control structures
Bash Variables
Pipes in Loops
Double square bracket conditionals
Shell Conditional Operators Caveats
Dot files
KSH Substitutions
Advanced navigation
Command Completion
readline vi mode
exec command
sort command
tr command
Bash history and bang commands
Command history reuse
IFS
cdpath
VIM Tips
Scripts collections
Strange Files Deletion
Shell Input and Output Redirection
Portability
Shell Prompts
Pretty Printing
BASH Debugging
Debugging bash Tips and Tricks
Bash Development History
Humor
Random Findings
Etc

Bash is the standard shell on Linux, and with version 3.2 or later available on all enterprise platforms and installed on Solaris and AIX, it makes sense to make it the standard interactive shell. Only HP-UX does not include bash by default, but HP does provide a depot package for bash 4.0. Bash 3.2 and later has important enhancements that make it a better shell (see Annotated List of Bash Enhancements), although scripts should generally be limited to small or medium size, as Perl represents a much better scripting environment than bash and is installed by default on all platforms, including HP-UX and AIX.

While bash had a humble start, with a much weaker designer than ksh93's, and was at times extremely buggy, the gap narrowed in version 3.2 to the extent that bash's better command-line user friendliness makes it more attractive, especially as an interactive shell, than the ksh and C shell supplied with the OS. Most ksh93 innovations are now present in some form in bash 3.2. As bash is the standard shell on Linux and is installed by default on Solaris, its status as an enterprise shell is almost as strong as ksh93's (which is mostly present in old, weaker versions). Current bash has the best debugger and from this point of view represents the best shell. But portable scripts are still probably better written for ksh88 or the POSIX shell, which is the lowest common denominator available on all Unixes. To write scripts for the Bourne shell now is extremely stupid and wasteful.

Bash 3.2 and later is one of the most portable advanced shells around (ksh93 and zsh are still strong competition; ksh93 is definitely more reliable for scripts and does not contain such design blunders as the last stage of a pipe running in a subshell instead of the invoking shell).
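The pipe blunder mentioned above is easy to demonstrate (my sketch: in bash the while loop below runs in a subshell, so the counter increments are lost; ksh93 runs the last pipeline stage in the invoking shell):

```shell
#!/bin/bash
count=0
printf 'a\nb\nc\n' | while read line; do
    count=$((count + 1))    # increments only inside the subshell
done
echo "$count"               # bash prints 0; ksh93 prints 3
```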

Bash-related books dominate shell-related publications, and as such the level of know-how for bash is higher than for other shells (see Best Shell Books). The advantage of bash for a large enterprise environment is that it comes by default with Linux, Solaris and AIX (unfortunately in pretty different versions). Only HP-UX does not have bash installed by default. It is also the best portable interactive shell, much closer to tcsh than any competition.

Still, you need some effort to make it the default shell. Unfortunately the default shell for Solaris is the "Bourne shell", /usr/bin/sh.

The Bourne shell is a pretty weak, outdated shell, and the attempt to base shell scripting portability on this outdated shell is a serious strategic error (bash should probably be used instead). It is still used as the root shell in Solaris, but that is due to Solaris legacy, not because it offers anything but disadvantages; claims that it somehow increases the security of root because it is statically linked are weak and the argumentation is open for discussion. All in all, usage of the Bourne shell as the default root shell in Solaris might be considered a blunder: system administrators definitely need a better shell.

Usage of the Bourne shell as the default shell might slightly increase the chances of recovery in case the /usr partition is damaged, but that is a pretty serious case and usually means serious trouble with other partitions on the disk anyway (unless this is the case where the Solaris link /bin -> usr/bin is destroyed, but such cases are simple to fight by referencing the shell as /usr/bin/ksh in /etc/passwd). If there is serious trouble, then booting from a CD and mounting the damaged volume is always a good idea, and in that case it does not matter what shell root is using; you can change it anyway.

Bash 3.2 is reasonably easy to build from source. Get the archive and unpack it in your home directory. This will create a directory called bash-3.2 in your home directory. If you do not have the gunzip utility, you can obtain it in the same way you obtained bash, or simply use gzip -d instead. The archive contains all of the source code needed to compile bash and a large amount of documentation and examples; the latter have their own value.
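The build itself follows the usual GNU recipe (a sketch; the install prefix is my own choice, and paths assume the archive sits in your home directory):

```shell
cd ~
gzip -d bash-3.2.tar.gz        # or: gunzip bash-3.2.tar.gz
tar -xf bash-3.2.tar
cd bash-3.2
./configure --prefix=/usr/local
make
make tests                     # optional self-test suite
su -c 'make install'           # installs as /usr/local/bin/bash
```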

The bash archive contains a main directory and a set of files and subdirectories. Among the first files you should examine are:

CHANGES A comprehensive list of bug fixes and new features since the last version

COPYING The GNU Copyleft for bash

MANIFEST A list of all the files and directories in the archive

NEWS A list of new features since the last version

README A short introduction and instructions for compiling bash

doc Directory with information related to bash in various formats

examples Directory with examples of startup files, scripts, and functions

The doc directory contains a few articles that are worth reading. You can print the man page for reference with the command nroff -man bash.1 | more. It is convenient to have a hardcopy so you can write notes all over it. Also valuable are FAQ, a frequently asked questions document with answers; readline.3, the manual entry for the readline facility; and an article about the shell that appeared in Linux Journal, written by the current bash maintainer Chet Ramey.

The examples directory is especially important and is well worth exploring (after you've finished reading this book, of course). It includes sample code, scripts, functions, and startup files. See Examples shipped with bash 3.2 and newer.


Some interesting features of bash include:

Due to this, the bash shell is gradually gaining ground as the preferred interactive shell for Solaris and other enterprise-class Unixes.

Older versions of bash (the 2.x series) are obsolete and should not be used. The recommended version is 3.2 patch level 3 or above, as it is more compatible with ksh93 and supports many important enhancements introduced by ksh93.

Dr. Nikolai Bezroukov


Old News ;-)

[Feb 14, 2017] Ms Dos style aliases for linux

I think alias ipconfig='ifconfig' is really useful for people who work with Linux from a Windows PC desktop/laptop.
Feb 14, 2017 |
# MS-DOS / XP cmd like stuff
   alias edit=$VISUAL
   alias copy='cp'
   alias cls='clear'
   alias del='rm'
   alias dir='ls'
   alias md='mkdir'
   alias move='mv'
   alias rd='rmdir'
   alias ren='mv'
   alias ipconfig='ifconfig'

[Feb 04, 2017] Quickly find differences between two directories

You will be surprised, but GNU diff as used in Linux understands the situation when the two arguments are directories and behaves accordingly.
Feb 04, 2017 |

The diff command compares files line by line. It can also compare two directories:

# Compare two folders using diff ##
diff /etc /tmp/etc_old  
Rafal Matczak September 29, 2015, 7:36 am
§ Quickly find differences between two directories
And quicker:
 diff -y <(ls -l ${DIR1}) <(ls -l ${DIR2})  

[Feb 04, 2017] Use CDPATH to access frequent directories in bash - Mac OS X Hints

Feb 04, 2017 |
The variable CDPATH defines the search path for the cd command, so it serves much like a "home for directories". The danger is in creating too complex a CDPATH; often a single directory works best. For example, export CDPATH=/srv/www/public_html. Now, instead of typing cd /srv/www/public_html/CSS I can simply type: cd CSS
Use CDPATH to access frequent directories in bash UNIX
Mar 21, '05 10:01:00AM • Contributed by: jonbauman

I often find myself wanting to cd to the various directories beneath my home directory (i.e. ~/Library, ~/Music, etc.), but being lazy, I find it painful to have to type the ~/ if I'm not in my home directory already. Enter CDPATH, as described in man bash:

The search path for the cd command. This is a colon-separated list of directories in which the shell looks for destination directories specified by the cd command. A sample value is ".:~:/usr".
Personally, I use the following command (either on the command line for use in just that session, or in .bash_profile for permanent use):
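The command itself appears to have been lost from the page; based on the examples that follow, it was presumably something like this (the exact directory list is my assumption):

```shell
# add the current directory and the home directory to cd's search path
export CDPATH=".:$HOME"
```

With that in .bash_profile, cd Documents from anywhere resolves to ~/Documents.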

This way, no matter where I am in the directory tree, I can just cd dirname , and it will take me to the directory that is a subdirectory of any of the ones in the list. For example:
$ cd
$ cd Documents 
$ cd Pictures
$ cd Preferences

[ robg adds: No, this isn't some deeply buried treasure of OS X, but I'd never heard of the CDPATH variable, so I'm assuming it will be of interest to some other readers as well.]

cdable_vars is also nice
Authored by: clh on Mar 21, '05 08:16:26PM

Check out the bash command shopt -s cdable_vars

From the man bash page:


If set, an argument to the cd builtin command that is not a directory is assumed to be the name of a variable whose value is the directory to change to.

With this set, if I give the following bash command:

export d="/Users/chap/Desktop"

I can then simply type

cd d

to change to my Desktop directory.

I put the shopt command and the various export commands in my .bashrc file.

[Feb 04, 2017] Copy file into multiple directories

Feb 04, 2017 |
Instead of running:
cp /path/to/file /usr/dir1
cp /path/to/file /var/dir2
cp /path/to/file /nas/dir3

Run the following command to copy file into multiple dirs:

echo /usr/dir1 /var/dir2 /nas/dir3 | xargs -n 1 cp -v /path/to/file

[Feb 04, 2017] 20 Unix Command Line Tricks – Part I

Feb 04, 2017 |
Locking a directory

For privacy of my data I wanted to lock down /downloads on my file server. So I ran:


chmod 0000 /downloads

The root user still has access, while for other users the ls and cd commands will not work. To go back:


chmod 0755 /downloads

Clear gibberish all over the screen

Just type:


reset

Becoming human

Pass the -h or -H (and other options) command line option to GNU or BSD utilities to get the output of commands like ls, df, du in human-readable format:

# print sizes in human readable format (e.g., 1K 234M 2G)
ls -lh
df -h
df -k
# show output in bytes, KB, MB, or GB
free -b
free -k
free -m
free -g
# print sizes in human readable format (e.g., 1K 234M 2G)
du -h
# get file system perms in human readable format
stat -c %A /boot
# compare human readable numbers
sort -h file
# display the CPU information in human readable format on a Linux
lscpu
lscpu -e
lscpu -e=cpu,node
# Show the size of each file but in a more human readable way
tree -h
tree -h /boot

Show information about known users in the Linux based system

Just type:

## linux version ##
lslogins

## BSD version ##
logins

Sample outputs:

  0 root             0        0   22:37:59 root
  1 bin              0        1            bin
  2 daemon           0        1            daemon
  3 adm              0        1            adm
  4 lp               0        1            lp
  5 sync             0        1            sync
  6 shutdown         0        1 2014-Dec17 shutdown
  7 halt             0        1            halt
  8 mail             0        1            mail
 10 uucp             0        1            uucp
 11 operator         0        1            operator
 12 games            0        1            games
 13 gopher           0        1            gopher
 14 ftp              0        1            FTP User
 27 mysql            0        1            MySQL Server
 38 ntp              0        1            
 48 apache           0        1            Apache
 68 haldaemon        0        1            HAL daemon
 69 vcsa             0        1            virtual console memory owner
 72 tcpdump          0        1            
 74 sshd             0        1            Privilege-separated SSH
 81 dbus             0        1            System message bus
 89 postfix          0        1            
 99 nobody           0        1            Nobody
173 abrt             0        1            
497 vnstat           0        1            vnStat user
498 nginx            0        1            nginx user
499 saslauth         0        1            "Saslauthd user"
Confused by top command output?

Seriously, you need to try out htop instead of top:


sudo htop

Want to run the same command again?

Just type !! . For example:

/myhome/dir/script/name arg1 arg2

# To run the same command again
!!

## To run the last command again as root user
sudo !!

The !! repeats the most recent command. To run the most recent command beginning with "foo":

!foo

# Run the most recent command beginning with "service" as root
sudo !service

Use !$ to run a command with the last argument of the most recent command:

# Edit nginx.conf
sudo vi /etc/nginx/nginx.conf

# Test nginx.conf for errors
/sbin/nginx -t -c /etc/nginx/nginx.conf

# After testing a file with "/sbin/nginx -t -c /etc/nginx/nginx.conf", you
# can edit file again with vi
sudo vi !$

Get a reminder when you have to leave

If you need a reminder to leave your terminal, type the following command:

leave +hhmm


  • hhmm – The time of day is in the form hhmm where hh is a time in hours (on a 12 or 24 hour clock), and mm are minutes. All times are converted to a 12 hour clock, and assumed to be in the next 12 hours.
Home sweet home

Want to go to the directory you were just in? Run:
cd -
Need to quickly return to your home directory? Enter:
cd
The variable CDPATH defines the search path for the cd command:


export CDPATH=/var/www:/nas10

Now, instead of typing cd /var/www/html/ I can simply type the following to cd into /var/www/html path:


cd html

Editing a file being viewed with less pager

To edit a file being viewed with the less pager, press v. The file opens for editing under $EDITOR:

less *.c
less foo.html
## Press v to edit file ##
## Quit from editor and you would return to the less pager again ##

List all files or directories on your system

To see all of the directories on your system, run:


find / -type d | less

# List all directories in your $HOME
find $HOME -type d -ls | less

To see all of the files, run:


find / -type f | less

# List all files in your $HOME
find $HOME -type f -ls | less

Build directory trees in a single command

You can create whole directory trees in a single mkdir command by passing the -p option:


mkdir -p /jail/{dev,bin,sbin,etc,usr,lib,lib64}
ls -l /jail/

Copy file into multiple directories

Instead of running:


cp /path/to/file /usr/dir1
cp /path/to/file /var/dir2
cp /path/to/file /nas/dir3

Run the following command to copy file into multiple dirs:


echo /usr/dir1 /var/dir2 /nas/dir3 | xargs -n 1 cp -v /path/to/file

Creating a shell function is left as an exercise for the reader
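One possible solution to that exercise (a sketch; the function name cpmulti is my own, not from the article):

```shell
# copy one file into every directory given after it
cpmulti() {
    file="$1"
    shift
    for dir in "$@"; do
        cp -v "$file" "$dir" || return 1
    done
}

# usage:
# cpmulti /path/to/file /usr/dir1 /var/dir2 /nas/dir3
```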

Quickly find differences between two directories

The diff command compares files line by line. It can also compare two directories:

# Compare two folders using diff ##
diff /etc /tmp/etc_old

[Feb 04, 2017] List all files or directories on your system

Feb 04, 2017 |
List all files or directories on your system

To see all of the directories on your system, run:


find / -type d | less

# List all directories in your $HOME
find $HOME -type d -ls | less

To see all of the files, run:


find / -type f | less

# List all files in your $HOME
find $HOME -type f -ls | less

[Nov 04, 2016] Coding Style rear-rear Wiki

Reading the rear sources is an interesting exercise. It really demonstrates an attempt to use a "reasonable" style of shell programming, and you can learn a lot.
Nov 04, 2016 |

Relax-and-Recover is written in Bash (at least bash version 3 is needed), a language that can be used in many styles. We want to make it easier for everybody to understand the Relax-and-Recover code and subsequently to contribute fixes and enhancements.

Here is a collection of coding hints that should help to get a more consistent code base.

Don't be afraid to contribute to Relax-and-Recover even if your contribution does not fully match all these coding hints. Currently large parts of the Relax-and-Recover code are not yet in compliance with these coding hints. This is an ongoing step-by-step process. Nevertheless, try to understand the idea behind these coding hints so that you know how to break them properly (i.e. "learn the rules so you know how to break them properly").

The overall idea behind these coding hints is:

Make yourself understood

Make yourself understood to enable others to fix and enhance your code properly as needed.

From this overall idea the following coding hints are derived.

For the fun of it an extreme example what coding style should be avoided:

#!/bin/bash for i in `seq 1 2 $((2*$1-1))`;do echo $((j+=i));done


Try to find out what that code is about - it does a useful thing.

Code must be easy to read

Code should be easy to understand

Do not only tell what the code does (i.e. the implementation details) but also explain what the intent behind it is (i.e. why) to make the code maintainable.

Here the initial example so that one can understand what it is about:

#!/bin/bash
# output the first N square numbers
# by summing up the first N odd numbers 1 3 ... 2*N-1
# where each nth partial sum is the nth square number
# see
# this way it is a little bit faster for big N compared to
# calculating each square number on its own via multiplication
N=$1
if ! [[ $N =~ ^[0-9]+$ ]] ; then
    echo "Input must be non-negative integer." 1>&2
    exit 1
fi
square_number=0
for odd_number in $( seq 1 2 $(( 2 * N - 1 )) ) ; do
    (( square_number += odd_number )) && echo $square_number
done

Now the intent behind is clear and now others can easily decide if that code is really the best way to do it and easily improve it if needed.

Try to care about possible errors

By default bash proceeds with the next command when something failed. Do not let your code blindly proceed in case of errors because that could make it hard to find the root cause of a failure when it errors out somewhere later at an unrelated place with a weird error message which could lead to false fixes that cure only a particular symptom but not the root cause.

Maintain Backward Compatibility

Implement adaptions and enhancements in a backward compatible way so that your changes do not cause regressions for others.

Dirty hacks welcome

When there are special issues on particular systems it is more important that the Relax-and-Recover code works than having nice looking clean code that sometimes fails. In such special cases any dirty hacks that intend to make it work everywhere are welcome. But for dirty hacks the above listed coding hints become mandatory rules:

For example a dirty hack like the following is perfectly acceptable:

# FIXME: Dirty hack to make it work
# on "FUBAR Linux version 666"
# where COMMAND sometimes inexplicably fails
# but always works after at most 3 attempts
# see
# Retries should have no bad effect on other systems
# where the first run of COMMAND works.
COMMAND || COMMAND || COMMAND || Error "COMMAND failed."

Character Encoding

Use only traditional (7-bit) ASCII characters. In particular do not use UTF-8 encoded multi-byte characters.

Text Layout

Variables

Functions

Relax-and-Recover functions

Use the available Relax-and-Recover functions when possible instead of re-implementing basic functionality again and again. The Relax-and-Recover functions are implemented in various lib/* files .

test, [, [[, ((

Paired parenthesis

See also

[Dec 06, 2015] Bash For Loop Examples

A very nice tutorial by Vivek Gite (created October 31, 2008, last updated June 24, 2015). His mistake is putting the new for loop syntax too far inside the tutorial. It should be emphasized, not hidden.
June 24, 2015 |

... ... ...

Bash v4.0+ has inbuilt support for setting up a step value using {START..END..INCREMENT} syntax:

echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
do
     echo "Welcome $i times"
done

Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times

... ... ...

Three-expression bash for loops syntax

This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop-test or condition (EXP2), and a counting expression (EXP3).

for (( EXP1; EXP2; EXP3 ))

A representative three-expression example in bash is as follows:

for (( c=1; c<=5; c++ ))
do
   echo "Welcome $c times"
done
... ... ...

Jadu Saikia, November 2, 2008, 3:37 pm

Nice one. All the examples are explained well, thanks Vivek.

seq 1 2 20
output can also be produced using jot

jot - 1 20 2

The infinite loops as everyone knows have the following alternatives.

while :


Andi Reinbrech, November 18, 2010, 7:42 pm
I know this is an ancient thread, but thought this trick might be helpful to someone:

For the above example with all the cuts, simply do

set `echo $line`

This will split line into positional parameters and you can after the set simply say

F1=$1; F2=$2; F3=$3

I used this a lot many years ago on solaris with "set `date`", it neatly splits the whole date string into variables and saves lots of messy cutting :-)

… no, you can't change the FS, if it's not space, you can't use this method
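A quick sketch of the trick (the sample string is mine; variable names follow the comment above):

```shell
line="2010 11 18"          # sample whitespace-separated data
set `echo $line`           # word-splits $line into $1 $2 $3
F1=$1; F2=$2; F3=$3
echo "$F2"                 # prints: 11
```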

Peko, July 16, 2009, 6:11 pm
Hi Vivek,
Thanks for this a useful topic.

IMNSHO, there may be something to modify here
Latest bash version 3.0+ has inbuilt support for setting up a step value:

for i in {1..5}
1) The increment feature seems to belong to the version 4 of bash.
Accordingly, my bash v3.2 does not include this feature.

BTW, where did you read that it was 3.0+ ?
(I ask because you may know some good website of interest on the subject).

2) The syntax is {from..to..step} where from, to, step are 3 integers.
Your code is missing the increment.

Note that GNU Bash documentation may be bugged at this time,
because on GNU Bash manual, you will find the syntax {x..y[incr]}
which may be a typo. (missing the second ".." between y and increment).


The Bash Hackers page
again, see
seems to be more accurate,
but who knows ? Anyway, at least one of them may be right… ;-)

Keep on the good work of your own,
Thanks a million.

- Peko

Michal Kaut July 22, 2009, 6:12 am

is there a simple way to control the number formatting? I use several computers, some of which have non-US settings with comma as a decimal point. This means that
for x in $(seq 0 0.1 1) gives 0 0.1 0.2 … 1 on some machines and 0 0,1 0,2 … 1 on others.
Is there a way to force the first variant, regardless of the language settings? Can I, for example, set the keyboard to US inside the script? Or perhaps some alternative to $x that would convert commas to points?
(I am sending these as parameters to another code and it won't accept numbers with commas…)

The best thing I could think of is adding x=`echo $x | sed s/,/./` as a first line inside the loop, but there should be a better solution? (Interestingly, the sed command does not seem to be upset by me rewriting its variable.)


Peko July 22, 2009, 7:27 am

To Michal Kaut:

Hi Michal,

Such output format is configured through LOCALE settings.

I tried :

export LC_CTYPE="en_EN.UTF-8"; seq 0 0.1 1

and it works as desired.

You just have to find the exact value for LC_CTYPE that fits to your systems and your needs.


Peko July 22, 2009, 2:29 pm

To Michal Kaus [2]

Ooops – ;-)
Instead of LC_CTYPE,
LC_NUMERIC should be more appropriate
(Although LC_CTYPE is actually yielding to the same result – I tested both)

By the way, Vivek has already documented the matter :

Philippe Petrinko October 30, 2009, 8:35 am

To Vivek:
Regarding your last example, that is: running a loop through arguments given to the script on the command line, there is a simpler way of doing this:
# instead of:
# FILES="$@"
# for f in $FILES

# use the following syntax
for arg
do
# whatever you need here – try : echo "$arg"
done

Of course, you can use any variable name, not only "arg".

Philippe Petrinko November 11, 2009, 11:25 am

To tdurden:

Why wouldn't you use

1) either a [for] loop
for old in * ; do mv ${old} ${old}.new; done

2) Either the [rename] command ?
excerpt from "man rename":

RENAME(1) Perl Programmers Reference Guide RENAME(1)

rename – renames multiple files

rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]

"rename" renames the filenames supplied according to the rule specified
as the first argument. The perlexpr argument is a Perl expression
which is expected to modify the $_ string in Perl for at least some of
the filenames specified. If a given filename is not modified by the
expression, it will not be renamed. If no filenames are given on the
command line, filenames will be read via standard input.

For example, to rename all files matching "*.bak" to strip the
extension, you might say

rename 's/\.bak$//' *.bak

To translate uppercase names to lower, you'd use

rename 'y/A-Z/a-z/' *

- Philippe

Philippe Petrinko November 11, 2009, 9:27 pm

If you set the shell option extglob, Bash understands some more powerful patterns. Here, a pattern-list is one or more patterns, separated by the pipe symbol (|).

?() Matches zero or one occurrence of the given patterns
*() Matches zero or more occurrences of the given patterns
+() Matches one or more occurrences of the given patterns
@() Matches one of the given patterns
!() Matches anything except one of the given patterns
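A quick illustration of the !() form (my sketch; the file names are made up):

```shell
#!/bin/bash
shopt -s extglob                 # enable extended globbing
cd "$(mktemp -d)"                # work in a scratch directory
touch report.txt notes.txt app.log

# !(*.txt) matches everything except names ending in .txt
ls !(*.txt)                      # lists: app.log
```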


Philippe Petrinko November 12, 2009, 3:44 pm

To Sean:
Right, the more sharp a knife is, the easier it can cut your fingers…

I mean: there are side-effects to the use of file globbing (like in [ for f in * ] ) when the globbing expression matches nothing: the globbing expression is not substituted.

Then you might want to consider using [ nullglob ] shell extension,
to prevent this.

Devil hides in detail ;-)

Dominic January 14, 2010, 10:04 am

There is an interesting difference between the exit value for two different for looping structures (hope this comes out right):
for (( c=1; c<=2; c++ )) do echo -n "inside (( )) loop c is $c, "; done; echo "done (( )) loop c is $c"
for c in {1..2}; do echo -n "inside { } loop c is $c, "; done; echo "done { } loop c is $c"

You see that the first structure does a final increment of c, the second does not. The first is more useful IMO because if you have a conditional break in the for loop, then you can subsequently test the value of $c to see if the for loop was broken or not; with the second structure you can't know whether the loop was broken on the last iteration or continued to completion.

Dominic January 14, 2010, 10:09 am

sorry, my previous post would have been clearer if I had shown the output of my code snippet, which is:
inside (( )) loop c is 1, inside (( )) loop c is 2, done (( )) loop c is 3
inside { } loop c is 1, inside { } loop c is 2, done { } loop c is 2

Philippe Petrinko March 9, 2010, 2:34 pm


And, again, as stated many times up there, using [seq] is counterproductive, because it requires a call to an external program, when you should Keep It Short and Simple, using only bash internal functions:

for ((c=1; c<21; c+=2)); do echo "Welcome $c times" ; done

(and I wonder why Vivek is sticking to that old solution, which should be presented only for historical reasons, from when there was no way of using bash internals.
By the way, this historical recall should be placed only at the topic's end, and not on top of the topic, which makes newbies stick to the not-up-to-date technique ;-) )

Sean March 9, 2010, 11:15 pm

I have a comment to add about using the builtin for (( … )) syntax. I would agree the builtin method is cleaner, but from what I've noticed with other builtin functionality, I had to check the speed advantage for myself. I wrote the following files:

for ((i=1;i<=1000000;i++))
do
    echo "Output $i"
done

for i in $(seq 1 1000000)
do
    echo "Output $i"
done

And here were the results that I got:
time ./
real 0m22.122s
user 0m18.329s
sys 0m3.166s

time ./
real 0m19.590s
user 0m15.326s
sys 0m2.503s

The performance increase isn't too significant, especially when you are probably going to be doing something a little more interesting inside of the for loop, but it does show that builtin commands are not necessarily faster.

Andi Reinbrech November 18, 2010, 8:35 pm

The reason why the external seq is faster, is because it is executed only once, and returns a huge splurb of space separated integers which need no further processing, apart from the for loop advancing to the next one for the variable substitution.

The internal loop is a nice and clean/readable construct, but it has a lot of overhead. The check expression is re-evaluated on every iteration, and a variable on the interpreter's heap gets incremented, possibly checked for overflow etc. etc.

Note that the check expression cannot be simplified or internally optimised by the interpreter because the value may change inside the loop's body (yes, there are cases where you'd want to do this, however rare and stupid they may seem), hence the variables are volatile and get re-evaluated.

I.e. bottom line, the internal one has more overhead; the "seq" version is equivalent to either having 1000000 integers inside the script (hard coded), or reading once from a text file with 1000000 integers with a cat. Point being that it gets executed only once and becomes static.

OK, blah blah fishpaste, past my bed time :-)


Anthony Thyssen June 4, 2010, 6:53 am

The {1..10} syntax would be pretty useful if you could use a variable with it, but that does not work directly:

limit=10
echo {1..${limit}}

You need to eval it to get it to work!

eval "echo {1..${limit}}"
1 2 3 4 5 6 7 8 9 10

'seq' is not available on ALL systems (MacOSX for example)
and BASH is not available on all systems either.

You are better off using the old while-expr method for maximum compatibility!

   limit=10; n=1;
   while [ $n -le $limit ]; do
     echo $n;
     n=`expr $n + 1`;
   done
Alternativally use a seq() function replacement…

 # seq_count 10
seq_count() {
  i=1; while [ $i -le $1 ]; do echo $i; i=`expr $i + 1`; done
}
# simple_seq 1 2 10
simple_seq() {
  i=$1; while [ $i -le $3 ]; do echo $i; i=`expr $i + $2`; done
}
seq_integer() {
    if [ "X$1" = "X-f" ]
    then format="$2"; shift; shift
    else format="%d"
    fi
    case $# in
    1) i=1 inc=1 end=$1 ;;
    2) i=$1 inc=1 end=$2 ;;
    *) i=$1 inc=$2 end=$3 ;;
    esac
    while [ $i -le $end ]; do
      printf "$format\n" $i;
      i=`expr $i + $inc`;
    done
}
Edited: by Admin – added code tags.

TheBonsai June 4, 2010, 9:57 am

The Bash C-style for loop was taken from KSH93, thus I guess it's at least portable towards Korn and Z.

The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX.
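In POSIX sh that replacement looks like this — a sketch using $(( )) instead of expr (the name seq_posix is my own):

```shell
# A POSIX-sh seq replacement using arithmetic expansion instead of
# the obsolete external expr.
seq_posix() {
    case $# in
        1) i=1;  inc=1;  end=$1 ;;   # seq_posix LAST
        2) i=$1; inc=1;  end=$2 ;;   # seq_posix FIRST LAST
        *) i=$1; inc=$2; end=$3 ;;   # seq_posix FIRST INCREMENT LAST
    esac
    while [ "$i" -le "$end" ]; do
        echo "$i"
        i=$((i + inc))               # no fork, pure shell arithmetic
    done
}

seq_posix 1 2 7    # prints 1 3 5 7, one per line
```

No subprocess is spawned per iteration, so it is also much faster than the expr version.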

Philippe Petrinko June 4, 2010, 10:15 am

Right Bonsai,

But FOR C-style does not seem to be POSIXLY-correct…

Read the on-line POSIX reference (issue 6/2004), Shell and Utilities volume (XCU), in particular the entry for the for command.

Anthony Thyssen June 6, 2010, 7:18 am

TheBonsai wrote…. "The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX."

I am not certain it is in POSIX. It was NOT part of the original Bourne Shell, and on some machines I deal with, there is only Bourne Shell — not Ksh, Bash, or anything else.

Bourne Shell syntax works everywhere! And as 'expr' is a builtin in more modern shells, it is not a big loss or slowdown.

This is especially important if writing a replacement command, such as for "seq" where you want your "just-paste-it-in" function to work as widely as possible.

I have been shell programming pretty well all the time since 1988, so I know what I am talking about! Believe me.

MacOSX has in this regard been the worst, and a very big backward step in UNIX compatibility. Two years after it came out, its shell still did not even understand most of the normal 'test' functions. A major pain to write shell scripts that also need to work on this system.

TheBonsai June 6, 2010, 12:35 pm

Yea, the question was if it's POSIX, not if it's 100% portable (which is a difference). The POSIX base more or less is a subset of the Korn features (88, 93), pure Bourne is something "else", I know. Real portability, which means a program can go wherever UNIX went, only in C ;)

Philippe Petrinko November 22, 2010, 8:23 am

And if you want to get rid of double-quotes, use:

one-liner code:
while read; do record=${REPLY}; echo ${record}|while read -d ","; do field="${REPLY#\"}"; field="${field%\"}"; echo ${field}; done; done<data

script code, with some text added to better show the record and field breakdown:

while read
do
  echo "New record"
  record=${REPLY}
  echo ${record}|while read -d ,
  do
    field="${REPLY#\"}"
    field="${field%\"}"
    echo "Field is :${field}:"
  done
done < data

Does it work with your data?

- PP

Philippe Petrinko November 22, 2010, 9:01 am

Of course, all the above code was assuming that your CSV file is named "data".

If you want to use any file name with the script, drop the trailing "< data" redirection from the loop.

And then use your script file (named for instance "myScript") with standard input redirection:

myScript < anyFileNameYouWant


Philippe Petrinko November 22, 2010, 11:28 am

Well, no — there is a bug: the last field of each record is not read. It needs a rework, and maybe an IFS modification! After all, that's what it was built for… :O)

Anthony Thyssen November 22, 2010, 11:31 pm

Another bug: the inner loop is a pipeline, so you can't assign variables for use later in the script. But you can use '<<<' to break the pipeline and avoid the echo.

But this does not help when you have commas within the quotes! Which is why you needed quotes in the first place.

In any case it is a little off topic. Perhaps a new thread for reading CSV files in shell should be created.

Philippe Petrinko November 24, 2010, 6:29 pm

Would you try this one-liner script on your CSV file?

This one-liner assumes that CSV file named [data] has __every__ field double-quoted.

while read; do r="${REPLY#\"}";echo "${r//\",\"/\"}"|while read -d \";do echo "Field is :${REPLY}:";done;done<data

Here is the same code, but for a script file, not a one-liner tweak.

# script
# 1) Usage
# This script reads from standard input
# any CSV with double-quoted data fields
# and breaks down each field on standard output
# 2) Within each record (line), _every_ field MUST:
# - Be surrounded by double quotes,
# - and be separated from the preceding field by a comma
# (not the first field of course, no comma before the first field)
while read
do
  echo "New record" # this is not mandatory - just for explanation
  # store REPLY and remove opening double quote
  record="${REPLY#\"}"
  # replace every "," by a single double quote
  record="${record//\",\"/\"}"
  echo ${record}|while read -d \"
  do
    # store REPLY into variable "field"
    field="${REPLY}"
    echo "Field is :${field}:" # just for explanation
  done
done

This script named here [] must be used so: < my-cvs-file-with-doublequotes

Philippe Petrinko November 24, 2010, 6:35 pm


By the way, using [REPLY] in the outer loop _and_ the inner loop is not a bug.
As long as you know what you are doing, this is not a problem; you just have to store the [REPLY] value conveniently, as this script shows.

TheBonsai March 8, 2011, 6:26 am
for ((i=1; i<=20; i++)); do printf "%02d\n" "$i"; done

nixCraft March 8, 2011, 6:37 am

+1 for printf due to portability, but you can use bashy .. syntax too

for i in {01..20}; do echo "$i"; done

TheBonsai March 8, 2011, 6:48 am

Well, it isn't portable per se, it makes it portable to pre-4 Bash versions.

I think a more or less "portable" (in terms of POSIX, at least) code would be

i=1
while [ "$((i >= 20))" -eq 0 ]; do
  printf "%02d\n" "$i"
  i=$((i + 1))
done

Philip Ratzsch April 20, 2011, 5:53 am

I didn't see this in the article or any of the comments so I thought I'd share. While this is a contrived example, I find that nesting two groups can help squeeze a two-liner (once for each range) into a one-liner:

for num in {{1..10},{15..20}};do echo $num;done

Great reference article!

Philippe Petrinko April 20, 2011, 8:23 am

Nice thing to think of, using brace nesting, thanks for sharing.

Philippe Petrinko May 6, 2011, 10:13 am

Hello Sanya,

That would be because brace expansion does not support variables. I have to check this.
Anyway, Keep It Short and Simple: (KISS) here is a simple solution I already gave above:

for (( x = $xstart; x <= $xend; x += $xstep)); do echo $x;done

Actually, POSIX compliance allows you to omit the $ inside for (( )); as said before, you could also write:

for (( x = xstart; x <= xend; x += xstep)); do echo $x;done

Philippe Petrinko May 6, 2011, 10:48 am


Actually brace expansion happens __before__ $ parameter expansion, so you cannot use it this way.

Nevertheless, you could overcome this this way:

max=10; for i in $(eval echo {1..$max}); do echo $i; done

Sanya May 6, 2011, 11:42 am

Hello, Philippe

Thanks for your suggestions
You basically confirmed my findings, that bash constructions are not as simple as zsh ones.
But since I don't care about POSIX compliance, and want to keep my scripts "readable" for less experienced people, I would prefer to stick to zsh where my simple for-loop works

Cheers, Sanya

Philippe Petrinko May 6, 2011, 12:07 pm


First, you got it wrong: the solutions I gave are not related to POSIX; I just pointed out that POSIX allows you to omit the $ in for (( )), which is just a little bit more readable – sort of.

Second, why do you see this less readable than your [zsh] [for loop]?

for (( x = start; x <= end; x += step )) do
  echo "Loop number ${x}"
done

It is clear that it is a loop, loop increments and limits are clear.

IMNSHO, if anyone cannot read this right, he should not be allowed to code. :-D


Anthony Thyssen May 8, 2011, 11:30 pm

If you are going to do… $(eval echo {1..$max});
You may as well use "seq" or one of the many other forms.
See all the other comments on doing for loops.

Tom P May 19, 2011, 12:16 pm

I am trying to use the variable I set in the for line to set another variable with a different extension. Couldn't get this to work and couldn't find it anywhere on the web… Can someone help?


FILE_TOKEN=`cat /tmp/All_Tokens.txt`
for token in $FILE_TOKEN
do
  A1_$token=`grep $A1_token /file/path/file.txt | cut -d ":" -f2`
done

my goal is to take the values from the All Tokens file and set a new variable with A1_ in front of it… This tells me that A1_ is not a command…
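For what it's worth, a plain A1_$token=... assignment fails because the left-hand side of an assignment must be a literal variable name. One way to build the name dynamically is declare (or eval); a sketch with a made-up token list standing in for the file contents:

```shell
#!/bin/bash
# Build variable names dynamically with declare; the token list here is
# a stand-in for the values read from /tmp/All_Tokens.txt.
for token in alpha beta; do
    declare "A1_${token}=value-of-${token}"
done

echo "$A1_alpha"   # prints value-of-alpha
echo "$A1_beta"    # prints value-of-beta
```

Reading the values back dynamically works the same way with indirection: `name="A1_$token"; echo "${!name}"`.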

[Jun 12, 2010] Writing Robust Bash Shell Scripts

Many people hack together shell scripts quickly to do simple tasks, but these soon take on a life of their own. Unfortunately shell scripts are full of subtle effects which result in scripts failing in unusual ways. It's possible to write scripts which minimise these problems. In this article, I explain several techniques for writing robust bash scripts.

Use set -u

How often have you written a script that broke because a variable wasn't set? I know I have, many times.

rm -rf $chroot/usr/share/doc 

If you ran the script above and accidentally forgot to give a parameter, you would have just deleted all of your system documentation rather than making a smaller chroot. So what can you do about it? Fortunately bash provides you with set -u, which will exit your script if you try to use an uninitialised variable. You can also use the slightly more readable set -o nounset.

david% bash /tmp/
david% bash -u /tmp/
/tmp/ line 3: $1: unbound variable

Use set -e

Every script you write should include set -e at the top. This tells bash that it should exit the script if any statement returns a non-true return value. The benefit of using -e is that it prevents errors snowballing into serious issues when they could have been caught earlier. Again, for readability you may want to use set -o errexit.

Using -e gives you error checking for free. If you forget to check something, bash will do it for you. Unfortunately it means you can't check $?, as bash will never get to the checking code if it isn't zero. There are other constructs you could use:

if [ "$?" -ne 0 ]; then echo "command failed"; exit 1; fi 

could be replaced with

command || { echo "command failed"; exit 1; } 

or

if ! command; then echo "command failed"; exit 1; fi 

What if you have a command that returns non-zero or you are not interested in its return value? You can use command || true, or if you have a longer section of code, you can turn off the error checking, but I recommend you use this sparingly.

set +e
command-that-might-fail
set -e 

On a slightly related note, by default bash takes the error status of the last item in a pipeline, which may not be what you want. For example, false | true will be considered to have succeeded. If you would like this to fail, then you can use set -o pipefail to make it fail.
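The difference is easy to demonstrate; a minimal sketch (note that with set -e active, the pipefail version would abort the script at the failing pipeline):

```shell
#!/bin/bash
# Default: only the exit status of the LAST command in a pipeline counts.
false | true
echo "default status: $?"     # prints 0

# With pipefail: the rightmost non-zero status in the pipeline wins.
set -o pipefail
false | true
echo "pipefail status: $?"    # prints 1
```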

Program defensively - expect the unexpected

Your script should take into account of the unexpected, like files missing or directories not being created. There are several things you can do to prevent errors in these situations. For example, when you create a directory, if the parent directory doesn't exist, mkdir will return an error. If you add a -p option then mkdir will create all the parent directories before creating the requested directory. Another example is rm. If you ask rm to delete a non-existent file, it will complain and your script will terminate. (You are using -e, right?) You can fix this by using -f, which will silently continue if the file didn't exist.
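Both idioms in one sketch — under set -e, the unguarded variants would kill the script:

```shell
#!/bin/bash
set -e                        # exit on any error
tmp=$(mktemp -d)

mkdir -p "$tmp/a/b/c"         # -p creates missing parents...
mkdir -p "$tmp/a/b/c"         # ...and does not fail if the path already exists

rm -f "$tmp/no-such-file"     # -f: deleting a missing file is not an error

echo "script is still running"
rm -rf "$tmp"
```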

Be prepared for spaces in filenames

Someone will always use spaces in filenames or command line arguments and you should keep this in mind when writing shell scripts. In particular you should use quotes around variables.

if [ $filename = "foo" ]; 

will fail if $filename contains a space. This can be fixed by using:

if [ "$filename" = "foo" ]; 

When using the $@ variable, you should always quote it, or any arguments containing a space will be expanded into separate words.

david% foo() { for i in $@; do echo $i; done }; foo bar "baz quux"
bar
baz
quux
david% foo() { for i in "$@"; do echo $i; done }; foo bar "baz quux"
bar
baz quux 

I cannot think of a single place where you shouldn't use "$@" over $@, so when in doubt, use quotes.

If you use find and xargs together, you should use -print0 to separate filenames with a null character rather than new lines. You then need to use -0 with xargs.

david% touch "foo bar"
david% find | xargs ls
ls: ./foo: No such file or directory
ls: bar: No such file or directory
david% find -print0 | xargs -0 ls
./foo bar 

Setting traps

Often you write scripts which fail and leave the filesystem in an inconsistent state; things like lock files, temporary files or you've updated one file and there is an error updating the next file. It would be nice if you could fix these problems, either by deleting the lock files or by rolling back to a known good state when your script suffers a problem. Fortunately bash provides a way to run a command or function when it receives a unix signal using the trap command.

trap command signal [signal ...]

There are many signals you can trap (you can get a list of them by running kill -l), but for cleaning up after problems there are only 3 we are interested in: INT, TERM and EXIT. You can also reset traps back to their default by using - as the command.

Signal Description
INT Interrupt - This signal is sent when someone kills the script by pressing ctrl-c.
TERM Terminate - this signal is sent when someone sends the TERM signal using the kill command.
EXIT Exit - this is a pseudo-signal and is triggered when your script exits, either through reaching the end of the script, an exit command or by a command failing when using set -e.

Usually, when you write something using a lock file you would use something like:

if [ ! -e $lockfile ]; then
   touch $lockfile
   critical-section
   rm $lockfile
else
   echo "critical-section is already running"
fi
What happens if someone kills your script while critical-section is running? The lockfile will be left there and your script won't run again until it's been deleted. The fix is to use:

if [ ! -e $lockfile ]; then
   trap "rm -f $lockfile; exit" INT TERM EXIT
   touch $lockfile
   critical-section
   rm $lockfile
   trap - INT TERM EXIT
else
   echo "critical-section is already running"
fi

Now when you kill the script it will delete the lock file too. Notice that we explicitly exit from the script at the end of trap command, otherwise the script will resume from the point that the signal was received.

Race conditions

It's worth pointing out that there is a slight race condition in the above lock example between the time we test for the lockfile and the time we create it. A possible solution to this is to use IO redirection and bash's noclobber mode, which won't redirect to an existing file. We can use something similar to:

if ( set -o noclobber; echo "$$" > "$lockfile") 2> /dev/null; then
   trap 'rm -f "$lockfile"; exit $?' INT TERM EXIT
   critical-section
   rm -f "$lockfile"
   trap - INT TERM EXIT
else
   echo "Failed to acquire lockfile: $lockfile." 
   echo "Held by $(cat $lockfile)"
fi

A slightly more complicated problem is where you need to update a bunch of files and need the script to fail gracefully if there is a problem in the middle of the update. You want to be certain that something either happened correctly or that it appears as though it didn't happen at all. Say you had a script to add users.

add_to_passwd $user
cp -a /etc/skel /home/$user
chown $user /home/$user -R

There could be problems if you ran out of diskspace or someone killed the process. In this case you'd want the user to not exist and all their files to be removed.

rollback() {
   del_from_passwd $user
   if [ -e /home/$user ]; then
      rm -rf /home/$user
   fi
   exit
}

trap rollback INT TERM EXIT
add_to_passwd $user
cp -a /etc/skel /home/$user
chown $user /home/$user -R
trap - INT TERM EXIT

We needed to remove the trap at the end or the rollback function would have been called as we exited, undoing all the script's hard work.

Be atomic

Sometimes you need to update a bunch of files in a directory at once, say you need to rewrite urls from one host to another on your website. You might write:

for file in $(find /var/www -type f -name "*.html"); do
   perl -pi -e 's/.../.../' $file
done

Now if there is a problem with the script, you could have half the site referring to the old host and the rest referring to the new one. You could fix this using a backup and a trap, but you also have the problem that the site will be inconsistent during the upgrade too.

The solution to this is to make the changes an (almost) atomic operation. To do this make a copy of the data, make the changes in the copy, move the original out of the way and then move the copy back into place. You need to make sure that both the old and the new directories are moved to locations that are on the same partition so you can take advantage of the property of most unix filesystems that moving directories is very fast, as they only have to update the inode for that directory.

cp -a /var/www /var/www-tmp
for file in $(find /var/www-tmp -type f -name "*.html"); do
   perl -pi -e 's/.../.../' $file
done
mv /var/www /var/www-old
mv /var/www-tmp /var/www 

This means that if there is a problem with the update, the live system is not affected. Also the time where it is affected is reduced to the time between the two mvs, which should be very minimal, as the filesystem just has to change two entries in the inodes rather than copying all the data around.

The disadvantage of this technique is that you need to use twice as much disk space, and that any process that keeps files open for a long time will still have the old files open and not the new ones, so you would have to restart those processes if this is the case. In our example this isn't a problem as apache opens the files on every request. You can check for processes that still have the old files open by using lsof. An advantage is that you now have a backup from before you made your changes, in case you need to revert.

[May 20, 2010] Bash 4.2 news

The lastpipe shell option, which fixes a longstanding bash stupidity (a bug that became a feature), was long overdue. The negative-index semantics borrowed from Perl are also nice.
m.  The printf builtin has a new %(fmt)T specifier, which allows time values
    to use strftime-like formatting.
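A quick sketch of the new specifier (bash 4.2+); the argument -1 means "now", -2 means the shell's start time:

```shell
#!/bin/bash
# %(fmt)T formats a time value via strftime(3), no external date(1) needed.
printf 'today is %(%Y-%m-%d)T\n' -1

# combined with printf -v to capture the result without a $(date) subshell
printf -v stamp '%(%H:%M:%S)T' -1
echo "captured: $stamp"
```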

n.  There is a new `compat41' shell option.

o.  The cd builtin has a new Posix-mandated `-e' option.

p.  Negative subscripts to indexed arrays, previously errors, now are treated
    as offsets from the maximum assigned index + 1.

q.  Negative length specifications in the ${var:offset:length} expansion,
    previously errors, are now treated as offsets from the end of the variable.

... ... ...

t.  There is a new `lastpipe' shell option that runs the last command of a
    pipeline in the current shell context.  The lastpipe option has no
    effect if job control is enabled.

This is a terse description of the new features added to bash-4.1 since
the release of bash-4.0.  As always, the manual page (doc/bash.1) is
the place to look for complete descriptions.

e.  `printf -v' can now assign values to array indices.

f.  New `complete -E' and `compopt -E' options that work on the "empty"
    completion: completion attempted on an empty command line.

g.  New complete/compgen/compopt -D option to define a `default' completion:
    a completion to be invoked on command for which no completion has been
    defined.  If this function returns 124, programmable completion is
    attempted again, allowing a user to dynamically build a set of completions
    as completion is attempted by having the default completion function
    install individual completion functions each time it is invoked.

h.  When displaying associative arrays, subscripts are now quoted.

i.  Changes to dabbrev-expand to make it more `emacs-like': no space appended
    after matches, completions are not sorted, and most recent history entries
    are presented first.

j.  The [[ and (( commands are now subject to the setting of `set -e' and the
    ERR trap.

... ... ... 

q.  The < and > operators to the [[ conditional command now do string
    comparison according to the current locale if the compatibility level
    is greater than 40.

[May 20, 2010] SFR Fresh bash-4.1.tar.gz (Download & Source Code Browsing)

Bash 4 - a rough overview [Bash Hackers Wiki]

The new "coproc" keyword

Bash 4 introduces the concept of coprocesses, a well known feature of other shells. The basic concept is simple: it will start any command in the background and set up an array that is populated with accessible files that represent the filedescriptors of the started process.

In other words: It lets you start a process in background and communicate with its input and output data streams.

See The coproc keyword
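A minimal sketch using cat as the coprocess (any filter would do):

```shell
#!/bin/bash
# Start `cat` as a coprocess named CAT: CAT[0] is a fd reading from its
# stdout, CAT[1] a fd writing to its stdin, CAT_PID its process id.
coproc CAT { cat; }

echo "hello coproc" >&"${CAT[1]}"    # write to the coprocess
read -r reply <&"${CAT[0]}"          # read its answer back
echo "got: $reply"                   # prints got: hello coproc

eval "exec ${CAT[1]}>&-"             # closing its stdin lets cat exit
wait "$CAT_PID"
```

The eval is needed to close the fd because the number is stored in an array element.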

The new "mapfile" builtin

The mapfile builtin is able to map the lines of a file directly into an array. This avoids having to fill an array yourself using a loop. It allows you to define the range of lines to read, and optionally to call a callback, for example to display a progress bar.

See: The mapfile builtin
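A sketch of the basic use:

```shell
#!/bin/bash
# Slurp a file's lines into an array, no while-read loop needed.
tmpfile=$(mktemp)
printf 'red\ngreen\nblue\n' > "$tmpfile"

mapfile -t colors < "$tmpfile"    # -t strips each line's trailing newline
echo "count: ${#colors[@]}"       # prints count: 3
echo "second: ${colors[1]}"       # prints second: green

rm -f "$tmpfile"
```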

Changes to the "case" keyword

The case construct understands two new action list terminators:

The ;& terminator causes execution to continue with the next action list (rather than terminate the case construct).

The ;;& terminator causes the case construct to test the next given pattern instead of terminating the whole execution.

See The case statement
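The difference between the two new terminators in one sketch:

```shell
#!/bin/bash
# ;;& : after a match, keep TESTING the remaining patterns
test_case() {
    case $1 in
        a*) echo "starts with a" ;;&   # continue testing the next patterns
        *z) echo "ends with z"   ;;
        *)  echo "no match"      ;;
    esac
}
test_case abz    # prints "starts with a" then "ends with z"

# ;& : after a match, fall through to the NEXT action list unconditionally
fallthrough() {
    case $1 in
        one)   echo "one" ;&           # falls through into the "two" branch
        two)   echo "two" ;;
        three) echo "three" ;;
    esac
}
fallthrough one  # prints "one" then "two"
```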

Changes to the "declare" builtin

The -p option now prints all attributes and values of declared variables (or functions, when used with -f). The output is fully re-usable as input.

The new option -l declares a variable in a way that the content is converted to lowercase on assignment. The same, but for uppercase, applies to -u. The option -c causes the content to be capitalized before assignment.

declare -A declares associative arrays (see below).
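The case-converting attributes in a short sketch (-c is left out here, as it is not enabled in all builds):

```shell
#!/bin/bash
declare -l lower   # every value assigned to $lower is lowercased
declare -u upper   # every value assigned to $upper is uppercased

lower="Hello World"
upper="Hello World"
echo "$lower"      # prints hello world
echo "$upper"      # prints HELLO WORLD
```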

Changes to the "read" builtin

The read builtin command got some interesting new features.

The -t option to specify a timeout value has been slightly tuned. It now accepts fractional values and the special value 0 (zero). When -t 0 is specified, read immediately returns with an exit status indicating whether there is data waiting or not. However, when a timeout is given and the read builtin times out, any partial data received up to the timeout is stored in the given variable rather than lost. When a timeout is hit, read exits with a code greater than 128.

A new option, -i, was introduced to preload the input buffer with some text (when Readline is used, i.e. with -e). The user is able to change the text or just press return to accept it.

See The read builtin

Changes to the "help" builtin

The builtin itself didn't change much, but the data displayed is more structured now. The help texts are in a better format, much easier to read.

There are two new options: -d displays the summary of a help text, -m displays a manpage-like format.

Changes to the "ulimit" builtin

Beside the use of the 512 bytes blocksize everywhere in POSIX mode, ulimit supports two new limits: -b for max. socket buffer size and -T for max. number of threads.

Brace Expansion

The brace expansion was tuned to provide expansion results with leading zeros when requesting a row of numbers.

See Brace expansion

Parameter Expansion

Methods to modify the case at expansion time have been added: you modify the expansion by adding operators to the parameter name.

See Case modification on parameter expansion

Substring expansion

When using substring expansion on the positional parameters, a starting index of 0 now causes $0 to be prefixed to the list (if the positional parameters are used at all). Before, this expansion started with $1:
# this should display $0 on Bash v4, $1 on Bash v3
echo ${@:0:1}


Globbing

There's a new shell option globstar. When enabled, Bash will perform recursive globbing on ** – this means it matches all directories and files from the current position in the filesystem, rather than only the current level.

The new shell option dirspell enables spelling corrections on directory names during globbing.

See Pathname expansion (globbing)

Associative Arrays

Beside the classic method of integer indexed arrays, Bash 4 supports associative arrays.

An associative array is an array indexed by an arbitrary string, something like

declare -A ASSOC

ASSOC[First]="first element"
ASSOC[Hello]="second element"
ASSOC[Peter Pan]="A weird guy"

See Arrays
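Keys and values can then be inspected with the usual array expansions; a sketch:

```shell
#!/bin/bash
declare -A ASSOC
ASSOC[First]="first element"
ASSOC[Peter Pan]="A weird guy"

echo "size: ${#ASSOC[@]}"            # prints size: 2
echo "${ASSOC[Peter Pan]}"           # prints A weird guy

for key in "${!ASSOC[@]}"; do        # ${!array[@]} expands to the keys
    echo "key: $key"                 # note: key order is unspecified
done
```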


Redirection

There is a new &>> redirection operator, which appends both standard output and standard error to the named file. This is the same as the good old >>FILE 2>&1 notation.

The parser now understands |& as a synonym for 2>&1 |, which redirects the standard error for a command through a pipe.

See Redirection

Interesting new shell variables

Variable Description
BASHPID contains the PID of the current shell (this is different to what $$ does!)
PROMPT_DIRTRIM specifies the max. level of unshortened pathname elements in the prompt

See Special parameters and shell variables
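The difference between BASHPID and $$ shows up in subshells; a sketch:

```shell
#!/bin/bash
# $$ keeps the PID of the original shell even inside subshells;
# BASHPID is the PID of the process actually running the code.
echo "top level: \$\$=$$ BASHPID=$BASHPID"       # both are equal here
( echo "subshell:  \$\$=$$ BASHPID=$BASHPID" )   # $$ unchanged, BASHPID differs
```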

Interesting new Shell Options

The mentioned shell options are off by default unless otherwise mentioned.
Option Description
checkjobs check for and report any running jobs at shell exit
compat* set compatibility modes for older shell versions (influences regular expression matching in [[ ... ]])
dirspell enables spelling corrections on directory names during globbing
globstar enables recursive globbing with **

See List of shell options


[Apr 7, 2009] The GNU bash-4.0

Looks more like bash 3.3 than bash 4.0: a complete absence of interesting features and a very poor understanding of the necessary path of shell development... (just look at William Park's BASHDIFF and other patches and you will understand this negative comment better).

The current version of bash is bash-4.0. (GPG signature).

[Apr 6, 2009] Bash Process Substitution by Mitch Frazier

May 22, 2008 Linux Journal

In addition to the fairly common forms of input/output redirection the shell recognizes something called process substitution. Although not documented as a form of input/output redirection, its syntax and its effects are similar.

The syntax for process substitution is <(list) or >(list), where each list is a command or a pipeline of commands. The effect of process substitution is to make each list act like a file. This is done by giving the list a name in the file system and then substituting that name in the command line. The list is given a name either by connecting the list to a named pipe or by using a file in /dev/fd (if supported by the OS). By doing this, the command simply sees a file name and is unaware that it is reading from or writing to a command pipeline.

To substitute a command pipeline for an input file the syntax is:

  command ... <(list) ...
To substitute a command pipeline for an output file the syntax is:
  command ... >(list) ...

At first process substitution may seem rather pointless, for example you might imagine something simple like:

  uniq <(sort a)
to sort a file and then find the unique lines in it, but this is more commonly (and more conveniently) written as:
  sort a | uniq
The power of process substitution comes when you have multiple command pipelines that you want to connect to a single command.

For example, given the two files:

  # cat a
  # cat b
To view the lines unique to each of these two unsorted files you might do something like this:
  # sort a | uniq >tmp1
  # sort b | uniq >tmp2
  # comm -3 tmp1 tmp2
  # rm tmp1 tmp2
With process substitution we can do all this with one line:
  # comm -3 <(sort a | uniq) <(sort b | uniq)

Depending on your shell settings you may get an error message similar to:

  syntax error near unexpected token `('
when you try to use process substitution, particularly if you try to use it within a shell script. Process substitution is not a POSIX compliant feature and so it may have to be enabled via:
  set +o posix
Be careful not to try something like:
  if [[ $use_process_substitution -eq 1 ]]; then
    set +o posix
    comm -3 <(sort a | uniq) <(sort b | uniq)
  fi
The command set +o posix enables not only the execution of process substitution but also the recognition of its syntax. The shell parses the whole if statement before the "set" command is executed, and therefore still sees the process substitution syntax as illegal.

Of course, note that all shells may not support process substitution, these examples will work with bash.

[Apr 5, 2009] Using Named Pipes (FIFOs) with Bash

[Apr 3, 2008] Advanced Bash-Scripting Guide

[Apr 2, 2008] System Administration Toolkit: Get the most out of bash

Good paper !

Ease your system administration tasks by taking advantage of key parts of the Bourne-again shell (bash) and its features. Bash is a popular alternative to the original Bourne and Korn shells. It provides an impressive range of additional functionality that includes improvements to the scripting environment, extensive aliasing techniques, and improved methods for automatically completing different commands, files, and paths.

!!! [Apr 1, 2008] Linux tip: Bash parameters and parameter expansions by Ian Shields

Definitely gifted author !

Do you sometimes wonder how to use parameters with your scripts, and how to pass them to internal functions or other scripts? Do you need to do simple validity tests on parameters or options, or perform simple extraction and replacement operations on the parameter strings? This tip helps you with parameter use and the various parameter expansions available in the bash shell.

3.1-0.09 October 27, 2007

[Mar 14, 2008] Advanced Bash Scripting Guide 5.2 (Stable) by M. Leo Cooper

About: The Advanced Bash Scripting Guide is both a reference and a tutorial on shell scripting. This comprehensive book (the equivalent of 880+ print pages) covers almost every aspect of shell scripting. It contains 340 profusely commented illustrative examples, a number of tables, and a cross-linked index/glossary. Not just a shell scripting tutorial, this book also provides an introduction to basic programming techniques, such as sorting and recursion. It is well suited for either individual study or classroom use. It covers Bash, up to and including version 3.2x.

Changes: Many bugfixes and stylistic cleanups were done. Four new example scripts were added. A new subsection on version 3.2 Bash update was added. Explanations of certain difficult concepts were clarified. This is an important update.

[Mar 8, 2008] BashStyle-NG: A Graphical Tool for styling the BASH version 5.2

BashStyle is a graphical tool for changing Bash's behaviour and look'n'feel.

It has been part of *NixStyle-NG but is now split off for easier maintenance.

It features some predefined themes, most of them re-colorable. The first 4 extra options are only for the Separator style (theme).

Themes are 100% compatible with the Linux console. When on it, colors will be deactivated automatically (to avoid an ugly look'n'feel).

You can also enable colored manpages or get colored output from grep.


run ./configure to check the dependencies

and run (sudo/su) make install to install BashStyle

then run bashstyle or go to Menu -> Accessories -> BashStyle

Have Fun!


This application requires GTK+ version 2.10.x. Other dependencies include:
Required: python 2.4+,pygtk 2.4+, pyglade 2.4+, bash 3.0+, sed, coreutils, gconf

Optional: psyco, /dev/random

BASH Frequently-Asked Questions (FAQ version 3.36) - comp.unix.questions Google Groups

Bash 3.2 is available
B1) What's new in version 3.2?

Bash-3.2 is the second maintenance release of the third major release of bash. It contains the following significant new features (see the manual page for complete descriptions and the CHANGES and NEWS files in the bash-3.2 distribution).

  • Bash-3.2 now checks shell scripts for NUL characters rather than non-printing characters when deciding whether or not a script is a binary file.
  • Quoting the string argument to the [[ command's =~ (regexp) operator now forces string matching, as with the other pattern-matching operators.

    A short feature history dating from Bash-2.0:

    Bash-3.1 contained the following new features:

  • Bash-3.1 may now be configured and built in a mode that enforces strict POSIX compliance.
  • The `+=' assignment operator, which appends to the value of a string or array variable, has been implemented.
  • It is now possible to ignore case when matching in contexts other than filename generation using the new `nocasematch' shell option.

    Bash-3.0 contained the following new features:

  • Features to support the bash debugger have been implemented, and there is a new `extdebug' option to turn the non-default options on
  • HISTCONTROL is now a colon-separated list of options and has been extended with a new `erasedups' option that will result in only one copy of a command being kept in the history list
  • Brace expansion has been extended with a new {x..y} form, producing sequences of digits or characters
  • Timestamps are now kept with history entries, with an option to save and restore them from the history file; there is a new HISTTIMEFORMAT variable describing how to display the timestamps when listing history entries
  • The `[[' command can now perform extended regular expression (egrep-like) matching, with matched subexpressions placed in the BASH_REMATCH array variable
  • A new `pipefail' option causes a pipeline to return a failure status if any command in it fails
  • The `jobs', `kill', and `wait' builtins now accept job control notation in their arguments even if job control is not enabled
  • The `gettext' package and libintl have been integrated, and the shell messages may be translated into other languages
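Two of the Bash-3.0 additions above are easy to demonstrate together; a small sketch:

```shell
#!/usr/bin/env bash
# {x..y} brace expansion produces sequences of integers or characters.
echo {1..5}        # prints: 1 2 3 4 5
echo {a..e}        # prints: a b c d e

# With pipefail set, a pipeline fails if *any* stage fails,
# not just the last one.
set -o pipefail
status=0
false | true || status=$?
echo "pipeline status: $status"   # prints: pipeline status: 1
```

Without `set -o pipefail`, the same pipeline would report status 0, since only the last command's exit status counts.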

    Bash-2.05b introduced the following new features:

  • support for multibyte characters has been added to both bash and readline
  • the DEBUG trap is now run *before* simple commands, ((...)) commands, [[...]] conditional commands, and for ((...)) loops
  • the shell now performs arithmetic in the largest integer size the machine supports (intmax_t)
  • there is a new \D{...} prompt expansion; passes the `...' to strftime(3) and inserts the result into the expanded prompt
  • there is a new `here-string' redirection operator: <<< word
  • when displaying variables, function attributes and definitions are shown separately, allowing them to be re-used as input (attempting to re-use the old output would result in syntax errors).

  • `read' has a new `-u fd' option to read from a specified file descriptor

  • the bash debugger in examples/bashdb has been modified to work with the new DEBUG trap semantics, the command set has been made more gdb-like, and the changes to $LINENO make debugging functions work better
  • the expansion of $LINENO inside a shell function is only relative to the function start if the shell is interactive -- if the shell is running a script, $LINENO expands to the line number in the script. This is as POSIX-2001 requires
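The here-string and the `read -u' option from this release combine nicely; a minimal sketch:

```shell
#!/usr/bin/env bash
# <<< feeds a single string to a command's standard input.
read first rest <<< "alpha beta gamma"
echo "$first"      # prints: alpha

# read -u reads from an explicit file descriptor instead of stdin.
exec 3<<< $'line1\nline2'
read -u 3 line
echo "$line"       # prints: line1
exec 3<&-          # close fd 3
```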

    Bash-2.05a introduced the following new features:

  • The `printf' builtin has undergone major work
  • There is a new read-only `shopt' option: login_shell, which is set by login shells and unset otherwise
  • New `\A' prompt string escape sequence, which expands to the time in 24-hour HH:MM format
  • New `-A group/-g' option to complete and compgen; does group name completion
  • New [+-]O invocation option to set and unset `shopt' options at startup
  • ksh-like `ERR' trap

  • `for' loops now allow empty word lists after the `in' reserved word

  • new `hard' and `soft' arguments for the `ulimit' builtin
  • Readline can be configured to place the user at the same point on the line when retrieving commands from the history list
  • Readline can be configured to skip `hidden' files (filenames with a leading `.' on Unix) when performing completion

    Bash-2.05 introduced the following new features:

  • This version has once again reverted to using locales and strcoll(3) when processing pattern matching bracket expressions, as POSIX requires.
  • Added a new `--init-file' invocation argument as a synonym for `--rcfile', per the new GNU coding standards.
  • The /dev/tcp and /dev/udp redirections now accept service names as well as port numbers.
  • `complete' and `compgen' now take a `-o value' option, which controls some of the aspects of that compspec. Valid values are:

    default - perform bash default completion if programmable completion produces no matches
    dirnames - perform directory name completion if programmable completion produces no matches
    filenames - tell readline that the compspec produces filenames, so it can do things like append slashes to directory names and suppress trailing spaces

  • A new loadable builtin, realpath, which canonicalizes and expands symlinks in pathname arguments.
  • When `set' is called without options, it prints function definitions in a way that allows them to be reused as input. This affects `declare' and `declare -p' as well. This only happens when the shell is not in POSIX mode, since POSIX.2 forbids this behavior.

    Bash-2.04 introduced the following new features:

  • Programmable word completion with the new `complete' and `compgen' builtins; examples are provided in examples/complete/complete-examples
  • `history' has a new `-d' option to delete a history entry
  • `bind' has a new `-x' option to bind key sequences to shell commands
  • The prompt expansion code has new `\j' and `\l' escape sequences
  • The `no_empty_cmd_completion' shell option, if enabled, inhibits command completion when TAB is typed on an empty line
  • `help' has a new `-s' option to print a usage synopsis
  • New arithmetic operators: var++, var--, ++var, --var, expr1,expr2 (comma)
  • New ksh93-style arithmetic for command: for ((expr1 ; expr2; expr3 )); do list; done
  • `read' has new options: `-t', `-n', `-d', `-s'
  • The redirection code handles several filenames specially: /dev/fd/N, /dev/stdin, /dev/stdout, /dev/stderr
  • The redirection code now recognizes /dev/tcp/HOST/PORT and /dev/udp/HOST/PORT and tries to open a TCP or UDP socket, respectively, to the specified port on the specified host
  • The ${!prefix*} expansion has been implemented
  • A new FUNCNAME variable, which expands to the name of a currently-executing function
  • The GROUPS variable is no longer readonly
  • A new shopt `xpg_echo' variable, to control the behavior of echo with respect to backslash-escape sequences at runtime
  • The NON_INTERACTIVE_LOGIN_SHELLS #define has returned
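The special filenames are handled by bash itself during redirection, so they work even where the OS provides no such device files. A sketch (the host in the TCP example is a placeholder, not from the source):

```shell
#!/usr/bin/env bash
# /dev/stdout and /dev/stderr in redirections are intercepted by bash.
echo "to stdout" > /dev/stdout
echo "to stderr" > /dev/stderr

# /dev/fd/N refers to an open file descriptor; duplicate fd 1 as fd 5
# and write through it.
exec 5>&1
echo "via fd 5" > /dev/fd/5
exec 5>&-

# TCP socket redirection (sketch only; requires network access):
#   exec 3<>/dev/tcp/www.example.com/80
#   printf 'GET / HTTP/1.0\r\n\r\n' >&3
#   head -n 1 <&3
#   exec 3<&- 3>&-
```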

    The version of Readline released with Bash-2.04, Readline-4.1, had several new features as well:

  • Parentheses matching is always compiled into readline, and controllable with the new `blink-matching-paren' variable
  • The history-search-forward and history-search-backward functions now leave point at the end of the line when the search string is empty, like reverse-search-history, and forward-search-history
  • A new function for applications: rl_on_new_line_with_prompt()
  • New variables for applications: rl_already_prompted, and rl_gnu_readline_p

    Bash-2.03 had very few new features, in keeping with the convention that odd-numbered releases provide mainly bug fixes. A number of new features were added to Readline, mostly at the request of the Cygnus folks.

  • A new shopt option, `restricted_shell', so that startup files can test whether or not the shell was started in restricted mode
  • Filename generation is now performed on the words between ( and ) in compound array assignments (this is really a bug fix)
  • OLDPWD is now auto-exported, as POSIX.2 requires
  • ENV and BASH_ENV are read-only variables in a restricted shell
  • Bash may now be linked against an already-installed Readline library, as long as the Readline library is version 4 or newer
  • All shells begun with the `--login' option will source the login shell startup files, even if the shell is not interactive

    There were lots of changes to the version of the Readline library released along with Bash-2.03. For a complete list of the changes, read the file CHANGES in the Bash-2.03 distribution.

    Bash-2.02 contained the following new features:

  • a new version of malloc (based on the old GNU malloc code in previous bash versions) that is more page-oriented, more conservative with memory usage, does not `orphan' large blocks when they are freed, is usable on 64-bit machines, and has allocation checking turned on unconditionally
  • POSIX.2-style globbing character classes ([:alpha:], [:alnum:], etc.)
  • POSIX.2-style globbing equivalence classes
  • POSIX.2-style globbing collating symbols
  • the ksh [[...]] extended conditional command
  • the ksh egrep-style extended pattern matching operators
  • a new `printf' builtin
  • the ksh-like $(<filename) command substitution, which is equivalent to $(cat filename)
  • new tilde prefixes that expand to directories from the directory stack
  • new `**' arithmetic operator to do exponentiation
  • case-insensitive globbing (filename expansion)
  • menu completion a la tcsh
  • `magic-space' history expansion function like tcsh
  • the readline inputrc `language' has a new file inclusion directive ($include)
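A few of the Bash-2.02 features listed above are worth a quick illustration; a sketch (the temporary file name is arbitrary):

```shell
#!/usr/bin/env bash
tmp=$(mktemp)
printf 'one\ntwo\n' > "$tmp"

# $(<file) is equivalent to $(cat file) without the extra process.
contents=$(<"$tmp")
echo "$contents"

# ksh-style extended patterns (enabled with extglob):
shopt -s extglob
f=report.txt
[[ $f == !(*.bak) ]] && echo "not a backup file"

# POSIX.2 character classes in patterns:
case "abc123" in
  +([[:alpha:]])+([[:digit:]])) echo "letters then digits" ;;
esac

rm -f "$tmp"
```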

    Bash-2.01 contained only a few new features:

  • new `GROUPS' builtin array variable containing the user's group list
  • new bindable readline commands: history-and-alias-expand-line and alias-expand-line

    Bash-2.0 contained extensive changes and new features from bash-1.14.7. Here's a short list:

  • new `time' reserved word to time pipelines, shell builtins, and shell functions
  • one-dimensional arrays with a new compound assignment statement, appropriate expansion constructs, and modifications to some of the builtins (read, declare, etc.) to use them
  • new quoting syntaxes for ANSI-C string expansion and locale-specific string translation
  • new expansions to do substring extraction, pattern replacement, and indirect variable expansion
  • new builtins: `disown' and `shopt'
  • new variables: HISTIGNORE, SHELLOPTS, PIPESTATUS, DIRSTACK, GLOBIGNORE, MACHTYPE, BASH_VERSINFO
  • special handling of many unused or redundant variables removed (e.g., $notify, $glob_dot_filenames, $no_exit_on_failed_exec)
  • dynamic loading of new builtin commands; many loadable examples provided
  • new prompt expansions: \a, \e, \n, \H, \T, \@, \v, \V
  • history and aliases available in shell scripts
  • new readline variables: enable-keypad, mark-directories, input-meta, visible-stats, disable-completion, comment-begin
  • new readline commands to manipulate the mark and operate on the region
  • new readline emacs mode commands and bindings for ksh-88 compatibility
  • updated and extended builtins
  • new DEBUG trap
  • expanded (and now documented) restricted shell mode

    implementation stuff:

  • autoconf-based configuration
  • nearly all of the bugs reported since version 1.14 have been fixed
  • most builtins converted to use builtin `getopt' for consistency
  • most builtins use the -p option to display output in a reusable form (for consistency)
  • grammar tighter and smaller (66 reduce-reduce conflicts gone)
  • lots of code now smaller and faster
  • test suite greatly expanded

    B2) Are there any user-visible incompatibilities between bash-3.2 and bash-2.05b?

    There are a few incompatibilities between version 2.05b and version 3.2. They are detailed in the file COMPAT in the bash distribution. That file is not meant to be all-encompassing; send mail to the maintainer if you find something that's not mentioned there.

  • [Dec 11, 2007] Working more productively with bash 2.x-3.x by Ian Macdonald

    Set this to avoid having consecutive duplicate commands and other less useful information appended to the history list. This will cut down on hitting the up arrow endlessly to get to the command before the one you just entered twenty times. It will also avoid filling a large percentage of your history list with useless commands.

    Try this:

    $ export HISTIGNORE="&:ls:ls *:mutt:[bf]g:exit"

    Using this, consecutive duplicate commands, invocations of ls, executions of the mutt mail client without any additional parameters, plus calls to the bg, fg and exit built-ins will not be appended to the history list.
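A typical ~/.bashrc fragment combining these history settings might look like the following (the specific values are illustrative, not recommendations from the article):

```shell
# Illustrative ~/.bashrc history settings
export HISTCONTROL=ignoreboth             # skip duplicates and space-prefixed commands
export HISTIGNORE="&:ls:ls *:[bf]g:exit"  # patterns never appended to history
export HISTSIZE=10000                     # entries kept in memory
export HISTFILESIZE=20000                 # entries kept in ~/.bash_history
shopt -s histappend                       # append to the history file on exit
```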

    readline Tips and Tricks

    The readline library is used by bash and many other programs to read a line from the terminal, allowing the user to edit the line with standard Emacs editing keys.

    Advanced Bash-Scripting HOWTO by Mendel Cooper

    This is an older and more compact version of his Advanced Bash-Scripting Guide

    ! [Jun 10, 2007] Linux tip: Bash parameters and parameter expansions by Ian Shields

    Definitely a gifted author!

    Do you sometimes wonder how to use parameters with your scripts, and how to pass them to internal functions or other scripts? Do you need to do simple validity tests on parameters or options, or perform simple extraction and replacement operations on the parameter strings? This tip helps you with parameter use and the various parameter expansions available in the bash shell.

    William Park's BashDiff

    BashDiff is a patch against the Bash-3.0 shell, incorporating many useful features from Awk, Python, Zsh, Ksh, and others. Some features are implemented in the main core, and others as dynamically loadable builtins.

    Release focus: Major feature enhancements

    This release adds a shell interface to GTK+2 widget library, for building a simple GUI dialog or layout. It uses XML syntax for layout, and returns the user's selection in a shell variable or runs a shell command as callback. The name of the 'xml' builtin has been changed to 'expat'. The <<+ here document now removes space and tab indents.

    William Park
    [contact developer]

    [Mar 21, 2005] Project details for Advanced Bash Scripting Guide

    More bugfixes were made, some of them fairly important. New material was added, including a few rather useful example scripts. This is more than a "minor" update, but not quite a major one.

    The Advanced Bash Scripting Guide is both a reference and a tutorial on shell scripting. This comprehensive book (the equivalent of about 646 print pages) covers almost every aspect of shell scripting. It contains over 300 profusely commented illustrative examples, and a number of tables. Not just a shell scripting tutorial, this book also provides an introduction to basic programming techniques, such as sorting and recursion. It is well suited for either individual study or classroom use.

    GeSHi - Generic Syntax Highlighter Home

    Can be used for shell scripts and bash init files

    Bash Config Files

    How Bash executes startup files.

    For Login shells (subject to the -noprofile option):

    On logging in:
    If `/etc/profile' exists, then source it.

    If `~/.bash_profile' exists, then source it,
    else if `~/.bash_login' exists, then source it,
    else if `~/.profile' exists, then source it.

    On logging out:
    If `~/.bash_logout' exists, source it.

    For non-login interactive shells (subject to the -norc and -rcfile options):
    On starting up:
    If `~/.bashrc' exists, then source it.

    For non-interactive shells:
    On starting up:
    If the environment variable `ENV' is non-null, expand the variable and source the file named by the value. If Bash is not started in Posix mode, it looks for `BASH_ENV' before `ENV'.

    So, typically, your `~/.bash_profile' contains the line
    `if [ -f ~/.bashrc ]; then source ~/.bashrc; fi' after (or before) any login-specific initializations.

    If Bash is invoked as `sh', it tries to mimic the behavior of `sh' as closely as possible. For a login shell, it attempts to source only `/etc/profile' and `~/.profile', in that order. The `-noprofile' option may still be used to disable this behavior. A shell invoked as `sh' does not attempt to source any other startup files.

    When Bash is started in POSIX mode, as with the `-posix' command line option, it follows the Posix 1003.2 standard for startup files. In this mode, the `ENV' variable is expanded and that file sourced; no other startup files are read.

    My .bashrc can be found here.

    My .bash_profile can be found here.

    .inputrc (readline)

    Although the Readline library comes with a set of Emacs-like key bindings installed by default, it is possible that you would like to use a different set of keybindings. You can customize programs that use Readline by putting commands in an "init" file in your home directory. The name of this file is taken from the value of the shell variable `INPUTRC'. If that variable is unset, the default is `~/.inputrc'.

    When a program which uses the Readline library starts up, the init file is read, and the key bindings are set.

    In addition, the `C-x C-r' command re-reads this init file, thus incorporating any changes that you might have made to it.

    Conditional Init Constructs within readline

    Readline implements a facility similar in spirit to the conditional compilation features of the C preprocessor which allows key bindings and variable settings to be performed as the result of tests. There are three parser directives used.

    `$if' The `$if' construct allows bindings to be made based on the editing mode, the terminal being used, or the application using Readline. The text of the test extends to the end of the line; no characters are required to isolate it.
    `mode' The `mode=' form of the `$if' directive is used to test whether Readline is in `emacs' or `vi' mode. This may be used in conjunction with the `set keymap' command, for instance, to set bindings in the `emacs-standard' and `emacs-ctlx' keymaps only if Readline is starting out in `emacs' mode.
    `term' The `term=' form may be used to include terminal-specific key bindings, perhaps to bind the key sequences output by the terminal's function keys. The word on the right side of the `=' is tested against the full name of the terminal and the portion of the terminal name before the first `-'. This allows SUN to match both SUN and SUN-CMD, for instance.
    `application' The `$if application' form is used to include application-specific settings. Each program using the Readline library sets the application name, and you can test for it. This could be used to bind key sequences to functions useful for a specific program.
    `$endif' This command terminates an `$if' command.
    `$else' Commands in this branch of the `$if' directive are executed if the test fails.

    The following command adds a key sequence that quotes the current or previous word in Bash:
    $if bash
    # Quote the current or previous word
    "\C-xq": "\eb\"\ef\""
    $endif

    My .inputrc file is here

    Last update by Hermann Heimhardt on October 7, 2001

    Patch against 2.05 Bash to make cd take 2 Args like ksh

    From: Eli
    Subject: Patch against 2.05 Bash to make cd take 2 Args like ksh
    Date: Fri, 22 Jun 2001 14:38:17 -0400

    *** origBash/bash-2.05/builtins/cd.def  Wed Oct 11 11:10:20 2000
    --- bash-2.05/builtins/cd.def   Fri Jun 22 14:31:08 2001
    *** 187,192 ****
    --- 187,225 ----
            lflag = interactive ? LCD_PRINTPATH : 0;
    +   else if (list->next)
    +     {
    +       /* if next then 2 args, so replace in PWD arg1 with arg2 */
    +       int beginLen, oldLen, newLen, endLen;
    +       char *replace;
    +       path = get_string_value ("PWD");
    +       if (( replace = strstr( path,list->word->word)) == (char *)0 )
    +         {
    +          builtin_error ("Couldn't find arg1 in PWD");
    +          return (EXECUTION_FAILURE);
    +       }
    +       beginLen = replace - path;
    +       oldLen = strlen( list->word->word);
    +       newLen = strlen( list->next->word->word);
    +       endLen = strlen( path + beginLen + oldLen ) + 1 ;
    +       dirname = xmalloc( beginLen + newLen + endLen );
    +       /* copy path up to begining of string to replace */
    +       memcpy( dirname, path, beginLen );
    +       /* then add new replacement string */
    +       memcpy( dirname + beginLen, list->next->word->word, newLen );
    +       /* finally add end of path after replacement */
    +       memcpy( dirname + beginLen + newLen, path + beginLen + oldLen, endLen );
    +       printf("%s\n",dirname);
    +       if (change_to_directory (dirname, no_symlinks))
    +       {
    +          free(dirname);
    +          return (bindpwd (posixly_correct || no_symlinks));
    +       }
    +     }
        else if (absolute_pathname (list->word->word))
          dirname = list->word->word;
        else if (cdpath = get_string_value ("CDPATH"))

    [Nov 26, 2004] kcd Home Page

    kcd is a directory change utility under Linux or any other Unix clones. It helps you navigate the directory tree. You can supply the desired directory name in the command line and let kcd find it for you or let kcd show the entire directory tree and use arrow keys to go to the destination directory.

    Here is a list of some features available in kcd:

    kcd is available as a stable version and a development version. You can distinguish the development version from the stable version by looking at its version number. Beginning with version 5.0.0, any version x.y.z where y is even is a stable version; those where y is odd are development versions. Features currently present in the development version will eventually appear in the future stable version 8.0.0.

    kcd is distributed in source form under the GNU General Public License (GPL).

    The program and this web page are maintained by Kriang Lerdsuwanakij

    [Nov 15, 2004] Erwin Waterlander, WCD Wherever Change Directory

    Another Norton Change Directory (NCD) clone with more features.

    Wcd is a program to change directory fast. It saves time typing at the keyboard. One needs to type only a part of a directory name and wcd will jump to it. By default wcd searches for a directory with a name that begins with what has been typed, but the use of wildcards is also fully supported.
    For instance:

    wcd Desk

    will change to directory /home/waterlan/Desktop

    But also

    wcd *top

    will do that.

    Wcd is free to use and you can get the source code too.

    Some features of wcd:

  • Full screen interactive directory browser with speed search.
  • Present the user a list in case of multiple matches.
  • Wildcards *, ? and [SET] supported.
  • Directory stack, push pop.
  • Subdir definition possible. E.g. wcd subdira/subdirb
  • Long directory names support in Win95/98/NT DOS-box
  • Windows LAN UNC paths supported.
  • Change drive and directory at once.
  • Alias directories.
  • Ban directories.
  • 'cd' behaviour
  • Free portable source-code, no special libraries required
  • Multi platform:
    DOS 16 bit, DOS 32 bit, DOS bash, Windows 3.1/95/NT DOS-box, Cygwin bash, Unix ksh, csh, bash and zsh.
  • Wcd has been tested on: FreeDOS, MS-DOS 6.2, Win95, Win98, Windows NT 4.0, Linux, FreeBSD, HP-UX, SunOS, Solaris, SGI IRIX. Wcd works on any PC and can be ported to any Unix system.

    WCD is free software, distributed under GNU General Public License.

    [Nov 11, 2004] Freeware List for SPARC

    [Nov 5, 2004] Bash 3.0 now has a debugger

    This is a terse description of the new features added to bash-3.0 since the release of bash-2.05b. As always, the manual page (doc/bash.1) is the place to look for complete descriptions.

    cc. The [[ ... ]] command has a new binary `=~' operator that performs extended regular expression (egrep-like) matching.

    l. New invocation option: --debugger. Enables debugging and turns on new `extdebug' shell option.

    f. HISTCONTROL may now include the `erasedups' option, which causes all lines matching a line being added to be removed from the history list.

    j. for, case, select, arithmetic commands now keep line number information for the debugger.

    p. `declare -F' now prints out extra line number and source file information if the `extdebug' option is set.

    r. New `caller' builtin to provide a call stack for the bash debugger.

    t. `for', `select', and `case' command heads are printed when `set -x' is enabled.

    u. There is a new {x..y} brace expansion, which is shorthand for {x, x+1, x+2, ..., y}. x and y can be integers or single characters; the sequence may ascend or descend; the increment is always 1.

    v. New ksh93-like ${!array[@]} expansion, expands to all the keys (indices) of array.

    z. New `-o plusdirs' option to complete and compgen; if set, causes directory name completion to be performed and the results added to the rest of the possible completions.

    ee. Subexpressions matched by the =~ operator are placed in the new BASH_REMATCH array variable.

    gg. New `set -o pipefail' option that causes a pipeline to return a failure status if any of the processes in the pipeline fail, not just the last one.

    kk. The `\W' prompt expansion now abbreviates $HOME as `~', like `\w'.

    ll. The error message printed when bash cannot open a shell script supplied as argument 1 now includes the name of the shell, to better identify the error as coming from bash.
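The =~ operator with BASH_REMATCH makes simple parsing jobs easy, and `caller' is the primitive the debugger's call stack is built on. A minimal sketch (bash 3.0+; storing the regex in a variable also keeps it working under bash 3.2's quoting rules):

```shell
#!/usr/bin/env bash
re='([0-9]{4})-([0-9]{2})-([0-9]{2})'
if [[ "2004-11-05" =~ $re ]]; then
  echo "year:  ${BASH_REMATCH[1]}"   # prints: year:  2004
  echo "month: ${BASH_REMATCH[2]}"   # prints: month: 11
  echo "day:   ${BASH_REMATCH[3]}"   # prints: day:   05
fi

# `caller' reports the caller's line number, function name, and file.
f() { caller 0; }
g() { f; }
g
```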

    2. New Features in Readline

    a. History expansion has a new `a' modifier equivalent to the `g' modifier for compatibility with the BSD csh.

    b. History expansion has a new `G' modifier equivalent to the BSD csh `g' modifier, which performs a substitution once per word.

    c. All non-incremental search operations may now undo the operation of replacing the current line with the history line.

    d. The text inserted by an `a' command in vi mode can be reinserted with `.'.

    e. New bindable variable, `show-all-if-unmodified'. If set, the readline completer will list possible completions immediately if there is more than one completion and partial completion cannot be performed.

    g. History list entries now contain timestamp information; the history file functions know how to read and write timestamp information associated with each entry.

    n. When listing completions, directories have a `/' appended if the `mark-directories' option has been enabled.

    Not much changed (Score:5, Insightful)
    by opk (149665) on Thursday July 29, @10:28AM (#9831965)
    Doesn't seem to be much changed given the version number increase. [[ =~ ]] can match regexes and it can do zsh style {1..3} expansions. Improved multibyte support too. There were bigger changes in some of the 2.0x updates.
    • Re:First "zsh rules" post! by Anonymous Coward (Score:3) Thursday July 29, @10:30AM
        Re:First "zsh rules" post! (Score:5, Informative)
        by opk (149665) on Thursday July 29, @10:46AM (#9832230)
        Globs are more powerful: **/*.c will recursively search for .c files: much quicker to type than find.
        You can match file types: e.g. *(@) will get you symlinks. *(U) gets files owned by you.

        Syntax for alternation is a lot easier. No @(this|that) or !(*.f). Instead, it is (this|that) and ^*.f

        Next point is completion. It includes a vast range of definitions so completion works well for lots of commands. The completion system handles completing parts of words so it better handles user@host completion. You get descriptions with completion match listings. Completion also has a really powerful context sensitive configuration system so you can make it work the way you like.

        It has modules. For running a simple shell script it will actually use less space than bash because it doesn't need to load the line editor and other interactive related code into memory.

        There is much much more. It takes a while to learn everything but if you just enable the completion functions (autoload -U compinit; compinit) you'll find it better than bash or tcsh from day 1.

    Re:Just wondering... (Score:5, Informative)
    by opk (149665) on Thursday July 29, @11:05AM (#9832448)
    Zsh is still the best.

    Bash developers have different priorities.
    Bash became the default primarily because it is GNU.
    Zsh has some ugly but powerful features like nested expansions. The two areas where bash is better than zsh are multibyte support and POSIX compliance. Much of that was contributed by IBM and Apple respectively. But if you use the shell a lot, you'll find zsh does a lot of things better. The completion is amazing. And when it isn't emulating sh/posix, it fixes some of the broken design decisions (like word splitting of variables) which saves you from doing stupid things.

    The FSF actually does development in a very closed manner when it can (the gcc egcs split was partly because of this). Bash is a good example of this. That perhaps a good thing because it is probably good that bash doesn't get some of zsh's nasty (but powerful) features. And if zsh didn't exist, bash might have been forked by now. If you care about your shell, you'll find much more of a community on the zsh lists than the spam filled bug-bash list. You can't even get at alpha releases of bash without being one of the chosen few.

    Can arrow key history be like Matlab's? (Score:3, Interesting)
    by dara (119068) on Thursday July 29, @10:54AM (#9832323)
    I read the announcement and it mentions "History traversal with arrow keys", but what I would really like doesn't seem to be mentioned (but perhaps it is possible with bash-2.05, I'm not much of a shell expert). In Matlab, the up-arrow key searches the history for commands that match all the characters on the line. No characters and it acts like a normal bash arrow, if "figure, plot" is at the beginning of the line, it will quickly scroll through all plotting commands that have been entered at the shell.

    Any idea if this is possible?

    Dara Parsavand

    Re:Can arrow key history be like Matlab's? (Score:2)
    by Atzanteol (99067) on Thursday July 29, @11:10AM (#9832500)
    Try 'ctrl+r'. Not *exactly* what you're looking for, but it lets you search through your history.

    i.e. on an empty line hit ctrl+r, then start typing.

    [ Parent ]
    Re:Can arrow key history be like Matlab's? (Score:3, Informative)
    by Anonymous Coward on Thursday July 29, @12:38PM (#9833770)
    cat >> ~/.inputrc
    "\e[A": history-search-backward
    "\e[B": history-search-forward
    Re:Can arrow key history be like Matlab's? (Score:1, Informative)
    by Anonymous Coward on Thursday July 29, @01:17PM (#9834391)
    Here you are,put this in .inputrc:

    "\e[6~": history-search-forward
    "\e[5~": history-search-backward

    and use page-up/page-down to search

    I can live without it since 4DOS

    Autocompletion like W2K? (Score:2)
    by tstoneman (589372) on Thursday July 29, @01:12PM (#9834316)
    I like bash, but the one thing that it doesn't support (out-of-the-box anyway) is auto-completion a la W2K. In NT, when you hit tab, you can cycle through all the words that can complete the letters you typed... on bash, it shows you a list.

    Is there a way to make bash behave more like W2K in this sense?

    bash completion getting better (Score:3, Informative)
    by AT (21754) on Thursday July 29, @11:08AM (#9832478)
    The completion ability of bash has been steadily improving. There is a nice script here [] that sets up a lot of good completion rules for bash.
    Re:First "zsh rules" post! (Score:2)
    by Just Some Guy (3352) <`kirk+slashdot' `at' `'> on Thursday July 29, @05:30PM (#9837797)
    ( | Last Journal: Monday September 27, @09:09AM)
    That's just... I mean... Wow. Really. I just patched my .zshenv with

    - export SSH_AUTH_SOCK=$(find 2>/dev/null /tmp -maxdepth 2 -type s -user $USER -regex '/tmp/ssh-.*/agent\..*')
    + export SSH_AUTH_SOCK=$(echo /tmp/ssh-*/agent.*(=UN))

    That's sweet. Thanks for the pointer! The more I learn about zsh, the more I love it.

    Re:First "zsh rules" post! (Score:1)
    by sorbits (516598) on Saturday July 31, @01:09PM (#9853305)
    I will certainly give it a try then!

    Until now I have stuck with tcsh for one single reason: history substitution []!

    Basically it lets me insert text from my history (including the current line) using few symbols (e.g. !$ is the last argument of the previous line) -- it's extremely powerful, e.g. it allows to search in the history and can do substitutions in results, or head/tail for paths etc.

    I use it a lot to keep down the number of characters I need to type, and I have even assigned hotkeys to some of the substitutions I use the most.

    This is really the make-or-break feature for whether or not I want to use a shell, so I really hope zsh has something similar!?!
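
    Both zsh and bash do support csh-style history expansion (!!, !$, ^old^new) at an interactive prompt. In scripts, where history expansion is off by default, bash's special parameter $_ (the last argument of the previous command) covers the most common use of !$; a runnable sketch:

```shell
#!/bin/bash
# $_ expands to the last argument of the previously executed command,
# which is what csh's !$ is most often used for
mkdir -p /tmp/histdemo
cd "$_"        # changes into /tmp/histdemo
pwd
```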

    Re:First "zsh rules" post! (Score:5, Informative)
    by Just Some Guy (3352) <`kirk+slashdot' `at' `'> on Thursday July 29, @10:50AM (#9832279)
    ( | Last Journal: Monday September 27, @09:09AM)
    Big ones for me:
    • A sane auto-completion system. That is, "cvs <tab>" gives a list of all of the commands that cvs understands. "cvs -<tab>" (same as above but tabbing after typing "-") gives a list of all of the options that cvs understands. These are good things. Now, in fairness, bash also has a command completion library. Unfortunately, it's implemented as a huge set of Bash functions. In zsh, "set|wc" returns 179 lines. In bash, "set|wc" returns 3,961 lines. The net effect is that zsh's system is noticeably faster and less polluting to the local environment.
    • Modules. Wrappers for TCP connections, a built-in cron thingy, and PCRE are all loadable modules to do tricky things easily.
    • Lots of pre-defined things. Load the "colors" and "zsh/terminfo" modules and you get defined associative arrays like $fg, which emits terminal-appropriate escape codes to set the foreground color of printed text. The command "echo ${fg[red]}red text${fg[default]}normal text" prints "red text" in red, and "normal text" in your default color.

    Bash is a good shell, and I have nothing bad to say about it. However, zsh seems to have been designed from the ground up by power users and for power users. I absolutely love it, and everyone I've given an example config file to (to get them running with little hassle) has permanently switched.

    Re:First "zsh rules" post! (Score:5, Informative)
    by Just Some Guy (3352) <`kirk+slashdot' `at' `'> on Thursday July 29, @11:21AM (#9832638)
    ( | Last Journal: Monday September 27, @09:09AM)
    As the maintainer of FreeBSD's bash-completion [] port, I'm reasonably familiar with it. Yes, it's approximately as powerful as zsh's completion module. Still, have you ever looked at it? It's a giant set of defined functions and glue. Seriously, get to a bash prompt and type "set" to see all of the things that've been stuffed into your shell's namespace. Now, try that with zsh and be pleasantly surprised.

    As I said in another post, a big side effect is that zsh's completions seem to be much faster than bash's. That alone is worth the price of admission for me.

    Re:Dear Apple haters... (Score:5, Informative)
    by Jahf (21968) on Thursday July 29, @10:58AM (#9832361)
    (Last Journal: Thursday August 05, @03:55PM)
    Believe it or not, -most- of the large companies that use GPL'ed tools give back to the community.

    Apple has done numerous fixes, not just on BASH.

    Sun (disclaimer: for whom I work) has done -tons- of work on GNOME, Mozilla and don't forget Open Office (just to name a few).

    IBM works on many projects and gives back ... plus contributing all new things like JFS.

    All the distro makers like Red Hat, Novell, etc give back tons.

    Each of those companies pays engineers to fix pieces not done in open-source projects, as well as to extend them for their customers. The patches are covered under the GPL just like the main code, and these companies know it, yet knowingly dedicate serious money and hours to these projects. And then they satisfy the GPL by putting them out on source CDs or submitting them back to the main projects.

    The big problem for getting submitted code accepted is that these companies are usually fixing and developing on a codebase that is aging. For instance, Sun did numerous I18N fixes for GNOME 2.6, but by the time they were ready the main GNOME organization had moved on to 2.8. That means there is a disconnect between the two and the changes have to be ported forward before they will hit the main code branch. The same problem can happen with kernel patches and just about any other codebase that changes versions so quickly.

    Sorry, you were doing the good thing and pointing out Apple's contributions. But so many people think these companies violate the GPL (in spirit if not in law) when they are in fact very large contributors to open source. Sure, some do, and the community usually finds out about it and shames them into minimal compliance (Linksys and Sveasoft come to mind after my delving into alternate WRT54G firmwares last night), but generally speaking the big companies have been a good part of the community.

    Re:On the list of changes: (Score:5, Informative)
    by Prowl (554277) on Thursday July 29, @11:12AM (#9832526)
    GNU or Unix would seem to be the most appropriate

    bash has been around since 1989 (according to the copyright on the man page). Linux 1.0 came around 5 years later.

    The editors should know better, unless they're intentionally trying to piss off RMS

    Looks great, but prefer Ash for scripts (Score:3, Interesting)
    by Etcetera (14711) * <<cleaver> <at> <>> on Thursday July 29, @10:41AM (#9832171)

    Looks like a nice Unicode-savvy release that should help with dealing with international languages at the command line. And yay to Apple for giving back (again). When will people finally accept that Apple is indeed helping out the OSS community through GCC, bash, and other tools...?

    Kind of off-topic, but for speed purposes in scripts that have to run fast, I find nothing better or more convenient than Ash, especially on systems where /bin/sh is a symlink to /bin/bash.

    Does anyone know any history on this shell? Is it a clone of the original bourne shell or of bash? I can't seem to find anything useful on Google ...

    Re:Looks great, but prefer Ash for scripts (Score:2)
    by mihalis (28146) on Thursday July 29, @10:52AM (#9832301)
    As I understand it, ash was written by Kenneth Almquist. I used to see his name on some of the Ada related mailing lists and newsgroups.
    Re:Looks great, but prefer Ash for scripts (Score:2, Informative)
    by Stephen Williams (23750) on Thursday July 29, @11:46AM (#9832931)
    ( | Last Journal: Thursday December 05, @05:02AM)
    Ash (or dash, as it's called nowadays) is a Linux port of NetBSD's /bin/sh. It's a POSIX shell with separate lineage from bash.

    It's /bin/sh on my system too. Faster and smaller than bash; watch those configure scripts fly!


    Some people don't agree. (Score:2)
    by emil (695) on Thursday July 29, @12:48PM (#9833940)
    Ash appears to consume large amounts of memory, and some people in BSD circles have serious objections to it.

    See the discussion here [] (scroll down a bit into the postings). I don't have an opinion on the issue one way or another.

    And can I set up bash so I can, for instance, move from rc2.d to rc3.d by typing

    $ cd 2 3
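
    Not out of the box, but ksh's two-argument cd (which substitutes new for old in $PWD) is easy to approximate with a small bash function; a sketch (the name cdsub is made up here):

```shell
# cdsub OLD NEW — cd to the directory obtained by replacing the first
# occurrence of OLD with NEW in the current path (approximates ksh's `cd old new`)
cdsub() {
    cd "${PWD/$1/$2}" || return
    pwd
}

# e.g. from .../rc2.d, `cdsub 2 3` would move to .../rc3.d
```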

    [May 16, 2004] Project details for BASH Debugger

    BASH Debugger provides a patched BASH that enables better debugging support as well as improved error reporting. It also contains the most comprehensive source code debugger for BASH that has been written, and it can serve as a springboard for other experimental features (such as a timestamped history file).
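
    Stock bash already exposes the hooks such a debugger is built on; a minimal tracing sketch using the DEBUG trap (which fires before every simple command):

```shell
#!/bin/bash
# Print each command before it runs, with its line number.
# $BASH_COMMAND and $LINENO are set by bash itself.
trap 'echo "+ line $LINENO: $BASH_COMMAND" >&2' DEBUG

x=1
x=$((x + 1))
echo "x=$x"
```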

    Recommended Links

    Softpanorama hot topic of the month

    Softpanorama Recommended

    Please visit Heiner Steven's SHELLdorado, the best shell scripting site on the Internet.
    Please visit the nixCraft blog.



    Examples shipped with bash 3.2 and newer
    Path Description X-ref
    ./bashdb Deprecated sample implementation of a bash debugger.  
    ./complete Shell completion code.  
    ./functions Example functions.  
    ./functions/array-stuff Various array functions (ashift, array_sort, reverse).  
    ./functions/array-to-string Convert an array to a string.  
    ./functions/autoload An almost ksh-compatible 'autoload' (no lazy load). ksh
    ./functions/autoload.v2 An almost ksh-compatible 'autoload' (no lazy load). ksh
    ./functions/autoload.v3 A more ksh-compatible 'autoload' (with lazy load). ksh
    ./functions/basename A replacement for basename(1). basename
    ./functions/basename2 Fast basename(1) and dirname(1) functions for bash/sh. basename, dirname
    ./functions/coproc.bash Start, control, and end co-processes.  
    ./functions/coshell.bash Control shell co-processes (see coprocess.bash).  
    ./functions/coshell.README README for coshell and coproc.  
    ./functions/csh-compat A C-shell compatibility package. csh
    ./functions/dirfuncs Directory manipulation functions from the book The Korn Shell.  
    ./functions/dirname A replacement for dirname(1). dirname
    ./functions/emptydir Find out if a directory is empty.  
    ./functions/exitstat Display the exit status of processes.  
    ./functions/external Like command, but forces the use of external command.  
    ./functions/fact Recursive factorial function.  
    ./functions/fstty Front-end to sync TERM changes to both stty(1) and readline 'bind'. stty.bash
    ./functions/func Print out definitions for functions named by arguments.  
    ./functions/gethtml Get a web page from a remote server (wget(1) in bash).  
    ./functions/getoptx.bash getopt function that parses long-named options.  
    ./functions/inetaddr Internet address conversion (inet2hex and hex2inet).  
    ./functions/inpath Return zero if the argument is in the path and executable. inpath
    ./functions/isnum.bash Test user input on numeric or character value.  
    ./functions/isnum2 Test user input on numeric values, with floating point.  
    ./functions/isvalidip Test user input for valid IP addresses.  
    ./functions/jdate.bash Julian date conversion.  
    ./functions/jj.bash Look for running jobs.  
    ./functions/keep Try to keep some programs in the foreground and running.  
    ./functions/ksh-cd ksh-like cd: cd [-LP] [dir[change]]. ksh
    ./functions/ksh-compat-test ksh-like arithmetic test replacements. ksh
    ./functions/kshenv Functions and aliases to provide the beginnings of a ksh environment for bash ksh
    ./functions/login Replace the login and newgrp built-ins in old Bourne shells.  
    ./functions/lowercase Rename files to lowercase. rename lower
    ./functions/manpage Find and print a manpage. fman
    ./functions/mhfold Print MH folders, useful only because folders(1) doesn't print mod date/times.  
    ./functions/notify.bash Notify when jobs change status.  
    ./functions/pathfuncs Path related functions (no_path, add_path, pre-path, del_path). path
    ./functions/README README  
    ./functions/recurse Recursive directory traverser.  
    ./functions/repeat2 A clone of the C shell built-in repeat. repeat, csh
    ./functions/repeat3 A clone of the C shell built-in repeat. repeat, csh
    ./functions/seq Generate a sequence from m to n;m defaults to 1.  
    ./functions/seq2 Generate a sequence from m to n;m defaults to 1.  
    ./functions/shcat Readline-based pager. cat, readline pager
    ./functions/shcat2 Readline-based pagers. cat, readline pager
    ./functions/sort-pos-params Sort the positional parameters.  
    ./functions/substr A function to emulate the ancient ksh built-in. ksh
    ./functions/substr2 A function to emulate the ancient ksh built-in. ksh
    ./functions/term A shell function to set the terminal type interactively or not.  
    ./functions/whatis An implementation of the 10th Edition Unix sh built-in whatis(1) command.  
    ./functions/whence An almost ksh-compatible whence(1) command.  
    ./functions/which An emulation of which(1) as it appears in FreeBSD.  
    ./functions/xalias.bash Convert csh alias commands to bash functions. csh, aliasconv
    ./functions/xfind.bash A find(1) clone.  
    ./loadables/ Example loadable replacements.  
    ./loadables/basename.c Return nondirectory portion of pathname. basename
    ./loadables/cat.c cat(1) replacement with no options—the way cat was intended. cat, readline pager
    ./loadables/cut.c cut(1) replacement.  
    ./loadables/dirname.c Return directory portion of pathname. dirname
    ./loadables/finfo.c Print file info.  
    ./loadables/getconf.c POSIX.2 getconf utility.
    ./loadables/getconf.h Replacement definitions for ones the system doesn't provide.  
    ./loadables/head.c Copy first part of files.  
    ./loadables/hello.c Obligatory "Hello World" / sample loadable.  
    ./loadables/id.c POSIX.2 user identity.  
    ./loadables/ln.c Make links.  
    ./loadables/logname.c Print login name of current user.  
    ./loadables/ Simple makefile for the sample loadable built-ins.  
    ./loadables/mkdir.c Make directories.  
    ./loadables/necho.c echo without options or argument interpretation.  
    ./loadables/pathchk.c Check pathnames for validity and portability.  
    ./loadables/print.c Loadable ksh-93 style print built-in.  
    ./loadables/printenv.c Minimal built-in clone of BSD printenv(1).  
    ./loadables/push.c Anyone remember TOPS-20?  
    ./loadables/README README  
    ./loadables/realpath.c Canonicalize pathnames, resolving symlinks.  
    ./loadables/rmdir.c Remove directory.  
    ./loadables/sleep.c Sleep for fractions of a second.  
    ./loadables/strftime.c Loadable built-in interface to strftime(3).  
    ./loadables/sync.c Sync the disks by forcing pending filesystem writes to complete.  
    ./loadables/tee.c Duplicate standard input.  
    ./loadables/template.c Example template for loadable built-in.  
    ./loadables/truefalse.c True and false built-ins.  
    ./loadables/tty.c Return terminal name.  
    ./loadables/uname.c Print system information.  
    ./loadables/unlink.c Remove a directory entry.  
    ./loadables/whoami.c Print out username of current user.  
    ./loadables/perl/ Illustrates how to build a Perl interpreter into bash.  
    ./misc Miscellaneous  
    ./misc/aliasconv.bash Convert csh aliases to bash aliases and functions. csh, xalias
    ./misc/ Convert csh aliases to bash aliases and functions. csh, xalias
    ./misc/cshtobash Convert csh aliases, environment variables, and variables to bash equivalents. csh, xalias
    ./misc/README README  
    ./misc/suncmd.termcap SunView TERMCAP string.  
    ./obashdb Modified version of the Korn Shell debugger from Bill Rosenblatt's Learning the Korn Shell.
    ./scripts.noah Noah Friedman's collection of scripts (updated to bash v2 syntax by Chet Ramey).  
    ./scripts.noah/aref.bash Pseudo-arrays and substring indexing examples.  
    ./scripts.noah/bash.sub.bash Library functions used by require.bash.  
    ./scripts.noah/bash_version.bash A function to slice up $BASH_VERSION.  
    ./scripts.noah/meta.bash Enable and disable eight-bit readline input.  
    ./scripts.noah/mktmp.bash Make a temporary file with a unique name.  
    ./scripts.noah/number.bash A fun hack to translate numerals into English.  
    ./scripts.noah/PERMISSION Permissions to use the scripts in this directory.  
    ./scripts.noah/prompt.bash A way to set PS1 to some predefined strings.  
    ./scripts.noah/README README  
    ./scripts.noah/remap_keys.bash A front end to bind to redo readline bindings. readline
    ./scripts.noah/require.bash Lisp-like require/provide library functions for bash.  
    ./scripts.noah/send_mail. Replacement SMTP client written in bash.  
    ./scripts.noah/shcat.bash bash replacement for cat(1). cat
    ./scripts.noah/source.bash Replacement for source that uses current directory.  
    ./scripts.noah/string.bash The string(3) functions at the shell level.  
    ./scripts.noah/stty.bash Front-end to stty(1) that changes readline bindings too. fstty
    ./scripts.noah/y_or_n_p.bash Prompt for a yes/no/quit answer. ask
    ./scripts.v2 John DuBois' ksh script collection (converted to bash v2 syntax by Chet Ramey).  
    ./scripts.v2/arc2tarz Convert an arc archive to a compressed tar archive.  
    ./scripts.v2/bashrand Random number generator with upper and lower bounds and optional seed. random
    ./scripts.v2/cal2day.bash Convert a day number to a name.  
    ./scripts.v2/cdhist.bash cd replacement with a directory stack added.  
    ./scripts.v2/corename Tell what produced a core file.  
    ./scripts.v2/fman Fast man(1) replacement. manpage
    ./scripts.v2/frcp Copy files using ftp(1) but with rcp-type command-line syntax.  
    ./scripts.v2/lowercase Change filenames to lowercase. rename lower
    ./scripts.v2/ncp A nicer front end for cp(1) (has -i, etc.).  
    ./scripts.v2/newext Change the extension of a group of files. rename
    ./scripts.v2/nmv A nicer front end for mv(1) (has -i, etc.). rename
    ./scripts.v2/pages Print specified pages from files.  
    ./scripts.v2/PERMISSION Permissions to use the scripts in this directory.  
    ./scripts.v2/pf A pager front end that handles compressed files.  
    ./scripts.v2/pmtop Poor man's top(1) for SunOS 4.x and BSD/OS.  
    ./scripts.v2/README README  
    ./scripts.v2/ren Rename files by changing parts of filenames that match a pattern. rename
    ./scripts.v2/rename Change the names of files that match a pattern. rename
    ./scripts.v2/repeat Execute a command multiple times. repeat
    ./scripts.v2/shprof Line profiler for bash scripts.  
    ./scripts.v2/untar Unarchive a (possibly compressed) tarfile into a directory.  
    ./scripts.v2/uudec Carefully uudecode(1) multiple files.  
    ./scripts.v2/uuenc uuencode(1) multiple files.  
    ./scripts.v2/vtree Print a visual display of a directory tree. tree
    ./scripts.v2/where Show where commands that match a pattern are.  
    ./scripts Example scripts.  
    ./scripts/ Text adventure game in bash!  
    ./scripts/ Bourne shell's C shell emulator. csh
    ./scripts/ Readline-based pager. cat, readline pager
    ./scripts/center Center a group of lines.  
    ./scripts/ Line editor using only /bin/sh, /bin/dd, and /bin/rm.  
    ./scripts/fixfiles.bash Recurse a tree and fix files containing various bad characters.  
    ./scripts/hanoi.bash The inevitable Towers of Hanoi in bash.  
    ./scripts/inpath Search $PATH for a file the same name as $1; return TRUE if found. inpath
    ./scripts/krand.bash Produces a random number within integer limits. random
    ./scripts/line-input.bash Line input routine for GNU Bourne Again Shell plus terminal-control primitives.  
    ./scripts/nohup.bash bash version of nohup command.  
    ./scripts/precedence Test relative precedences for && and || operators.  
    ./scripts/randomcard.bash Print a random card from a card deck. random
    ./scripts/README README  
    ./scripts/scrollbar Display scrolling text.  
    ./scripts/scrollbar2 Display scrolling text.  
    ./scripts/self-repro A self-reproducing script (careful!).  
    ./scripts/showperm.bash Convert ls(1) symbolic permissions into octal mode.  
    ./scripts/shprompt Display a prompt and get an answer satisfying certain criteria. ask
    ./scripts/spin.bash Display a spinning wheel to show progress.  
    ./scripts/timeout Give rsh(1) a shorter timeout.  
    ./scripts/vtree2 Display a tree printout of the directory with disk use in 1k blocks. tree
    ./scripts/vtree3 Display a graphical tree printout of dir. tree
    ./scripts/vtree3a Display a graphical tree printout of dir. tree
    ./scripts/ A web server in bash!  
    ./scripts/xterm_title Print the contents of the xterm title bar.  
    ./scripts/zprintf Emulate printf (obsolete since printf is now a bash built-in).  
    ./startup-files Example startup files.  
    ./startup-files/Bash_aliases Some useful aliases (written by Fox).  
    ./startup-files/Bash_profile Sample startup file for bash login shells (written by Fox).  
    ./startup-files/bash-profile Sample startup file for bash login shells (written by Ramey).  
    ./startup-files/bashrc Sample Bourne Again Shell init file (written by Ramey).  
    ./startup-files/Bashrc.bfox Sample Bourne Again Shell init file (written by Fox).  
    ./startup-files/README README  
    ./startup-files/apple Example startup files for Mac OS X.  
    ./startup-files/apple/aliases Sample aliases for Mac OS X.  
    ./startup-files/apple/bash.defaults Sample User preferences file.  
    ./startup-files/apple/environment Sample Bourne Again Shell environment file.  
    ./startup-files/apple/login Sample login wrapper.  
    ./startup-files/apple/logout Sample logout wrapper.  
    ./startup-files/apple/rc Sample Bourne Again Shell config file.  
    ./startup-files/apple/README README

    Things bash has or uses that ksh88 does not:

    long invocation options

    [-+]O invocation option

    `!' reserved word

    arithmetic for command: for ((expr1 ; expr2; expr3 )); do list; done

    posix mode and posix conformance

    command hashing

    tilde expansion after ':' and '=' in assignment statements that look like PATH assignments (e.g. PATH=~/bin:$PATH)

    process substitution with named pipes if /dev/fd is not available

    the ${!param} indirect parameter expansion operator

    the ${!param*} prefix expansion operator

    the ${param:offset[:length]} parameter substring operator

    the ${param/pat[/string]} parameter pattern substitution operator






    shell variables: GROUPS, FUNCNAME, histchars, auto_resume

    prompt expansion with backslash escapes and command substitution

    redirection: &> (stdout and stderr)

    more extensive and extensible editing and programmable completion

    builtins: bind, builtin, command, declare, dirs, echo -e/-E, enable,

      exec -l/-c/-a, fc -s, export -n/-f/-p, hash, help, history,

      jobs -x/-r/-s, kill -s/-n/-l, local, logout, popd, pushd,

      read -e/-p/-a/-t/-n/-d/-s, readonly -a/-n/-f/-p,

      set -o braceexpand/-o histexpand/-o interactive-comments/

      -o notify/-o physical/-o posix/-o hashall/-o onecmd/

      -h/-B/-C/-b/-H/-P, set +o, suspend, trap -l, type,

      typeset -a/-F/-p, ulimit -u, umask -S, alias -p, shopt,

      disown, printf, complete, compgen

    `!' csh-style history expansion

    POSIX.2-style globbing character classes

    POSIX.2-style globbing equivalence classes

    POSIX.2-style globbing collating symbols

    egrep-like extended pattern matching operators

    case-insensitive pattern matching and globbing

    `**' arithmetic operator to do exponentiation

    redirection to /dev/fd/N, /dev/stdin, /dev/stdout, /dev/stderr

    arrays of unlimited size
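
    A few of the expansions listed above, in one runnable sketch (all of this works in bash 3.x):

```shell
#!/bin/bash
path=/usr/local/bin

echo "${path:5:5}"           # ${param:offset:length} substring -> local
echo "${path/local/share}"   # ${param/pat/string} substitution -> /usr/share/bin

var=path
echo "${!var}"               # ${!param} indirect expansion -> /usr/local/bin

for (( i = 0; i < 4; i++ )); do   # arithmetic for command
    echo $(( 2 ** i ))            # ** exponentiation: 1 2 4 8
done
```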

    Things ksh88 has or uses that bash does not:

    tracked aliases (alias -t)


    co-processes (|&, >&p, <&p)

    weirdly-scoped functions

    typeset +f to list all function names without definitions

    text of command history kept in a file, not memory

    builtins: alias -x, cd old new, fc -e -, newgrp, print,

      read -p/-s/-u/var?prompt, set -A/-o gmacs/

      -o bgnice/-o markdirs/-o nolog/-o trackall/-o viraw/-s,

      typeset -H/-L/-R/-Z/-A/-ft/-fu/-fx/-l/-u/-t, whence

    using environment to pass attributes of exported variables

    arithmetic evaluation done on arguments to some builtins

    reads .profile from $PWD when invoked as login shell

    Implementation differences:

    Bash Scripting

    Function                     Korn shell                     Bash
    simple output                print                          echo
    discipline functions         yes                            no
    POSIX character classes      yes                            no
    help                         no                             yes
    'cd' spelling correction     no                             yes
    arithmetic (C-style) for     yes                            no
    arithmetic bases             2-36                           2-64
    array initialization         set -A USERVAR value1 ....     USERVAR=(value1 ....)
    array size                   limited                        unlimited
    associative arrays           yes                            no
    compound arrays              yes                            no
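
    The bash column of the array rows above, as a runnable sketch (the ksh-style `set -A` form will not run in bash):

```shell
#!/bin/bash
# bash array initialization and the usual accessors (zero-based indexing)
fruits=(apple banana cherry)

echo "${fruits[1]}"       # second element: banana
echo "${#fruits[@]}"      # number of elements: 3
echo "${fruits[@]}"       # all elements

fruits[3]=date            # extend the array; size is not limited
echo "${#fruits[@]}"      # now 4
```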


    [gnu.bash.bug] BASH Frequently-Asked Questions (FAQ version 3.29)

    This is the Bash FAQ, version 3.29, for Bash version 3.0.

    Random Findings

    Re Suggestions for corrections to executable.el - use of PATHEXT

    From: Lennart Borgman
    Subject: Re: Suggestions for corrections to executable.el - use of PATHEXT
    Date: Sun, 12 Sep 2004 12:56:08 +0200

    From: "Eli Zaretskii" <>
    > First, I'm not sure we should look at PATHEXT.  That variable is AFAIK
    > looked at by the shell, so if we want Emacs behave _exactly_ like the
    > shell does, we should at least look at the value of SHELL and/or
    > ComSpec (and COMSPEC for older systems).  I mean, what if the user's
    > shell is Bash, which AFAIK doesn't look at PATHEXT at all?  And if the
    > shell is COMMAND.COM, then ".cmd" should not be in the list.  Etc.,
    > etc.
    PATHEXT is looked at by cmd.exe (the default shell on the NT hereditary
    line). I do not know if it is used by the default shell on the 95 line, but
    I doubt it. When I tested now, I found that the Run entry in the Windows
    Start menu honors the default extensions in PATHEXT (.com, .exe, .bat,
    .cmd). It does not, however, recognize .pl, which I have in my PATHEXT
    (cmd.exe recognizes it). I was using NT4 when testing this.
    So perhaps not even MS Windows is consistent here. What seems clear, however,
    is that the main purpose of PATHEXT is, as far as I can see, to make it
    easier for the user when entering a command interactively. The user may, for
    example, type "notepad" instead of "notepad.exe".
    PATHEXT is set by the user and expresses the user's wish to type less. It
    seems reasonable to use PATHEXT for this purpose in Emacs too. The variable
    executable-binary-suffixes is (if I understand this correctly) used for this
    purpose by executable-find. This is however not clearly expressed in the
    A note: w32-shell-execute does something quite different. It calls the MS
    Windows API ShellExecute to do the action associated with a certain "verb"
    on a file type (on Windows this means file extension). Typical verbs are
    "open" and "print". Windows Explorer uses this.
    Having said all this I just want to say that I regret that I took this issue
    up without looking closer at the problem.
    - Lennart


    FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.





    Copyright © 1996-2016 by Dr. Nikolai Bezroukov. The site was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.

    The site uses AdSense, so you need to be aware of the Google privacy policy. If you do not want to be tracked by Google, please disable JavaScript for this site. The site is perfectly usable without JavaScript.

    Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.


    This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops like a living tree...

    You can use PayPal to make a contribution, supporting development of this site and speeding up access.


    The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author's present and former employers, SDNP, or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.

    Last modified: February 12, 2017