May the source be with you, but remember the KISS principle ;-)

Shellorama, 2004




Old News ;-)

[Jan 25, 2005] Encrypting Shell Scripts - The Community's Center for Security

Do you have scripts that contain sensitive information such as passwords, and depend mostly on file permissions to keep them secure? That kind of protection is adequate only as long as the system itself stays secure and no user is running a "ps -ef" loop trying to capture the sensitive data (though some applications mask passwords in "ps" output). A program called "shc" can add an extra layer of security to those shell scripts. SHC encrypts a shell script using RC4, wraps it in an executable binary, and runs it like a normal shell script. This utility is great for programs that require a password to encrypt or decrypt, or that take a password as a command-line argument.

Download shc and untar it:

tar -xzvf shc-X.X.tgz
cd shc-X.X/
make install

A binary named "shc" will be created along with some test programs. Let's give it a try.

Create a file called "script.sh" and add the following contents:

#############################################################

echo "I love Duane's articles and will send him a donation via PayPal."

#############################################################

Now run the command:

shc -f script.sh

The switch "-f" specifies the source script to encrypt. The above command will create two files: script.sh.x and script.sh.x.c.

The program "shc" creates C source code out of your shell script (script.sh.x.c), encrypts it, and compiles it. The encrypted shell script is script.sh.x. Run that binary and see the output:

I love Duane's articles and will send him a donation via PayPal.

Now copy the original "script.sh" file to a floppy disk or some other system for backup, in case you need to edit it in the future. Then delete it from the server, and delete the "script.sh.x.c" file that shc creates.

Neat feature

You can also specify a time limit on the shell script so that it will no longer execute after a certain date, and you can specify a custom message to echo back to the user. Run this command on the "script.sh" file we created earlier in this tutorial:

shc -e 09/10/2004 -m "Dude it is too late to run this script." -f script.sh
./script.sh.x has expired!
Dude it is too late to run this script.

In the above command, the date October 9, 2004 is set as the expiration date (-e 09/10/2004), and the custom message (-m "Dude it is too late to run this script.") is displayed to the user when the binary is executed after that date. Note the date format is dd/mm/yyyy.

Check out the man pages for more info on "shc". Remember that the script is only protected at rest on the local system; if the script transmits sensitive information elsewhere, encrypting the script itself does not protect that data in transit.

Sys Admin Magazine, v13 i10: Trapping Special Characters in the Korn Shell

Have you ever needed to know whether a user pressed the up arrow key or some other unprintable character from within a shell script? The Korn shell provides no command for detecting whether a user has pressed a special character (arrow keys, function keys, and control key sequences). With a little programming, and by setting certain terminal driver options with the stty command, you can detect these keys. This column shows how.

KSH - Redirection and Pipes

Indirect redirection

Additionally, there is a special combination of pipes and redirection, known as process substitution. It only works on systems with /dev/fd/X support. You can automatically generate a "fake" file as the result of a command that does not normally generate a file. The names of the fake files will be /dev/fd/{somenumberhere}

Here's an example that doesn't do anything useful:

wc -l <(echo one line) <(echo another line)

wc will report that it saw two files, "/dev/fd/4" and "/dev/fd/5", and that each "file" had one line. From its own perspective, wc was called simply as

wc -l /dev/fd/4 /dev/fd/5

There are two useful components to this:

  1. You can handle MULTIPLE commands' output at once
  2. It's a quick-n-dirty way to interface commands that "require" a file name, with other commands that won't let you specify output files
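For instance, process substitution makes it easy to compare the output of two commands without temporary files (a common use; the inputs below are illustrative):

```shell
#!/bin/bash
# Each <(...) becomes a /dev/fd/N path holding that command's output,
# so diff compares the two outputs directly -- no temp files needed.
diff <(printf 'alpha\nbeta\n') <(printf 'alpha\ngamma\n') || true
```

(The trailing `|| true` just keeps the exit status clean, since diff exits non-zero when the inputs differ.)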


A modular Perl shell written, configured, and operated entirely in Perl. It aspires to be a fully operational login shell with all the features one normally expects. But it also gives direct access to Perl objects and data structures from the command line, and allows you to run Perl code within the scope of your command line. And it's named after one of the greatest characters on Futurama, so it must be good...

[Nov 15, 2004] WCD Wherever Change Directory by Erwin Waterlander

Another Norton Change Directory (NCD) clone with more features.

Wcd is a program to change directory fast. It saves time typing at the keyboard. One needs to type only a part of a directory name and wcd will jump to it. By default wcd searches for a directory with a name that begins with what has been typed, but the use of wildcards is also fully supported.
For instance:

wcd Desk

will change to directory /home/waterlan/Desktop

But also

wcd *top

will do that.

Wcd is free to use and you can get the source code too.

Some features of wcd:

  • Full screen interactive directory browser with speed search.
  • Present the user a list in case of multiple matches.
  • Wildcards *, ? and [SET] supported.
  • Directory stack, push pop.
  • Subdir definition possible. E.g. wcd subdira/subdirb
  • Long directory names support in Win95/98/NT DOS-box
  • Windows LAN UNC paths supported.
  • Change drive and directory at once.
  • Alias directories.
  • Ban directories.
  • 'cd' behaviour
  • Free portable source-code, no special libraries required
  • Multi platform:
    DOS 16 bit, DOS 32 bit, DOS bash, Windows 3.1/95/NT DOS-box, Cygwin bash, Unix ksh, csh, bash and zsh.
  • Wcd has been tested on: FreeDOS, MS-DOS 6.2, Win95, Win98, Windows NT 4.0, Linux, FreeBSD, HP-UX, SunOS, Solaris, SGI IRIX. Wcd works on any PC and can be ported to any Unix system.

    WCD is free software, distributed under GNU General Public License.

    [Nov 11, 2004] Freeware List for SPARC

    [Nov 5, 2004] Bash 3.0 now has a debugger

    This is a terse description of the new features added to bash-3.0 since the release of bash-2.05b. As always, the manual page (doc/bash.1) is the place to look for complete descriptions.

    cc. The [[ ... ]] command has a new binary `=~' operator that performs extended regular expression (egrep-like) matching.

    l. New invocation option: --debugger. Enables debugging and turns on new `extdebug' shell option.

    f. HISTCONTROL may now include the `erasedups' option, which causes all lines matching a line being added to be removed from the history list.

    j. for, case, select, arithmetic commands now keep line number information for the debugger.

    p. `declare -F' now prints out extra line number and source file information if the `extdebug' option is set.

    r. New `caller' builtin to provide a call stack for the bash debugger.

    t. `for', `select', and `case' command heads are printed when `set -x' is enabled.

    u. There is a new {x..y} brace expansion, which is shorthand for {x, x+1, x+2, ..., y}. x and y can be integers or single characters; the sequence may ascend or descend; the increment is always 1.
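A minimal illustration of the new expansion (requires bash 3.0 or later):

```shell
#!/bin/bash
echo {1..5}        # integer sequence: 1 2 3 4 5
echo {a..e}        # character sequence: a b c d e
echo img{3..1}.png # descending, handy for generating filenames
```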

    v. New ksh93-like ${!array[@]} expansion, expands to all the keys (indices) of array.

    z. New `-o plusdirs' option to complete and compgen; if set, causes directory name completion to be performed and the results added to the rest of the possible completions.

    ee. Subexpressions matched by the =~ operator are placed in the new BASH_REMATCH array variable.
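For example (the variable name and pattern here are illustrative):

```shell
#!/bin/bash
# Requires bash 3.0+: parenthesized groups matched by =~ land in BASH_REMATCH;
# BASH_REMATCH[0] holds the whole matched text.
ver="bash-3.0.16"
if [[ $ver =~ ([0-9]+)\.([0-9]+) ]]; then
    echo "major=${BASH_REMATCH[1]} minor=${BASH_REMATCH[2]}"
fi
```

which prints `major=3 minor=0`.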

    gg. New `set -o pipefail' option that causes a pipeline to return a failure status if any of the processes in the pipeline fail, not just the last one.
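A quick demonstration of the difference:

```shell
#!/bin/bash
# By default a pipeline's status is that of its LAST command:
false | true
echo "default: $?"      # prints: default: 0

# With pipefail, any failing stage makes the whole pipeline fail:
set -o pipefail
false | true
echo "pipefail: $?"     # prints: pipefail: 1
```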

    kk. The `\W' prompt expansion now abbreviates $HOME as `~', like `\w'.

    ll. The error message printed when bash cannot open a shell script supplied as argument 1 now includes the name of the shell, to better identify the error as coming from bash.

    2. New Features in Readline

    a. History expansion has a new `a' modifier equivalent to the `g' modifier for compatibility with the BSD csh.

    b. History expansion has a new `G' modifier equivalent to the BSD csh `g'
    modifier, which performs a substitution once per word.

    c. All non-incremental search operations may now undo the operation of replacing the current line with the history line.

    d. The text inserted by an `a' command in vi mode can be reinserted with `.'.

    e. New bindable variable, `show-all-if-unmodified'. If set, the readline completer will list possible completions immediately if there is more than one completion and partial completion cannot be performed.

    g. History list entries now contain timestamp information; the history file functions know how to read and write timestamp information associated with each entry.

    n. When listing completions, directories have a `/' appended if the `mark-directories' option has been enabled.

    Not much changed (Score:5, Insightful)
    by opk (149665) on Thursday July 29, @10:28AM (#9831965)
    Doesn't seem to be much changed given the version number increase. [[ =~ ]] can match regexes and it can do zsh style {1..3} expansions. Improved multibyte support too. There were bigger changes in some of the 2.0x updates.
        Re:First "zsh rules" post! (Score:5, Informative)
        by opk (149665) on Thursday July 29, @10:46AM (#9832230)
        Globs are more powerful: **/*.c will recursively search for .c files: much quicker to type than find.
        You can match file types: e.g. *(@) will get you symlinks. *(U) gets files owned by you.

        Syntax for alternation is a lot easier. No @(this|that) or !(*.f). Instead, it is (this|that) and ^*.f

        Next point is completion. It includes a vast range of definitions so completion works well for lots of commands. The completion system handles completing parts of words so it better handles user@host completion. You get descriptions with completion match listings. Completion also has a really powerful context sensitive configuration system so you can make it work the way you like.

        It has modules. For running a simple shell script it will actually use less space than bash because it doesn't need to load the line editor and other interactive related code into memory.

        There is much much more. It takes a while to learn everything but if you just enable the completion functions (autoload -U compinit; compinit) you'll find it better than bash or tcsh from day 1.

    Re:Just wondering... (Score:5, Informative)
    by opk (149665) on Thursday July 29, @11:05AM (#9832448)
    Zsh is still the best.

    Bash developers have different priorities.
    Bash became the default primarily because it is GNU.
    Zsh has some ugly but powerful features like nested expansions. The two areas where bash is better than zsh are multibyte support and POSIX compliance. Much of that was contributed by IBM and Apple, respectively. But if you use the shell a lot, you'll find zsh does a lot of things better. The completion is amazing. And when it isn't emulating sh/POSIX, it fixes some of the broken design decisions (like word splitting of variables), which saves you from doing stupid things.

    The FSF actually does development in a very closed manner when it can (the gcc/egcs split was partly because of this). Bash is a good example of this. That is perhaps a good thing, because it is probably good that bash doesn't get some of zsh's nasty (but powerful) features. And if zsh didn't exist, bash might have been forked by now. If you care about your shell, you'll find much more of a community on the zsh lists than the spam-filled bug-bash list. You can't even get at alpha releases of bash without being one of the chosen few.

    Can arrow key history be like Matlab's? (Score:3, Interesting)
    by dara (119068) on Thursday July 29, @10:54AM (#9832323)
    I read the announcement and it mentions "History traversal with arrow keys", but what I would really like doesn't seem to be mentioned (but perhaps it is possible with bash-2.05, I'm not much of a shell expert). In Matlab, the up-arrow key searches the history for commands that match all the characters on the line. No characters and it acts like a normal bash arrow, if "figure, plot" is at the beginning of the line, it will quickly scroll through all plotting commands that have been entered at the shell.

    Any idea if this is possible?

    Dara Parsavand



    Re:Can arrow key history be like Matlab's? (Score:2)
    by Atzanteol (99067) on Thursday July 29, @11:10AM (#9832500)
    Try 'ctrl+r'. Not *exactly* what you're looking for, but it lets you search through your history.

    i.e. on an empty line hit ctrl+r, then start typing.

    Re:Can arrow key history be like Matlab's? (Score:3, Informative)
    by Anonymous Coward on Thursday July 29, @12:38PM (#9833770)
    cat >> ~/.inputrc
    "\e[A": history-search-backward
    "\e[B": history-search-forward
    Re:Can arrow key history be like Matlab's? (Score:1, Informative)
    by Anonymous Coward on Thursday July 29, @01:17PM (#9834391)
    Here you are, put this in .inputrc:

    "\e[6~": history-search-forward
    "\e[5~": history-search-backward

    and use Page Up/Page Down to search

    I can live without it since 4DOS

    Autocompletion like W2K? (Score:2)
    by tstoneman (589372) on Thursday July 29, @01:12PM (#9834316)
    I like bash, but the one thing that it doesn't support (out-of-the-box anyway) is auto-completion a la W2K. In NT, when you hit tab, you can cycle through all the words that can complete the letters you typed... on bash, it shows you a list.

    Is there a way to make bash behave more like W2K in this sense?

    bash completion getting better (Score:3, Informative)
    by AT (21754) on Thursday July 29, @11:08AM (#9832478)
    The completion ability of bash has been steadily improving. There is a nice script that sets up a lot of good completion rules for bash.
    Re:First "zsh rules" post! (Score:2)
    by Just Some Guy (3352) <`kirk+slashdot' `at' `'> on Thursday July 29, @05:30PM (#9837797)
    ( | Last Journal: Monday September 27, @09:09AM)
    That's just... I mean... Wow. Really. I just patched my .zshenv with

    - export SSH_AUTH_SOCK=$(find 2>/dev/null /tmp -maxdepth 2 -type s -user $USER -regex '/tmp/ssh-.*/agent\..*')
    + export SSH_AUTH_SOCK=$(echo /tmp/ssh-*/agent.*(=UN))

    That's sweet. Thanks for the pointer! The more I learn about zsh, the more I love it.

    Re:First "zsh rules" post! (Score:1)
    by sorbits (516598) on Saturday July 31, @01:09PM (#9853305)
    I will certainly give it a try then!

    Until now I have stuck with tcsh for one single reason: history substitution!

    Basically it lets me insert text from my history (including the current line) using a few symbols (e.g. !$ is the last argument of the previous line) -- it's extremely powerful, e.g. it allows searching the history, doing substitutions on the results, taking the head/tail of paths, etc.

    I use it a lot to keep down the number of characters I need to type, and I have even assigned hotkeys to some of the substitutions I use the most.

    This is really the make-or-break feature for whether or not I want to use a shell, so I really hope zsh has something similar!?!

    Re:First "zsh rules" post! (Score:5, Informative)
    by Just Some Guy (3352) <`kirk+slashdot' `at' `'> on Thursday July 29, @10:50AM (#9832279)
    ( | Last Journal: Monday September 27, @09:09AM)
    Big ones for me:
    • A sane auto-completion system. That is, "cvs <tab>" gives a list of all of the commands that cvs understands. "cvs -<tab>" (same as above but tabbing after typing "-") gives a list of all of the options that cvs understands. These are good things. Now, in fairness, bash also has a command completion library. Unfortunately, it's implemented as a huge set of Bash functions. In zsh, "set|wc" returns 179 lines. In bash, "set|wc" returns 3,961 lines. The net effect is that zsh's system is noticeably faster and less polluting to the local environment.
    • Modules. Wrappers for TCP connections, a built-in cron thingy, and PCRE are all loadable modules to do tricky things easily.
    • Lots of pre-defined things. Load the "colors" and "zsh/terminfo" modules and you get defined associative arrays like $fg, which emits terminal-appropriate escape codes to set the foreground color of printed text. The command "echo ${fg[red]}red text${fg[default]}normal text" prints "red text" in red, and "normal text" in your default color.

    Bash is a good shell, and I have nothing bad to say about it. However, zsh seems to have been designed from the ground up by power users and for power users. I absolutely love it, and everyone that I've given an example config file to (to get them running with little hassle) has permanently switched.

    Re:First "zsh rules" post! (Score:5, Informative)
    by Just Some Guy (3352) <`kirk+slashdot' `at' `'> on Thursday July 29, @11:21AM (#9832638)
    ( | Last Journal: Monday September 27, @09:09AM)
    As the maintainer of FreeBSD's bash-completion port, I'm reasonably familiar with it. Yes, it's approximately as powerful as zsh's completion module. Still, have you ever looked at it? It's a giant set of defined functions and glue. Seriously, get to a bash prompt and type "set" to see all of the things that've been stuffed into your shell's namespace. Now, try that with zsh and be pleasantly surprised.

    As I said in another post, a big side effect is that zsh's completions seem to be much faster than bash's. That alone is worth the price of admission for me.

    Re:Dear Apple haters... (Score:5, Informative)
    by Jahf (21968) on Thursday July 29, @10:58AM (#9832361)
    (Last Journal: Thursday August 05, @03:55PM)
    Believe it or not, -most- of the large companies that use GPL'ed tools give back to the community.

    Apple has done numerous fixes, not just on BASH.

    Sun (disclaimer: for whom I work) has done -tons- of work on GNOME, Mozilla and don't forget Open Office (just to name a few).

    IBM works on many projects and gives back ... plus contributing all new things like JFS.

    All the distro makers like Red Hat, Novell, etc give back tons.

    Each of those companies pay engineers to fix pieces not done in Open Source projects as well as to extend them for their customers. The patches are covered under GPL just like the main code, and these companies know it and yet knowingly dedicate serious money and hours to these projects. And then they satisfy the GPL by putting them out on source CDs or submitting them back to the main projects.

    The big problem for getting submitted code accepted is that these companies are usually fixing and developing on a codebase that is aging. For instance, Sun did numerous I18N fixes for GNOME 2.6, but by the time they were ready the main GNOME organization had moved on to 2.8. That means there is a disconnect between the two and the changes have to be ported forward before they will hit the main code branch. The same problem can happen with kernel patches and just about any other codebase that changes versions so quickly.

    Sorry, you were doing the good thing and pointing out Apple's contributions. But so many people think these companies violate the GPL (in spirit if not in law) when they are very large contributors to open source. Sure, some do, and the community usually find out about it and shame them into minimal compliance (Linksys and Sveasoft come to mind after my delving into alternate WRT54G firmwares last night), but generally speaking the big companies have been a good part of the community.

    Re:On the list of changes: (Score:5, Informative)
    by Prowl (554277) on Thursday July 29, @11:12AM (#9832526)
    GNU or Unix would seem to be the most appropriate

    bash has been around since 1989 (according to the copyright on the man page). Linux 1.0 came around 5 years later.

    The editors should know better, unless they're intentionally trying to piss off RMS

    Looks great, but prefer Ash for scripts (Score:3, Interesting)
    by Etcetera (14711) * <<cleaver> <at> <>> on Thursday July 29, @10:41AM (#9832171)

    Looks like a nice Unicode-savvy release that should help with dealing with international languages at the command line. And yay to Apple for giving back (again). When will people finally accept that Apple is indeed helping out the OSS community through GCC, bash, and other tools...?

    Kind of off-topic, but for speed purposes in scripts that have to run fast, I find nothing better or more convenient than Ash, especially on systems where /bin/sh is a symlink to /bin/bash.

    Does anyone know any history on this shell? Is it a clone of the original Bourne shell or of bash? I can't seem to find anything useful on Google ...

    Re:Looks great, but prefer Ash for scripts (Score:2)
    by mihalis (28146) on Thursday July 29, @10:52AM (#9832301)
    As I understand it, ash was written by Kenneth Almquist. I used to see his name on some of the Ada related mailing lists and newsgroups.
    Re:Looks great, but prefer Ash for scripts (Score:2, Informative)
    by Stephen Williams (23750) on Thursday July 29, @11:46AM (#9832931)
    ( | Last Journal: Thursday December 05, @05:02AM)
    Ash (or dash, as it's called nowadays) is a Linux port of NetBSD's /bin/sh. It's a POSIX shell with separate lineage from bash.

    It's /bin/sh on my system too. Faster and smaller than bash; watch those configure scripts fly!


    Some people don't agree. (Score:2)
    by emil (695) on Thursday July 29, @12:48PM (#9833940)
    Ash appears to consume large amounts of memory, and some people in BSD circles have serious objections to it.

    See the discussion (scroll down a bit into the postings). I don't have an opinion on the issue one way or another.

    And can I set up bash so I can, for instance, move from rc2.d to rc3.d by typing

    $ cd 2 3

    [Oct 9, 2004] CGI-Shell

    CGI-Shell provides a shell using CGI, so everyone who has a CGI directory on a Web server can also have a shell on it. It's comparable to telnet or SSH.

    [Sep 30, 2004] epto.

    epto is a small library and framework for industrial strength shell script programming with sh.

    It features convenient error handling, tracing, logging, option handling, a documentation template, process-level transaction safety (sort of), and more. If one is used to shell programming, it takes less than five minutes of learning to start using it (see the crash course in the README file).


    NanoBlogger is a small Weblog engine that is written in bash and makes use of some common UNIX tools, such as cat, grep, and sed. It creates static HTML content and features a slick command line interface, templates and CSS style sheets, plugins, permalinks, and archiving by category, entry, and month. It's intended to be modular and flexible in design and easy to integrate into existing Web sites. There's no need for any JavaScript, server-side scripting, server-side includes, or database.


    MicroBlogger is a fast, flexible, secure, reliable, and bloat-free weblog engine written entirely in Bash script. It is designed to have absolutely no dependencies on any outside programs--no PHP, no SQL, no CGI, no CSS, no JavaScript, no Perl, no XML, no DHTML, nothing. MicroBlogger is completely self-contained and self-regulating, and is ready for you to use straight out of the box. MicroBlogger has become a favorite of those who don't want the bloat and confusion normally associated with other weblog engines. Its advantages include speed, an easy to use menu subsystem, and an incredibly small footprint. A fully working MicroBlogger site, complete with backend, can occupy as little as 10KB of storage space.

    Bowie J. Poag <bpoag [at] comcast [dot] net>

    [Sep 30, 2004] Fast OnlineUpdate for SuSE

    Fast OnlineUpdate for SuSE (fou4s) is a bash script that provides the functionality of YOU (YaST OnlineUpdate), but can also work in background and check for updates every night. It supports resumed downloads and proxies by using wget. GPG signatures are also checked.

    [Sep 30, 2004] Project details for rpm-get

    rpm-get is a simple clone of apt-get for RPM. This little application will let you download, install, upgrade, or remove RPM packages/sources as well as upgrade the entire distribution with ease.

    [Sep 30, 2004] Project details for IANA -etc Files

    iana-etc installs /etc/services and /etc/protocols using data from the Internet Assigned Numbers Authority. Included are snapshots of the data from the IANA, scripts to transform that data into the needed formats, and scripts to fetch the latest data.

    [Aug 3, 2004] Project details for For each File

    For each File allows users to execute an arbitrary set of GNU BASH shell commands upon an arbitrary set of files. It is useful for, but not limited to, performing repetitive filesystem manipulation procedures such as mass file renaming and gathering statistics.

    ["] Re: Advantages over find?
    by Suraj N. Kurapati - Mar 16th 2004 10:46:25
    > What's the advantage of using this
    > instead of find -exec?

    In comparison to using 'find -exec', key advantages are:
    - better performance (less system resources used)
    - flexibility (user can perform tasks without a separate script)
    - simplicity (saves time and effort)

    In order to justify these claims, consider the following cases.

    (1) Case 1: Executing an existing script.

    For every file it handles, 'find -exec' forks a separate child-process to execute the user's commands. On a large input size, this behavior yields poor performance.

    This tool uses less system resources, and thereby yields better performance, by employing only one Bash process (in both non/recursive modes). Separate child-processes are not forked because user commands are evaluated by the same Bash process.

    This tool also provides preset scripting variables to save users' time. Although it is a trivial matter to assemble these variables within a script, it becomes wasteful and tedious for quick tasks.

    (2) Case 2: Performing complex tasks at the terminal without a separate script.

    As the complexity of the problem increases, using 'find -exec' to implement solutions at the terminal becomes cryptic and time consuming. One possibility is to write a separate script and use it with 'find -exec'. In which case, this tool already puts the power of the Bash shell at the user's disposal, without having to manually create a separate script each time.

    Using this tool, the user can, but is not forced to, enter scripting commands on one line. Entering commands is often made more efficient when this tool is launched with the editor option. This option allows the user to enter scripting commands via their favorite text editor, rather than fiddling with a one-line script at the terminal.

    Consider the following examples.

    (2a) Example A: We wish to extract all tar-ball files in the current working directory.

    $ ls
    bar.tgz foo.tgz

    (2a-1) Using this tool:

    $ ff *tgz
    tar -zxf "$f"

    (2a-2) Using 'find -exec':

    $ find . -name '*tgz' -exec tar -zxf '{}' ';'

    (2b) Example B: We wish to rename the file extension 'tgz' to 'tar.gz' and display the original filename.

    $ ls some_tgz/tgz_dir/
    bar.tgz b a z.tgz foo.tgz

    (2b-1) Using this tool:

    $ ff some_tgz/tgz_dir/*tgz
    mv "$f" "$d/${fn/%tgz/tar.gz}"
    echo $fn

    (2b-2) Using 'find -exec':

    $ find some_tgz/tgz_dir/ -name '*tgz' -exec sh -c 'mv "{}" `dirname "{}"`/`basename "{}" | sed "s/tgz/tar.gz/"`; echo {}' \;

    Notice that we invoked a shell with 'find -exec' for this task. It is possible to perform the task without the aid of an intermediate shell, but one can imagine how time consuming and error prone that would be.
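For comparison, the same rename can also be written as a plain shell loop, with no find and no intermediate shell at all (a sketch, reusing the directory from the example above):

```shell
#!/bin/bash
# Rename *.tgz to *.tar.gz and print the original name.
# Quoting "$f" handles filenames containing spaces (like "b a z.tgz").
for f in some_tgz/tgz_dir/*tgz; do
    [ -e "$f" ] || continue        # glob matched nothing: skip
    mv -- "$f" "${f%tgz}tar.gz"    # strip trailing "tgz", append "tar.gz"
    echo "$f"
done
```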

    (3) In conclusion, this tool gives the user flexibility while saving them time and maintaining low usage of system resources.

    [Jul 14, 2004] Project details for GNU shtool

    GNU shtool is a compilation of small but very stable and portable shell scripts into a single shell tool. All ingredients have been in successful use for many years in various free software projects. The compiled shtool program is intended to be used inside the source tree of free software packages, where it can take over various (usually non-portable) tasks related to building and installing a free software package.

    [May 16, 2004] Project details for BASH Debugger

    BASH Debugger provides a patched BASH that enables better debugging support as well as improved error reporting. It also contains the most comprehensive source code debugger for BASH that has been written. It can be used as a springboard for other experimental features (such as a timestamped history file), since development is maintained openly and developers are encouraged to participate.

    [Feb 10, 2004] Project details for Tiger security tool

    TIGER is a set of Bourne shell scripts, C programs, and data files which are used to perform a security audit of Unix systems. The security audit results are useful both for system analysis (security auditing) and for real-time, host-based intrusion detection.

    [Jan 23, 2004] Project details for bash menubuilder

    Bash menubuilder creates pretty bash pseudo-GUIs which act as templates for shell scripts. It features a menu editor with preview that can save and load config files, which a shell script compiler then translates into the final menu.

    [Jan 23, 2004] Project details for Miscellaneous Unix scripts

    This is a collection of (mostly) Unix scripts for general use. The package consists of scripts to aid system administration, text processing, and more. They can mount devices via a menu, open documents in new Netscape windows, justify ASCII text, split files into many parts and restore them, print PS files with 4 pages per sheet, convert between encodings and UTF-8, trim Unix mailboxes, adjust the system time, and maintain databases of files on media.

    Scripting GNU in the 21st Century Linux Journal by Nick Moffitt

    An example of a bash script that fetches train schedules from a website.

    Most people tend to encounter shell scripts as attempts to write portable utilities that run properly on most, if not all, UNIX workalikes. Instead of making the best use of the shell and related programs, these scripts restrict themselves to the absolute most portable baseline of features. The extreme example of this sort of programming can be seen when one looks at the configure scripts generated by the autoconf program.

    But this is 2004, and the full GNU environment is now commonplace. The advanced shells and utilities now are the default user environment for GNU/Linux systems, and they are available as install options on BSD-based systems. Even proprietary UNIXes often have complete sets of GNU software laid atop them to bring them up to date with the modern world. Because GNU software can be obtained at little or no cost, there is no excuse to continue scripting in a retrograde proprietary environment.

    Re: Scripting GNU in the 21st Century

    Submitted by Anonymous on Sun, 2004-05-30 23:00.

    Overall, for someone learning bash, this is probably a reasonable example. I have many similar ones myself (grabbing satellite wildfeed data, for example).

    However, as a means of introducing a newcomer to bash, or as a convincing description of why the newcomer should be using bash, I feel it falls short.

    For example, some simple timing shows that his "$(basename $0)" construct is almost 100x slower than using "${0##*/}", although he does use another version of the same construct later, meaning that he is aware of it!
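    The speed difference is easy to see: `basename` forks an external process on every call, while the `${0##*/}` parameter expansion runs entirely inside the shell. A minimal sketch, using a hypothetical path in place of `$0`:

```shell
#!/bin/sh
# Hypothetical script path standing in for $0
script="/usr/local/bin/myscript.sh"

# External command: forks a new process on each call
name1=$(basename "$script")

# Parameter expansion: strips the longest prefix matching "*/",
# with no fork -- which is why it benchmarks so much faster
name2="${script##*/}"

echo "$name1"
echo "$name2"
```

    Both print `myscript.sh`; only the second stays inside the shell.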

    His repeated use of the backslash character as a line continuation does not improve the readability of the script; in fact, it makes it worse. Leave it out! Yes, it still works -- if the line ends with a character which indicates there's more needed, the line continuation character is redundant. The pipe symbol (vertical bar) is such a character.
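    The point about redundant continuations can be shown with a trivial pipeline: a line ending in a pipe symbol is syntactically incomplete, so the shell reads the next line anyway and the backslash adds nothing.

```shell
#!/bin/sh
# A line ending in "|" is incomplete, so the shell keeps reading;
# no backslash continuation is needed before the newline
count=$(printf 'alpha\nbeta\ngamma\n' |
    grep -c '^b')
echo "$count"
```

    The same applies to a trailing `&&`, `||`, or an unclosed quote.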

    In general, all variable usage should be enclosed in double quotes (ie, "Dollar signs in Double quotes"). This technique is only wrong 1 time out of 100, so the programmer will be correct 99% of the time. :) Yes, it may mean the double quotes are redundant in some cases, but there's a lot to be said for consistency, and hence, readability. Only when dealing with word splitting (where you want the word split), will the double quotes be incorrect.
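    The quoting rule is easy to demonstrate with `set --`, which re-splits its arguments exactly the way any command invocation would (the filename here is a made-up example):

```shell
#!/bin/sh
f='my file.txt'

# Unquoted: the value undergoes word splitting into two words
set -- $f
unquoted=$#

# Quoted: the value stays a single word, as intended
set -- "$f"
quoted=$#

echo "$unquoted $quoted"
```

    Passing the unquoted value to a command such as `rm` would operate on two nonexistent files instead of one real one.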

    He appears to use brackets in "if" statements when the POSIX (?) technique would be double parentheses (brackets are for string comparisons, parens are for numeric comparisons, and the doubled form of each is recommended since they turn off I/O redirection, wildcarding, and word splitting). Maybe he doesn't know?
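    For the record, the doubled forms are bash/ksh extensions rather than strict POSIX, so the "POSIX (?)" hedge in the comment is warranted. A sketch of the usual division of labor:

```shell
#!/bin/bash
# [[ ]] and (( )) are bash/ksh extensions, not strict POSIX sh
n=10
s="two words"
result=""

# (( )) evaluates an arithmetic expression
if (( n > 5 )); then
    result="num"
fi

# [[ ]] compares strings; unquoted variables inside it
# are not subject to word splitting or globbing
if [[ $s == "two words" ]]; then
    result="$result,str"
fi
echo "$result"
```

    In portable `/bin/sh` scripts, the single-bracket `[ "$n" -gt 5 ]` and `[ "$s" = "two words" ]` forms remain the safe choice.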

    Lastly, the "liststations" function seems overly complex to me. First, is it really necessary to specify the entire XPath all the way from the "html" element down to the "select" element and its attribute? I haven't seen the data, but I'd be willing to bet that just the "select" element and attribute would be enough (since select is non-functional outside of forms anyway, and the attribute specifies the name of the select element!).

    Regardless of whether simplification is possible, change the delimiter of the regexp! Use something other than a slash and avoid LTS ("leaning toothpick syndrome", per Larry Wall). With the text thus cleaned up visually, maybe let sed also do the elimination of text up to and including the equals sign? That eliminates the need for cut, although it may hurt readability.

    Additionally, sed can also replace the while loop: tell sed to match on the "select" element and attribute, then read three more lines into the hold space, appending them to what's already there. Now run a substitution on the hold space and print the result. (Or, if you're not comfortable with sed, use awk, and you still eliminate the cut and the while loop.) YMMV.
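    The delimiter point deserves an illustration: `s///` in sed accepts any character as its delimiter, so a pattern full of slashes need not be escaped. The input line below is a made-up stand-in for the kind of HTML being scraped:

```shell
#!/bin/sh
# Hypothetical HTML fragment used only for illustration
line='<option value="/stations/foo">Foo</option>'

# Slash delimiter: every "/" in the pattern must be escaped
out1=$(echo "$line" | sed 's/<option value="\/stations\/\([^"]*\)">.*/\1/')

# Comma delimiter: the slashes can be written as-is
out2=$(echo "$line" | sed 's,<option value="/stations/\([^"]*\)">.*,\1,')

echo "$out1 $out2"
```

    Both commands extract `foo`; the second is far easier to read and maintain.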

    Overall, I agree with the other comment posted here that the standard for scripts should be POSIX, not a particular tool. Of course, POSIX has its own problems (quite a few, actually!), but that's a decision that individual organizations need to make: portability vs. speed/usability.


    FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.




    Copyright © 1996-2016 by Dr. Nikolai Bezroukov. The site was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.


    Last modified: September 12, 2017