The general form of a shell command (or, more formally, of a shell language statement) is as follows:
command options arguments
A command can be internal (built into the shell) or external (a program located via the search path).
Shell regular expressions are of two types: primitive patterns (globs) and normal regular expressions. The primitive pattern metacharacters are as follows:
* -- matches a string of zero or more characters
? -- matches exactly one character
[abc...] -- matches any single character from the class defined in square brackets
[a-z] -- matches any single character from the given range
[!abc...] -- matches any single character NOT in the class
[!a-z] -- matches any single character NOT in the given range
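These patterns drive filename expansion (globbing). A minimal demonstration in a scratch directory, with invented file names:

```shell
#!/bin/sh
# Globbing demo: the shell expands each pattern against the file names
# in the current directory. Directory and file names are invented.
dir=$(mktemp -d)
cd "$dir"
touch note1.txt note2.txt notes.log
echo *.txt         # zero-or-more characters: note1.txt note2.txt
echo note?.txt     # exactly one character:   note1.txt note2.txt
echo note[12].txt  # character class:         note1.txt note2.txt
echo note[!1].txt  # negated class:           note2.txt
```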
For more information see Regular Expressions.
Bash 3.2 introduced "normal" regular expressions and the =~ operator to use them.
The Bourne shell family has special macros that expand to directory names. They can be used only outside of quoted strings:
~ -- the home directory of the current user
~userid -- the home directory of the user with the given userid
~+ -- the current working directory
~- -- the previous working directory
Shell aliases are parameterless macros. They are recognized only if they appear as the first word of a command. The general form is:
alias name='command'
alias
The alias command without parameters lists all active aliases.
You should generally avoid self-aliased commands, but in some cases they are useful. For example, the command rm -i is often aliased as rm. The effect is that the -i option appears whenever you issue the rm command, whether or not you type the option. The -i option specifies that rm will prompt for confirmation before deleting files. This helps avoid accidental deletion of files, which can be particularly hazardous when you're logged in as root. The alias ensures that you're prompted for confirmation even if you don't ask to be prompted. If you don't want to be prompted, you can issue a command like:
rm -f files
where files specifies the files to be deleted. The -f option has the effect opposite that of the -i option; it forces deletion of files without prompting for confirmation. Because the command is aliased, the command actually executed is:
rm -i -f files
Here the -f option takes precedence over the -i option, because it occurs later in the command line.
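The interaction described above can be reproduced in a short bash session. Note that non-interactive shells do not expand aliases unless expand_aliases is turned on; this sketch also assumes an rm (GNU or BSD) where a later -f overrides -i:

```shell
#!/bin/bash
shopt -s expand_aliases      # scripts do not expand aliases by default
alias rm='rm -i'             # every rm now carries -i

f=$(mktemp)                  # a scratch file to delete
rm -f "$f"                   # actually runs: rm -i -f "$f"; -f wins, no prompt
test -e "$f" || echo "deleted without prompting"
```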
If you want to remove a command alias, you can issue the unalias command:
unalias alias
where alias specifies the alias you want to remove. Aliases last only for the duration of a login session, so you needn't bother to remove them before logging off. If you want an alias to be effective each time you log in, you can use a shell script. The next subsection shows you how to do so.
A shell script is simply a file that contains commands. By storing commands as a shell script you make it easy to execute them again and again. As an example, consider a file named deleter, which contains the following lines:
echo -n Deleting the temporary files...
rm -f *.tmp
echo Done.
The echo commands simply print text on the console. The -n option of the first echo command causes omission of the trailing newline character normally written by the echo command, so both echo commands write their text on a single line. The rm command removes from the current working directory all files having names ending in .tmp.
You can execute this script by issuing the sh command:
sh deleter
If you invoke the sh command without an argument specifying a script file, a new interactive shell is launched. To exit the new shell and return to your previous session, issue the exit command.
If the deleter file were in a directory other than the current working directory, you'd have to type an absolute path, for example:
sh /home/bill/deleter
You can make it a bit easier to execute the script by changing its access mode to include execute access. To do so, issue the following command:
chmod ugo+x deleter
This gives you, members of your group, and everyone else the ability to execute the file. To do so, simply type the absolute path of the file, for example:
/home/bill/deleter
If the file is in the current directory, you can issue the following command:
./deleter
You may wonder why you can't simply issue the command:
deleter
In fact, this still simpler form of the command will work, so long as deleter resides in a directory on your search path. You'll learn about the search path later.
The Korn shell uses several configuration scripts that are run at the beginning of the user session. Bash can also use bash-specific startup scripts. See Dot files for more information.
The shell provides three standard data streams:
stdin -- the standard input stream
stdout -- the standard output stream
stderr -- the standard error stream
By default, most programs read their input from stdin and write their output to stdout. Because both streams are normally associated with a console, programs behave as you generally want, reading input data from the console keyboard and writing output to the console screen. When a well-behaved program writes an error message, it writes the message to the stderr stream, which is also associated with the console by default. Having separate streams for output and error messages presents an important opportunity, as you'll see in a moment.
Although the shell associates the three standard input/output streams with the console by default, you can specify input/output redirectors that, for example, associate an input or output stream with a file:
> file -- redirects the standard output stream to the specified file
>> file -- redirects the standard output stream to the specified file, appending to the file if it already exists
2> file -- redirects the standard error stream to the specified file
2>> file -- redirects the standard error stream to the specified file, appending to the file if it already exists
&> file -- redirects both the standard output and standard error streams to the specified file
< file -- redirects the standard input stream to read from the specified file
<< text -- reads standard input until a line matching text is found, at which point end of file is posted (a here-document)
cmd1 | cmd2 -- takes the standard input of cmd2 from the standard output of cmd1 (also known as the pipe redirector)
To see how redirection works, consider the wc command on the console.
Perhaps you can now see the reason for having the separate output streams
stdout
and stderr
. If the shell provided a single
output stream, error messages and output would be mingled. Therefore, if
you redirected the output of a program to a file, any error messages would
also be redirected to the file. This might make it difficult to notice an
error that occurred during program execution. Instead, because the streams
are separate, you can choose to redirect only stdout
to a file.
When you do so, error messages sent to stderr
appear on the
console in the usual way. Of course, if you prefer, you can redirect both
stdout
and stderr
to the same file or redirect
them to different files. As usual in the Unix world, you can have it your
own way.
A simple way of avoiding annoying output is to redirect it to the null
file, /dev/null. If you redirect the stderr
stream of
a command to /dev/null, you won't see any error messages the command
produces.
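For example, redirecting stderr to /dev/null silences the error message while the command's exit status still reports the failure (the path is invented):

```shell
#!/bin/sh
# The error message from ls goes to stderr; /dev/null swallows it,
# but the command still fails, so the || branch runs.
ls /no/such/directory 2>/dev/null || echo "failed, but quietly"
```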
Just as you can direct the standard output or error stream of a command
to a file, you can also redirect a command's standard input stream to a
file, so that the command reads from the file instead of the console. For
example, if you issue the wc command without arguments, the command
reads its input from stdin
. Type some words and then type the
end of file character (Ctrl-D) and wc will report the number
of lines, words, and characters you entered. You can tell wc
to read from a file, rather than the console, by issuing a command like:
wc </etc/passwd
Of course, this isn't the usual way of invoking wc. The author of wc helpfully provided a command-line argument that lets you specify the file from which wc reads. However, by using a redirector, you could read from any desired file even if the author had been less helpful.
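Input redirection has a cousin, the here-document (<<), which feeds the lines that follow, up to a matching delimiter, to the command's stdin. Here wc -l counts three inline lines:

```shell
#!/bin/sh
# <<END feeds everything up to the END line to wc's standard input.
wc -l <<END
one
two
three
END
```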
Some programs are written to ignore redirectors. For example, the passwd command expects to read the new password only from the console, not from a file. You can compel such programs to read from a file, but doing so requires techniques more advanced than redirectors.
When you specify no command-line arguments, many Unix programs read their
input from stdin
and write their output to stdout
.
Such programs are called filters. Filters can be easily fitted
together to perform a series of related operations. The tool for combining
filters is the pipe, which connects the output of one program to
the input of another. For example, consider this command:
ls -l ~ | wc -l
The command consists of two commands, joined by the pipe redirector (|). The first command lists the names of the files in the user's home directory, one file per line. The second command invokes wc by
using the -l option, which causes wc to print only the total number of lines, rather than printing the number of lines, words, and characters. The pipe redirector sends the output of the ls command to the wc command, which counts and prints the number of lines in its input. (Because ls -l also prints a "total" header line, the count is actually one higher than the number of files; ls ~ | wc -l gives the exact count.)
This is a simple example of the power and sophistication of the Unix shell. Unix doesn't include a command that counts the files in the user's home directory and doesn't need to do so. Should the need to count the files arise, a knowledgeable Unix user can prepare a simple script that computes the desired result by using general-purpose Unix commands.
If you've studied programming, you know that programming languages resemble algebra. Both programming languages and algebra let you refer to a value by a name. And both programming languages and algebra include elaborate mechanisms for manipulating named values.
The shell is a programming language in its own right, letting you refer to variables known as shell variables or environment variables. To assign a value to a shell variable, you use a command that has the following form:
variable=value
For example, the command:
PATH=/usr/bin:/usr/sbin:/usr/local/bin
assigns the value
/usr/bin:/usr/sbin:/usr/local/bin
to the shell variable named PATH. Shell variables are typeless and can hold both arithmetic and non-numeric values.
Shell variables are widely used within shell scripts, because they provide a convenient way of transferring values from one command to another. Programs can obtain the value of a shell variable and use the value to modify their operation, in much the same way they use the value of command-line arguments.
You can see a list of shell variables by issuing the env command. Usually, the command produces more than a single screen of output. So, you can use a pipe redirector and the more command to view the output one screen at a time:
env | more
Press the Space bar to see each successive page of output. The listing typically includes common variables such as HOME, PATH, SHELL, and TERM.
You can use the value of a shell variable in a command by preceding the name of the shell variable by a dollar sign ($). To avoid confusion with surrounding text, you can enclose the name of the shell variable within curly braces ({}); it's good practice (though not necessary) to do so consistently. For example, you can change the current working directory to your home directory by issuing the command:
cd ${HOME}
An easy way to see the value of a shell variable is to specify the variable
as the argument of the echo command. For example, to see the
value of the PATH
shell variable, issue the command:
echo $PATH
To make the value of a shell variable available not just to the shell, but to programs invoked by using the shell, you must export the shell variable. To do so, use the export command, which has the form:
export variable
where variable
specifies the
name of the variable to be exported. A shorthand form of the command lets
you assign a value to a shell variable and export the variable in a single
command:
export variable=value
You can remove the value associated with shell variable by giving the variable an empty value:
variable=
However, a shell variable with an empty value remains a shell variable and appears in the output of the set command. To dispense with a shell variable, you can issue the unset command:
unset variable
Once you unset the value of a variable, the variable no longer appears in the output of the set command.
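The assignment, export, and unset operations can be seen together in one short session; the variable name DEMO_MSG is invented for the example:

```shell
#!/bin/sh
DEMO_MSG=hello                        # assign; note: no spaces around =
sh -c 'echo "child: [$DEMO_MSG]"'     # not exported: the child shell sees nothing
export DEMO_MSG
sh -c 'echo "child: [$DEMO_MSG]"'     # exported: the child sees the value
DEMO_MSG=                             # empty value: still a shell variable
unset DEMO_MSG                        # now the variable is gone entirely
echo "after unset: [${DEMO_MSG-gone}]"
```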
The special shell variable PATH
holds a series of paths
known collectively as the search path. Whenever you issue an external
command, the shell searches paths that comprise the search path, seeking
the program file that corresponds to the command. The startup scripts establish
the initial value of the PATH
shell variable, but you can modify
its value to include any desired series of paths. You must use a colon (:)
to separate each path of the search path.
For example, suppose that PATH has the following value:
/usr/bin:/bin:/usr/local/bin:/usr/bin/X11:/usr/X11R6/bin
You can add a new search directory, say /opt/bin, with the following command:
PATH=$PATH:/opt/bin/
Now, the shell will look for external programs in /opt/bin/ as well as the default directories. However, it will look there last. If you prefer to check /opt/bin first, issue the following command instead:
PATH=/opt/bin:$PATH
The which command helps
you work with the PATH
shell variable. It checks the search
path for the file specified as its argument and prints the name of the matching
path, if any. For example, suppose you want to know where the program file
for the wc command resides. Issuing the command:
which wc
will tell you that the program file is /usr/bin/wc, or whatever other path is correct for your system.
By enclosing a command argument within single quotes, you can prevent the shell from expanding any special characters inside the string. Four characters receive special treatment in quoting:
' -- single quote: text between single quotes is taken literally, with no expansion
" -- double quote: text between double quotes is protected, except that shell variables are still expanded
` -- back quote: text between back quotes is executed as a command and replaced by its output
\ -- backslash: the single character that follows is taken literally
To see this in action, consider how you might cause the echo
command to produce the output $PATH
. If you simply
issue the command:
echo $PATH
the echo command will print the value of the PATH
shell variable. However, by enclosing the argument within single quotes,
you obtain the desired result:
echo '$PATH'
Double quotes permit the expansion of shell variables.
Back quotes operate differently; they let you execute a command and use its output as an argument of another command. For example, the command:
echo My home directory contains `ls ~ | wc -l` files.
prints a message that gives the number of files in the user's home directory. The command works by first executing the command contained within back quotes:
ls ~ | wc -l
This command, as explained earlier, computes and prints the number of files in the user's directory. Because the command is enclosed in back quotes, its output is not printed; instead the output replaces the original back quoted text. When executed, this command prints the output:
My home directory contains 22 files.
This section explains how more advanced shell scripts work. The information is also adequate to equip you to write many of your own useful shell scripts.
The ((...)) command is equivalent to the let command, except that all characters between the (( and )) are treated as a quoted arithmetic expression. This is more convenient than let, because many of the arithmetic operators otherwise have special meaning to the Korn shell. The following commands are equivalent:
$ let "X=X + 1"
$ ((X=X + 1))
Older scripts often perform the same operation with the external expr command:
$ X=`expr $X + 1`
In tests on a few systems, the built-in let performed the same operation 35-60 times faster than expr! That is quite a difference.
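The three forms can be compared side by side (bash syntax; X is just an example variable):

```shell
#!/bin/bash
# Increment a variable three ways: let and ((...)) are shell built-ins,
# while expr is an external program and therefore far slower inside loops.
X=1
let "X=X + 1"
echo $X            # 2
((X=X + 1))
echo $X            # 3
X=`expr $X + 1`
echo $X            # 4
```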
You can easily write scripts that process arguments, because a set of special shell variables holds the values of arguments specified when your script is invoked.
For example, here's a simple one-line script that prints the value of its second argument:
echo My second argument has the value $2.
Suppose you store this script in the file second, change its access mode to permit execution, and invoke it as follows:
./second a b c
The script will print the output:
My second argument has the value b.
The special shell variables related to commands and their arguments are:
$0 -- the command name
$1, $2, ..., $9 -- the individual arguments of the command
$* -- the entire list of arguments, treated as a single word
$@ -- the entire list of arguments, treated as a series of words
$? -- the exit status of the previous command; the value 0 denotes successful completion
$$ -- the process ID of the current process
Notice that the shell provides variables for accessing only nine
arguments. Nevertheless, you can access more than nine arguments. The
key to doing so is the shift command, which discards the
value of the first argument and shifts the remaining values down one
position. Thus, after executing the shift command, the shell
variable $9
contains the value of the tenth argument. To
access the eleventh and subsequent arguments, you simply execute the
shift command the appropriate number of times.
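A sketch of the technique, using set -- to simulate eleven positional parameters (the values a through k are invented):

```shell
#!/bin/sh
# The shell names only $1..$9, but shift slides later arguments into view.
set -- a b c d e f g h i j k        # simulate eleven positional parameters
echo "ninth: $9"                    # i
shift
echo "ninth after one shift: $9"    # j -- the original tenth argument
shift
echo "ninth after two shifts: $9"   # k -- the original eleventh argument
```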
The shell variable $?
holds the numeric exit status
of the most recently completed command. By convention, an exit status
of zero denotes successful completion; other values denote error conditions
of various sorts.
You can set the error code in a script by issuing the exit command, which terminates the script and posts the specified exit status. The format of the command is:
exit status
where status
is a non-negative
integer that specifies the exit status.
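A short demonstration of $? and exit, with the child script inlined via sh -c:

```shell
#!/bin/sh
true                  # a command that always succeeds
echo "true:  $?"      # 0
false                 # a command that always fails
echo "false: $?"      # 1
sh -c 'exit 3'        # a child script posting its own exit status
echo "child: $?"      # 3
```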
A shell script can employ conditional logic, which lets the script take different action based on the values of arguments, shell variables, or other conditions. The test command lets you specify a condition, which can be either true or false. Conditional commands (including the if, case, while, and until commands) use the test command to evaluate conditions.
The test command evaluates its arguments and sets the exit status to 0, which indicates that the specified condition was true, or a non-zero value, which indicates that the specified condition was false. Some commonly used argument forms used with the test command:
-d file -- file is a directory
-e file -- file exists
-r file -- file is readable
-s file -- file exists and has nonzero size
-w file -- file is writable
-x file -- file is executable
-L file -- file is a symbolic link
f1 -nt f2 -- file f1 is newer than file f2
f1 -ot f2 -- file f1 is older than file f2
-n s1 -- string s1 has nonzero length
-z s1 -- string s1 has zero length
s1 = s2 -- string s1 is the same as string s2
s1 != s2 -- string s1 is not the same as string s2
n1 -eq n2 -- integer n1 is equal to integer n2
n1 -ge n2 -- integer n1 is greater than or equal to integer n2
n1 -gt n2 -- integer n1 is greater than integer n2
n1 -le n2 -- integer n1 is less than or equal to integer n2
n1 -lt n2 -- integer n1 is less than integer n2
n1 -ne n2 -- integer n1 is not equal to integer n2
! -- not operator, which reverses the value of the following condition
-a -- and operator, which joins two conditions; both conditions must be true for the overall result to be true
-o -- or operator, which joins two conditions; if either condition is true, the overall result is true
\( ... \) -- groups conditions; the parentheses must be escaped as \( and \)
To see the test command in action, consider the following script:
test -d $1
echo $?
This script tests whether its first argument specifies a directory and displays the resulting exit status, a zero or a non-zero value that reflects the result of the test.
Suppose the script were stored in the file tester, with execute permission. Executing the script might yield results similar to the following:
$ ./tester /
0
$ ./tester /missing
1
These results indicate that the / directory exists and that the /missing directory does not exist.
The test command is not of much use by itself, but combined with commands such as the if command, it is useful indeed. The if command has the following form:
if command
then
commands
else
commands
fi
Usually the command that immediately follows the word if is a test command. However, this need not be so. The if command merely executes the specified command and tests its exit status. If the exit status is 0, the first set of commands is executed; otherwise the second set of commands is executed. An abbreviated form of the if command does nothing if the specified condition is false:
if command
then
commands
fi
When you type an if command, it occupies several lines; nevertheless it's considered a single command. To underscore this, the shell provides a special prompt (called the secondary prompt) after you enter each line. Often, scripts are entered by using a text editor; when you enter a script using a text editor you don't see the secondary prompt, or any other shell prompt for that matter.
As an example, suppose you want to delete a file
file1
if it's older than another
file file2
. The following command
would accomplish the desired result:
if test file1 -ot file2
then
rm file1
fi
You could incorporate this command in a script that accepts arguments specifying the filenames:
if test $1 -ot $2
then
rm $1
echo Deleted the old file.
fi
If you name the script riddance and invoke it as follows:
riddance thursday wednesday
the script will delete the file thursday if that file is older than the file wednesday.
The case command provides a more sophisticated form of conditional processing:
case value in
pattern1)
commands
;;
pattern2)
commands
;;
...
esac
The case command attempts to match the specified value against a series of patterns. The commands associated with the first matching pattern, if any, are executed. Patterns are built using characters and metacharacters, such as those used to specify command arguments. As an example, here's a case command that interprets the value of the first argument of its script:
case $1 in
-r)
echo Force deletion without confirmation
;;
-i)
echo Confirm before deleting
;;
*)
echo Unknown argument
;;
esac
The command echoes a different line of text, depending on the value of the script's first argument. As done here, it's good practice to include a final pattern that matches any value.
The while command lets you execute a series of commands iteratively (that is, repeatedly) so long as a condition tests true:
while command
do
commands
done
Here's a script that uses a while command to print its arguments on successive lines:
echo $1
while shift 2> /dev/null
do
echo $1
done
The commands that comprise the do part of a while (or another loop command) can include if commands, case commands, and even other while commands. However, scripts rapidly become difficult to understand when this occurs often. You should include conditional commands within other conditional commands only with due consideration for the clarity of the result. Include a comment command (#) to clarify difficult constructs.
The until command lets you execute a series of commands iteratively (that is, repeatedly) so long as a condition tests false:
until command
do
commands
done
Here's a script that uses an until command to print its arguments on successive lines, until it encounters an argument that has the value red:
until test $1 = red
do
echo $1
shift
done
For example, if the script were named stopandgo and stored in the current working directory, the command:
./stopandgo green yellow red blue
would print the lines:
green
yellow
The for command iterates over the elements of a specified list:
for variable in list
do
commands
done
Within the commands, you can reference the current element of the list by means of the shell variable $variable, where variable is the name specified following the for. The list typically takes the form of a series of arguments, which can incorporate metacharacters. For example, the following for command:
for i in 2 4 6 8
do
echo $i
done
prints the numbers 2, 4, 6, and 8 on successive lines.
A special form of the for command iterates over the arguments of a script:
for variable
do
commands
done
For example, the following script prints its arguments on successive lines:
for i
do
echo $i
done
The break and continue commands are simple commands that take no arguments. When the shell encounters a break command, it immediately exits the body of the enclosing loop (while, until, or for). When the shell encounters a continue command, it immediately stops the current iteration of the loop. If the loop condition permits, other iterations may occur; otherwise the loop is exited.
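A minimal loop showing both commands in action:

```shell
#!/bin/sh
# continue skips one iteration; break leaves the loop entirely.
for i in 1 2 3 4 5
do
    if test $i -eq 2
    then
        continue       # skip printing 2
    fi
    if test $i -eq 4
    then
        break          # stop before printing 4 and 5
    fi
    echo $i
done
```

The script prints 1 and 3 on successive lines.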
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: June 04, 2016