The section was extracted from IBM LPI exam 102 prep, Topic 109 Shells, scripting, programming, and compiling by Ian Shields, 30 January 2007
This section covers material for topic 1.109.1 for the Junior Level Administration (LPIC-1) exam 102. The topic has a weight of 5.
In this section, learn how to customize your shell environment, set and export environment and shell variables, define command aliases, write bash functions, and control which startup files your shell reads.
Before the advent of graphical interfaces, programmers used a typewriter terminal or an ASCII display terminal to connect to a UNIX® system. A typewriter terminal allowed them to type commands, and the output was usually printed on continuous paper. Most ASCII display terminals had 80 characters per line and about 25 lines on the screen, although both larger and smaller terminals existed. Programmers typed a command and pressed Enter, and the system interpreted and then executed the command.
While this may seem somewhat primitive today in an era of drag-and-drop graphical interfaces, it was a huge step forward from writing a program, punching cards, compiling the card deck, and running the program. With the advent of editors, programmers could even create programs as card images and compile them in a terminal session.
The stream of characters typed at a terminal provided a standard input stream to the shell, and the stream of characters that the shell returned on either paper or display represented the standard output.
The program that accepts the commands and executes them is called a shell. It provides a layer between you and the intricacies of an operating system. UNIX and Linux shells are extremely powerful in that you can build quite complex operations by combining basic functions. Using programming constructs you can then build functions for direct execution in the shell or save functions as shell scripts so that you can reuse them over and over.
Sometimes you need to execute commands before the system has booted far enough to allow terminal connections, and sometimes you need to execute commands periodically, whether or not you are logged on. A shell can do this for you, too. The standard input and output do not have to come from or be directed to a real user at a terminal.
In this section, you learn more about shells. In particular, you learn about the bash or Bourne again shell, which is an enhancement of the original Bourne shell, along with some features from other shells and some changes from the Bourne shell to make it more POSIX compliant.
POSIX is the Portable Operating System Interface for uniX, a series of IEEE standards collectively referred to as IEEE 1003. The first of these was IEEE Standard 1003.1-1988, released in 1988. Other well-known shells include the Korn shell (ksh), the C shell (csh) and its derivative tcsh, and the Almquist shell (ash) and its Debian derivative (dash). You need to know something about many of these shells, if only to recognize when a particular script requires features from one of them.
Many aspects of your interaction with a computer will be the same from one session to another. Recall from the tutorial "LPI exam 101 prep (topic 103): GNU and UNIX commands" that when you are running in a Bash shell, you have a shell environment, which defines such things as the form of your prompt, your home directory, your working directory, the name of your shell, files that you have opened, functions that you have defined, and so on. The environment is made available to every shell process. Shells, including bash, allow you to create and modify shell variables, which you may export to your environment for use by other processes running in the shell or by other shells that you may spawn from the current shell.
Both environment variables and shell variables have a name. You reference the value of a variable by prefixing its name with '$'. Some of the common bash environment variables that are set for you are shown in Table 3.
| Name | Function |
|---|---|
| USER | The name of the logged-in user |
| UID | The numeric user id of the logged-in user |
| HOME | The user's home directory |
| PWD | The current working directory |
| SHELL | The name of the shell |
| $ | The process id (PID) of the running bash (or other) shell process |
| PPID | The process id of the process that started this process (that is, the id of the parent process) |
| ? | The exit code of the last command |
In the Bash shell, you create or set a shell variable by typing a name followed immediately by an equal sign (=). Variable names (or identifiers) are words consisting only of alphanumeric characters and underscores, beginning with an alphabetic character or an underscore. Variables are case sensitive, so var1 and VAR1 are different variables. By convention, variables, particularly exported variables, are upper case, but this is not a requirement. Technically, $$ and $? are shell parameters rather than variables. They may only be referenced; you cannot assign a value to them.
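A quick sketch of these naming rules (the variable names here are invented for illustration):

```shell
var1=value       # valid: letters and digits, starts with a letter
_tmp_dir=/tmp    # valid: a name may start with an underscore
VAR1=other       # distinct from var1: names are case sensitive
# 2var=value     # invalid: an identifier may not start with a digit
echo "$var1 $VAR1 $_tmp_dir"
```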
When you create a shell variable, you will often want to export it to the environment so it will be available to other processes that you start from this shell. Variables that you export are not available to a parent shell. You use the export command to export a variable name. As a shortcut in bash, you can assign and export in one step.

To illustrate assignment and exporting, let's run the bash command while in the Bash shell and then run the Korn shell (ksh) from the new Bash shell. We will use the ps command to display information about the command that is running.
```
[ian@echidna ian]$ ps -p $$ -o "pid ppid cmd"
  PID  PPID CMD
30576 30575 -bash
[ian@echidna ian]$ bash
[ian@echidna ian]$ ps -p $$ -o "pid ppid cmd"
  PID  PPID CMD
16353 30576 bash
[ian@echidna ian]$ VAR1=var1
[ian@echidna ian]$ VAR2=var2
[ian@echidna ian]$ export VAR2
[ian@echidna ian]$ export VAR3=var3
[ian@echidna ian]$ echo $VAR1 $VAR2 $VAR3
var1 var2 var3
[ian@echidna ian]$ echo $VAR1 $VAR2 $VAR3 $SHELL
var1 var2 var3 /bin/bash
[ian@echidna ian]$ ksh
$ ps -p $$ -o "pid ppid cmd"
  PID  PPID CMD
16448 16353 ksh
$ export VAR4=var4
$ echo $VAR1 $VAR2 $VAR3 $VAR4 $SHELL
var2 var3 var4 /bin/bash
$ exit
[ian@echidna ian]$ echo $VAR1 $VAR2 $VAR3 $VAR4 $SHELL
var1 var2 var3 /bin/bash
[ian@echidna ian]$ ps -p $$ -o "pid ppid cmd"
  PID  PPID CMD
16353 30576 bash
[ian@echidna ian]$ exit
[ian@echidna ian]$ ps -p $$ -o "pid ppid cmd"
  PID  PPID CMD
30576 30575 -bash
[ian@echidna ian]$ echo $VAR1 $VAR2 $VAR3 $VAR4 $SHELL
/bin/bash
```
Notes: In the Korn shell, the echo command displayed values only for VAR2, VAR3, and VAR4, confirming that VAR1 was not exported. Were you surprised to see that the value of the SHELL variable had not changed, even though the prompt had changed? You cannot always rely on SHELL to tell you what shell you are running under, but the ps command does tell you the actual command. Note that ps puts a hyphen (-) in front of the first Bash shell to indicate that this is the login shell.

Listing 2 shows what you might see in some of these common bash variables.
```
[ian@echidna ian]$ echo $USER $UID
ian 500
[ian@echidna ian]$ echo $SHELL $HOME $PWD
/bin/bash /home/ian /home/ian
[ian@echidna ian]$ (exit 0);echo $?;(exit 4);echo $?
0
4
[ian@echidna ian]$ echo $$ $PPID
30576 30575
```
You remove a variable from the Bash shell using the unset command. You can use the -v option to be sure that you are removing a variable definition. Functions can have the same name as variables, so use the -f option if you want to remove a function definition. Without either -f or -v, the bash unset command removes a variable definition if it exists; otherwise, it removes a function definition if one exists. (Functions are covered in more detail later in the Shell functions section.)
```
ian@attic4:~$ VAR1=var1
ian@attic4:~$ VAR2=var2
ian@attic4:~$ echo $VAR1 $VAR2
var1 var2
ian@attic4:~$ unset VAR1
ian@attic4:~$ echo $VAR1 $VAR2
var2
ian@attic4:~$ unset -v VAR2
ian@attic4:~$ echo $VAR1 $VAR2

```
The bash default is to treat unset variables as if they had an empty value, so you might wonder why you would unset a variable rather than just assign it an empty value. Bash and many other shells allow you to generate an error if an undefined variable is referenced. Use the command set -u to generate an error for undefined variables, and set +u to disable the warning. Listing 5 illustrates these points.
```
ian@attic4:~$ set -u
ian@attic4:~$ VAR1=var1
ian@attic4:~$ echo $VAR1
var1
ian@attic4:~$ unset VAR1
ian@attic4:~$ echo $VAR1
-bash: VAR1: unbound variable
ian@attic4:~$ VAR1=
ian@attic4:~$ echo $VAR1

ian@attic4:~$ unset VAR1
ian@attic4:~$ echo $VAR1
-bash: VAR1: unbound variable
ian@attic4:~$ unset -v VAR1
ian@attic4:~$ set +u
ian@attic4:~$ echo $VAR1

ian@attic4:~$
```
Note that it is not an error to unset a variable that does not exist, even when set -u has been specified.
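That last point can be verified with a one-line check (the variable name NOSUCHVAR is made up for this sketch):

```shell
# Under set -u, referencing an unset variable is an error,
# but unsetting a nonexistent variable is not.
bash -uc 'unset NOSUCHVAR && echo "unset succeeded"'
```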
When you log in to a Linux system, your id has a default shell, which is your login shell. If this shell is bash, then it executes several profile scripts before you get control. If /etc/profile exists, it is executed first. Depending on your distribution, other scripts in the /etc tree may also be executed, for example, /etc/bash.bashrc or /etc/bashrc. Once the system scripts have run, a script in your home directory is run if it exists. Bash looks for the files ~/.bash_profile, ~/.bash_login, and ~/.profile in that order. The first one found is executed.
When you log off, bash executes the ~/.bash_logout script from your home directory if it exists.
Once you have logged in and are already using bash, you may start another shell, called an interactive shell, for example, to run a command in the background. In this case, bash executes only the ~/.bashrc script, assuming one exists. It is common to check for this script in your ~/.bash_profile, so that you can execute it at login as well as when starting an interactive shell, using commands such as those shown in Listing 6.
```
# include .bashrc if it exists
if [ -f ~/.bashrc ]; then
    . ~/.bashrc
fi
```
If you want to force bash to read the profiles as if it were a login shell, start it with the --login option. Conversely, if you do not want to execute the profiles for a login shell, specify the --noprofile option.
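A quick way to check whether a given bash invocation counts as a login shell is the read-only login_shell shell option; this check is an addition, not part of the original text (--noprofile is used here so that profile scripts don't clutter the output):

```shell
# shopt -q sets only the exit status, so pair it with echo
bash --login --noprofile -c 'shopt -q login_shell && echo "login shell"'
bash -c 'shopt -q login_shell || echo "not a login shell"'
```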
Similarly, if you want to disable execution of the ~/.bashrc file for an interactive shell, start bash with the --norc option. You can also force bash to use a file other than ~/.bashrc by specifying the --rcfile option with the name of the file you want to use. Listing 7 illustrates creation of a simple file called testrc and its use with the --rcfile option. Note that the VAR1 variable is not set in the outer shell, but has been set for the inner shell by the testrc file.

Listing 7. Using the --rcfile option
```
ian@attic4:~$ echo VAR1=var1>testrc
ian@attic4:~$ echo $VAR1

ian@attic4:~$ bash --rcfile testrc
ian@attic4:~$ echo $VAR1
var1
```
In addition to the standard ways of running bash from a terminal as outlined above, bash may also be used in other ways.
Unless you source a script to run in the current shell, it will run in its own non-interactive shell, and the above profiles are not read. However, if the BASH_ENV variable is set, bash expands the value and assumes it is the name of a file. If the file exists, then bash executes the file before whatever script or command it is executing in the non-interactive shell. Listing 8 uses two simple files to illustrate this.
```
ian@attic4:~$ cat testenv.sh
#!/bin/bash
echo "Testing the environment"
ian@attic4:~$ cat somescript.sh
#!/bin/bash
echo "Doing nothing"
ian@attic4:~$ export BASH_ENV="~/testenv.sh"
ian@attic4:~$ ./somescript.sh
Testing the environment
Doing nothing
```
Non-interactive shells may also be started with the --login option to force execution of the profile files.

Bash may also be started in POSIX mode using the --posix option. This mode is similar to the non-interactive shell, except that the file to execute is determined from the ENV environment variable.
It is common in Linux systems to run bash as /bin/sh using a symbolic link. When bash detects that it is being run under the name sh, it attempts to follow the startup behavior of the older Bourne shell while still conforming to POSIX standards. When run as a login shell, bash attempts to read and execute /etc/profile and ~/.profile. When run as an interactive shell using the sh command, bash attempts to execute the file specified by the ENV variable as it does when invoked in POSIX mode. When run interactively as sh, it only uses a file specified by the ENV variable; the --rcfile option will always be ignored.
If bash is invoked by the remote shell daemon, then it behaves as an interactive shell, using the ~/.bashrc file if it exists.
The Bash shell allows you to define aliases for commands. The most common reasons for aliases are to provide an alternate name for a command, or to provide some default parameters for the command. The vi editor has been a staple of UNIX and Linux systems for many years. The vim (Vi IMproved) editor is like vi, but with many improvements. So if you are used to typing "vi" when you want an editor, but you would really prefer to use vim, then an alias is for you. Listing 9 shows how to use the alias command to accomplish this.
```
[ian@pinguino ~]$ alias vi='vim'
[ian@pinguino ~]$ which vi
alias vi='vim'
        /usr/bin/vim
[ian@pinguino ~]$ /usr/bin/which vi
/bin/vi
```
Now when you use the which command to see where the vi program lives, you get two lines of output: the first shows the alias, and the second the location of vim (/usr/bin/vim). However, if you use the which command with its full path (/usr/bin/which), you get the location of the vi command. If you guessed that this might mean that the which command itself is aliased on this system, you would be right.

You can also use the alias command to display all the aliases if you use it with no options or with just the -p option, and you can display the aliases for one or more names by giving the names as arguments without assignments. Listing 10 shows the aliases for which and vi.
```
[ian@pinguino ~]$ alias which vi
alias which='alias | /usr/bin/which --tty-only --read-alias --show-dot --show-tilde'
alias vi='vim'
```
The alias for the which command is rather curious. Why pipe the output of the alias command (with no arguments) to /usr/bin/which? If you check the man pages for the which command, you will find that the --read-alias option instructs which to read a list of aliases from stdin and report matches on stdout. This allows the which command to report aliases as well as commands from your PATH, and is so common that your distribution may have set it up as a default for you. This is a good thing to do, since the shell will execute an alias before a command of the same name. So now that you know this, you can check it using alias which. You can also learn whether this type of alias has been set for which by running which which.

Another common use for aliases is to add parameters automatically to commands, as you saw above for the --read-alias and several other parameters on the which command. This technique is often used for the root user with the cp, mv, and rm commands so that a prompt is issued before files are deleted or overwritten. This is illustrated in Listing 11.
```
[root@pinguino ~]# alias cp mv rm
alias cp='cp -i'
alias mv='mv -i'
alias rm='rm -i'
```
In the earlier tutorial "LPI exam 101 prep (topic 103): GNU and UNIX commands," you learned about command sequences or lists. You have just seen the pipe (|) operator used with an alias, and you can use command lists as well. Suppose, for a simple example, that you want a command to list the contents of the current directory and also the amount of space used by it and all its subdirectories. Let's call it the lsdu command. So you simply assign a sequence of the ls and du commands to the alias lsdu. Listing 12 shows the wrong way to do this and also the right way. Look carefully at it before you read on, and think about why the first attempt did not work.
```
[ian@pinguino developerworks]$ alias lsdu=ls;du -sh     # Wrong way
2.9M    .
[ian@pinguino developerworks]$ lsdu
a tutorial   new-article.sh   new-tutorial.sh   readme   tools   xsl
my-article   new-article.vbs  new-tutorial.vbs  schema   web
[ian@pinguino developerworks]$ alias 'lsdu=ls;du -sh'   # Right way
[ian@pinguino developerworks]$ lsdu
a tutorial   new-article.sh   new-tutorial.sh   readme   tools   xsl
my-article   new-article.vbs  new-tutorial.vbs  schema   web
2.9M    .
```
You need to be very careful to quote the full sequence that will make up the alias. You also need to be very careful about whether you use double or single quotes if you have shell variables as part of the alias. Do you want the shell to expand the variables when the alias is defined or when it is executed? Listing 13 shows the wrong way to create a custom command called mywd intended to print your working directory name.
```
[ian@pinguino developerworks]$ alias mywd="echo \"My working directory is $PWD\""
[ian@pinguino developerworks]$ mywd
My working directory is /home/ian/developerworks
[ian@pinguino developerworks]$ cd ..
[ian@pinguino ~]$ mywd
My working directory is /home/ian/developerworks
```
Remember that the double quotes cause bash to expand variables before executing a command. Listing 14 uses the alias command to show what the resulting alias actually is, from which our error is evident. Listing 14 also shows a correct way to define this alias.
```
[ian@pinguino developerworks]$ alias mywd
alias mywd='echo \"My working directory is $PWD\"'
[ian@pinguino developerworks]$ mywd
"My working directory is /home/ian/developerworks"
[ian@pinguino developerworks]$ cd ..
[ian@pinguino ~]$ mywd
"My working directory is /home/ian"
```
Success at last.
Aliases allow you to use an abbreviation or alternate name for a command or command list. You may have noticed that you can add additional things, such as the program name you are seeking with the which command. When your input is executed, the alias is expanded, and anything else you type after that is added to the expansion before the final command or list is executed. This means that you can only add parameters to the end of the command or list, and you can use them only with the final command. Functions provide additional capability, including the ability to process parameters. Functions are part of the POSIX shell definition. They are available in shells such as bash, dash, and ksh, but are not available in csh or tcsh.
In the next few paragraphs, you'll build a complex command piece-by-piece from smaller building blocks, refining it each step of the way and turning it into a function that you will further refine.
You can use the ls command to list a variety of information about directories and files in your file system. Suppose you would like a command, let's call it ldirs, that will list directory names with output like that in Listing 15.
```
[ian@pinguino developerworks]$ ldirs *[st]* tools/*a*
my dw article
schema
tools
tools/java
xsl
```
To keep things relatively simple, the examples in this section use the directories and files from the developerWorks author package (see Resources), which you can use if you'd like to write articles or tutorials for developerWorks. In these examples, we used the new-article.sh script from the package to create a template for a new article that we've called "my dw article".
At the time of writing, the version of the developerWorks author package is 5.6, so you may see differences if you use a later version. Or just use your own files and directories; the ldirs command will handle those too. You'll find additional bash function examples in the tools that come with the developerWorks author package.
Ignoring the *[st]* tools/*a* for the moment, if you use the ls command with the color options as shown in the aliases above, you will see output similar to that shown in Figure 1.
The directories are shown in dark blue in this example, but that's a bit hard to decode with the skills you have developed in this series of tutorials. Using the -l option, though, gives a clue on how to proceed: directory listings have a 'd' in the first position. So your first step might be to simply filter these from the long listing using grep as shown in Listing 16.
```
[ian@pinguino developerworks]$ ls -l | grep "^d"
drwxrwxr-x  2 ian ian 4096 Jan 24 17:06 my dw article
drwxrwxr-x  2 ian ian 4096 Jan 18 16:23 readme
drwxrwxr-x  3 ian ian 4096 Jan 19 07:41 schema
drwxrwxr-x  3 ian ian 4096 Jan 19 15:08 tools
drwxrwxr-x  3 ian ian 4096 Jan 17 16:03 web
drwxrwxr-x  3 ian ian 4096 Jan 19 10:59 xsl
```
You might consider using awk instead of grep so that in one pass you can filter the list and strip off the last part of each line, which is the directory name, as shown in Listing 17.
```
[ian@pinguino developerworks]$ ls -l | awk '/^d/ { print $NF } '
article
readme
schema
tools
web
xsl
```
The problem with the approach in Listing 17 is that it doesn't handle directories with spaces in the name, such as "my dw article". As with most things in Linux and life, there are often several ways to solve a problem, but the objective here is to learn about functions, so let's return to using grep. Another tool you learned about earlier in this series is cut, which cuts fields out of a file, including stdin. Looking back at Listing 16 again, you see eight blank-delimited fields before the filename. Adding cut to the previous command gives you output as shown in Listing 18. Note that the -f9- option tells cut to print fields 9 and above.
```
[ian@pinguino developerworks]$ ls -l | grep "^d" | cut -d" " -f9-
my dw article
readme
schema
tools
web
xsl
```
A small problem with our approach is made obvious if we try our command on the tools directory instead of on the current directory as shown in Listing 19.
```
[ian@pinguino developerworks]$ ls -l tools | grep "^d" | cut -d" " -f9-
11:25 java
[ian@pinguino developerworks]$ ls -ld tools/[fjt]*
-rw-rw-r--  1 ian ian  4798 Jan  8 14:38 tools/figure1.gif
drwxrwxr-x  2 ian ian  4096 Oct 31 11:25 tools/java
-rw-rw-r--  1 ian ian 39431 Jan 18 23:31 tools/template-dw-article-5.6.xml
-rw-rw-r--  1 ian ian 39407 Jan 18 23:32 tools/template-dw-tutorial-5.6.xml
```
The file sizes in the tools directory are wider than before, so the listing contains extra blanks, and cut interpreted the extra space as another field separator. The cut command can also cut using character positions instead of fields. Rather than counting characters, the Bash shell has lots of utilities that you can use, so you might try using the seq and printf commands to print a ruler above your long directory listing so you can easily figure where to cut the lines of output. The seq command takes up to three arguments, which allow you to print all the numbers up to a given value, all the numbers from one value to another, or all the numbers from one value, stepping by a given value, up to a third value. See the man pages for all the other fancy things you can do with seq, including printing octal or hexadecimal numbers. For now let's use seq and printf to print a ruler with positions marked every 10 characters as shown in Listing 20.
```
[ian@pinguino developerworks]$ printf "....+...%2.d" `seq 10 10 60`;printf "\n";ls -l
....+...10....+...20....+...30....+...40....+...50....+...60
total 88
drwxrwxr-x  2 ian ian 4096 Jan 24 17:06 my dw article
-rwxr--r--  1 ian ian  215 Sep 27 16:34 new-article.sh
-rwxr--r--  1 ian ian 1078 Sep 27 16:34 new-article.vbs
-rwxr--r--  1 ian ian  216 Sep 27 16:34 new-tutorial.sh
-rwxr--r--  1 ian ian 1079 Sep 27 16:34 new-tutorial.vbs
drwxrwxr-x  2 ian ian 4096 Jan 18 16:23 readme
drwxrwxr-x  3 ian ian 4096 Jan 19 07:41 schema
drwxrwxr-x  3 ian ian 4096 Jan 19 15:08 tools
drwxrwxr-x  3 ian ian 4096 Jan 17 16:03 web
drwxrwxr-x  3 ian ian 4096 Jan 19 10:59 xsl
```
Armed with this ruler, you might try ls -l | grep "^d" | cut -c40- to cut lines starting at position 40. A moment's reflection reveals that this doesn't really solve the problem either, because larger files will move the correct cut position to the right. Try it for yourself.
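Another workaround, not used in this tutorial, is to squeeze runs of blanks to a single blank with tr -s before cutting fields, so the field numbers line up again (the /tmp path and file names below are invented for illustration); the caveat is that tr -s would also squeeze repeated blanks inside a file name:

```shell
# Demo directory standing in for the article's "tools" directory
mkdir -p /tmp/squeeze_demo/java
touch /tmp/squeeze_demo/template.xml
export LC_ALL=C   # keep the traditional three-field date format

# Squeeze repeated blanks so field 9 is always the file name
ls -l /tmp/squeeze_demo | grep "^d" | tr -s ' ' | cut -d" " -f9-
```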
Sometimes called the "Swiss army knife" of the UNIX and Linux toolbox, sed is an extremely powerful editing filter that uses regular expressions. You now understand that the challenge is to strip off the first 8 words and the blanks that follow them from every line of output that begins with 'd'. You can do it all with sed: select only those lines you are interested in using the pattern-matching expression /^d/, substituting a null string for the first eight words using the substitute command s/^d\([^ ]* *\)\{8\}//. Use the -n option to print only lines that you specify with the p command as shown in Listing 21.
```
[ian@pinguino developerworks]$ ls -l | sed -ne 's/^d\([^ ]* *\)\{8\}//p'
my dw article
readme
schema
tools
web
xsl
[ian@pinguino developerworks]$ ls -l tools | sed -ne 's/^d\([^ ]* *\)\{8\}//p'
java
```
Now that you have the complex command that you want for your ldirs function, it's time to learn about making it a function. A function consists of a name followed by () and then a compound command. For now, a compound command will be any command or command list, terminated by a semicolon and surrounded by braces (which must be separated from other tokens by white space). You will learn about other compound commands in the Shell scripts section.
Note: In the Bash shell, a function name may be preceded by the word 'function', but this is not part of the POSIX specification and is not supported by more minimalist shells such as dash. In the Shell scripts section, you will learn how to make sure that a script is interpreted by a particular shell, even if you normally use a different shell.
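To make the portability point concrete, here is a minimal sketch of the two forms; the function names greet and greet_bash are made up for this example:

```shell
# POSIX form: portable to bash, dash, ksh, and other POSIX shells
greet () { echo "Hello, $1"; }

# bash 'function' keyword form: works in bash and ksh, but not in dash
function greet_bash { echo "Hello again, $1"; }

greet world        # prints: Hello, world
greet_bash world   # prints: Hello again, world
```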
Inside the function, you can refer to the parameters using the bash special variables in Table 4. You prefix these with a $ symbol to reference them as with other shell variables.
| Parameter | Purpose |
|---|---|
| 0, 1, 2, ... | The positional parameters starting from parameter 0. Parameter 0 refers to the name of the program that started bash, or the name of the shell script if the function is running within a shell script. See the bash man pages for information on other possibilities, such as when bash is started with the -c parameter. A string enclosed in single or double quotes will be passed as a single parameter, and the quotes will be stripped. In the case of double quotes, any shell variables such as $HOME will be expanded before the function is called. You will need to use single or double quotes to pass parameters that contain embedded blanks or other characters that might have special meaning to the shell. |
| * | The positional parameters starting from parameter 1. If the expansion is done within double quotes, then the expansion is a single word with the first character of the interfield separator (IFS) special variable separating the parameters, or no intervening space if IFS is null. The default IFS value is a blank, tab, and newline. If IFS is unset, then the separator used is a blank, just as for the default IFS. |
| @ | The positional parameters starting from parameter 1. If the expansion is done within double quotes, then each parameter becomes a single word, so that "$@" is equivalent to "$1" "$2" .... If your parameters are likely to contain embedded blanks, you will want to use this form. |
| # | The number of parameters, not including parameter 0. |
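The difference between "$*" and "$@" is easy to see in a tiny sketch; showargs is a made-up name, and printf repeats its format once for each remaining argument:

```shell
showargs () {
    printf 'star: <%s>\n' "$*"   # one word: parameters joined by the first IFS character
    printf 'at:   <%s>\n' "$@"   # one word per parameter
}
showargs one "two three"
# star: <one two three>
# at:   <one>
# at:   <two three>
```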
Note: If you have more than 9 parameters, you cannot use $10 to refer to the tenth one, because bash interprets $10 as $1 followed by a literal 0 (although bash does accept ${10} with braces). You must first either process or save the first parameter ($1), then use the shift command to drop parameter 1 and move all remaining parameters down 1, so that $10 becomes $9 and so on. The value of $# will be updated to reflect the remaining number of parameters.
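A small sketch of this save-then-shift pattern; the function name taketen and its parameters are invented for illustration:

```shell
taketen () {
    first=$1    # save parameter 1 before it is shifted away
    shift       # drop $1: parameter 10 is now $9, and $# drops by one
    echo "first=$first tenth=$9 count=$#"
}
taketen p1 p2 p3 p4 p5 p6 p7 p8 p9 p10
# first=p1 tenth=p10 count=9
```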
Now you can define a simple function to do nothing more than tell you how many parameters it has and display them as shown in Listing 22.
```
[ian@pinguino developerworks]$ testfunc () { echo "$# parameters"; echo "$@"; }
[ian@pinguino developerworks]$ testfunc
0 parameters

[ian@pinguino developerworks]$ testfunc a b c
3 parameters
a b c
[ian@pinguino developerworks]$ testfunc a "b c"
2 parameters
a b c
```
Now take the complex command that we tested up to this point and create a ldirs function with it, using "$@" to represent the parameters. You can enter all of the function on a single line as you did in the previous example, or bash lets you enter commands on multiple lines, in which case a semicolon will be added automatically as shown in Listing 23. Listing 23 also shows the use of the type command to display the function definition. Note from the output of type that the ls command has been replaced by the expanded value of its alias. You could use /bin/ls instead of plain ls if you needed to avoid this.
```
[ian@pinguino developerworks]$ # Enter the function on a single line
[ian@pinguino developerworks]$ ldirs () { ls -l "$@"|sed -ne 's/^d\([^ ]* *\)\{8\}//p'; }
[ian@pinguino developerworks]$ # Enter the function on multiple lines
[ian@pinguino developerworks]$ ldirs ()
> {
> ls -l "$@"|sed -ne 's/^d\([^ ]* *\)\{8\}//p'
> }
[ian@pinguino developerworks]$ type ldirs
ldirs is a function
ldirs ()
{
    ls --color=tty -l "$@" | sed -ne 's/^d\([^ ]* *\)\{8\}//p'
}
[ian@pinguino developerworks]$ ldirs
my dw article
readme
schema
tools
web
xsl
[ian@pinguino developerworks]$ ldirs tools
java
```
So what happens if you try ldirs * as shown in Listing 24?
```
[ian@pinguino developerworks]$ ldirs *
5.6
java
www.ibm.com
5.6
```
Are you surprised? You didn't find directories in the current directory, but rather second-level subdirectories. Review the man page for the ls command or our earlier tutorials in this series to understand why. Or run the find command as shown in Listing 25 to print the names of second-level subdirectories.
```
[ian@pinguino developerworks]$ find . -mindepth 2 -maxdepth 2 -type d
./tools/java
./web/www.ibm.com
./xsl/5.6
./schema/5.6
```
Using wildcards has exposed a problem with the logic in this approach. We blithely ignored the fact that ldirs without any parameters displayed the subdirectories in the current directory, while ldirs tools displayed the java subdirectory of the tools directory rather than the tools directory itself, as you would expect using ls with files rather than directories. Ideally, you should use ls -l if no parameters are given and ls -ld if some parameters are given. You can use the test command to test the number of parameters and then use && and || to build a command list that executes the appropriate command. Using the [ test expression ] form of test, your expression might look like { [ $# -gt 0 ] && /bin/ls -ld "$@" || /bin/ls -l; } | sed -ne ... .
There is a small issue with this code, though, in that if the ls -ld command doesn't find any matching files or directories, it will issue an error message and return with a non-zero exit code, thus causing the ls -l command to be executed as well. Perhaps not what you wanted. One answer is to construct a compound command for the first ls command so that the number of parameters is tested again if the command fails. Expand the function to include this, and your function should now appear as in Listing 26. Try using it with some of the parameters in Listing 26, or experiment with your own parameters to see how it behaves.
```
[ian@pinguino ~]$ type ldirs
ldirs is a function
ldirs ()
{
    {
        [ $# -gt 0 ] && {
            /bin/ls -ld "$@" || [ $# -gt 0 ]
        } || /bin/ls -l
    } | sed -ne 's/^d\([^ ]* *\)\{8\}//p'
}
[ian@pinguino developerworks]$ ldirs *
my dw article
readme
schema
tools
web
xsl
[ian@pinguino developerworks]$ ldirs tools/*
tools/java
[ian@pinguino developerworks]$ ldirs *xxx*
/bin/ls: *xxx*: No such file or directory
[ian@pinguino developerworks]$ ldirs *a* *s*
my dw article
readme
schema
schema
tools
xsl
```
At this point you might get a directory listed twice, as in the last example of Listing 26. You could extend the pipeline by piping the sed output through sort | uniq if you wish.
Starting from some small building blocks, you have now built quite a complex shell function.
The keystrokes you type at a terminal session, and also those used in programs such as FTP, are processed by the readline library and can be configured. By default, the customization file is .inputrc in your home directory, which will be read during bash startup if it exists. You can configure a different file by setting the INPUTRC variable; if it is not set, .inputrc in your home directory will be used. Many systems have a default key mapping in /etc/inputrc, so you will normally want to include these using the $include directive.
Listing 27 illustrates how you might bind your ldirs
function to the Ctrl-t key combination
(press and hold Ctrl, then press t). If you want the command to be executed with no parameters, add
\n to the end of the configuration line.
Listing 27. Binding ldirs to Ctrl-t
# My custom key mappings
$include /etc/inputrc
"\C-t": "ldirs "
The INPUTRC file can include conditional specifications. For example, your keyboard's behavior can differ depending on whether you are using emacs editing mode (the bash default) or vi mode. See the bash man pages for more details on customizing your keyboard.
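As a sketch of such a conditional specification (the key and macro are illustrative), an $if block in .inputrc can restrict a binding to emacs editing mode. Be aware that Ctrl-t is bound to transpose-chars by default in emacs mode, so a binding like this overrides that function:

```
# ~/.inputrc -- apply the macro only in emacs editing mode
$include /etc/inputrc
$if mode=emacs
"\C-t": "ldirs\n"
$endif
```

The $if mode= test checks the editing mode set with "set editing-mode"; $endif closes the conditional block.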
You will probably add your aliases and functions to your ~/.bashrc file, although you may save them
in any file you like. Wherever you save them, remember to read the file or files using the source
or .
command so that their contents are read and executed in the current
environment. If you create a script and just execute it, it will be executed in a subshell, and all your
valuable customization will be lost when the subshell exits and returns control to you.
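A minimal sketch of the difference (the function name and temporary file are illustrative):

```shell
# Running a definitions file as a script defines the function only in a
# subshell, which exits immediately; sourcing it keeps the definition.
f=$(mktemp)
cat > "$f" <<'EOF'
greet () { echo "hello from a function"; }
EOF

sh "$f"     # runs in a subshell; greet is NOT defined afterwards
. "$f"      # sourced into the current shell; greet survives
greet       # → hello from a function
```

In bash, source and . are equivalent; . is the POSIX form and works in other shells as well.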
In the next section, you learn how to go beyond simple functions. You learn how to add programming constructs such as conditional tests and looping constructs and combine these with multiple functions to create or modify Bash shell scripts.
LPI exam 102 prep, Topic 109 Shells, scripting, programming, and compiling by Ian Shields, 30 January 2007