Nobody really knows what the Bourne shell's grammar is. Even examination of the source code is little help.

Tom Duff
Please visit the Tips & Tricks section of SHELLdorado as well as the SHELLdorado Newsletter Archive, which contain a lot of useful tips.
Most shell scripts are quick 'n dirty solutions to non-complex problems. Therefore the first and most important tip I can give is: not too much zeal in optimization. Optimizing scripts for speed is usually a bad idea. If a script performs an important task but runs too slowly, convert it to a scripting language such as Perl. That is especially prudent if the script has nested loops: time consumed by repetitive operations adds up quickly. Use the time and times tools to profile computation-intensive commands.
|
Bash is not particularly efficient at handling files, so consider using more appropriate tools for this within the script, such as awk or Perl. Unless you know, or are willing to learn, Perl, awk is a much underappreciated utility that can and should be used more widely in shell scripts. That's probably the second most important tip I can give.
Try to write your scripts in a structured, coherent form, so they can be reorganized and reused as necessary. Borrow a standard header from somebody that explains the purpose of the script, and document the changes you make. Above all, use common sense.
The problem with bash is that it is pretty baroque and there are just too many features you need to remember: this command, that command, and so on to infinity. It is difficult to remember the details of each. Fortunately, bash has an on-line help feature that provides basic information about most of its built-in commands. To see the help description for a particular command, enter
help command
(for example, help alias) at the bash prompt. To see a list of bash commands for which help is available, type help at the prompt. You may access the manual page for bash by entering man bash at a UNIX prompt, but beware: it is 60 pages long and not very readable.
With bash 3.x you can reissue commands, as in the C-shell, using the arrow keys, and use Ctrl-R to search the command history. (In emacs mode you can also use Ctrl-P and Ctrl-N.)
Bash also supports "file name completion" which, if not abused, can save some typing.
Like any decent shell, bash also allows you to define aliases, and you should avoid retyping the same command twice, not only by browsing the history but by defining aliases from it. But remember that too many aliases are counterproductive: limit your repertoire to a dozen or so. For things like browsing /var/log/messages or /var/adm/messages it is better to define functions, which are a more powerful tool than aliases.
Use a separate dot file for your functions and aliases, for example an .aliases file sourced from .bashrc.
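For instance, a minimal sketch of that setup (the .aliases file name and the msgs helper are just conventions for illustration, not anything this site prescribes):

# in ~/.bashrc -- pull in aliases and functions if the file exists
if [ -f ~/.aliases ]; then
    . ~/.aliases
fi

# in ~/.aliases -- a function beats an alias once arguments get involved
msgs () {
    # show the last N lines of the system log (default 25)
    tail -n "${1:-25}" /var/log/messages
}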
Please visit Heiner Steven's SHELLdorado, the best shell scripting site on the Internet
June 24, 2015 | cyberciti.biz
... ... ...
Bash v4.0+ has inbuilt support for setting up a step value using {START..END..INCREMENT} syntax:
#!/bin/bash
echo "Bash version ${BASH_VERSION}..."
for i in {0..10..2}
do
    echo "Welcome $i times"
done

Sample outputs:

Bash version 4.0.33(0)-release...
Welcome 0 times
Welcome 2 times
Welcome 4 times
Welcome 6 times
Welcome 8 times
Welcome 10 times
... ... ...
Three-expression bash for loops syntax
This type of for loop shares a common heritage with the C programming language. It is characterized by a three-parameter loop control expression, consisting of an initializer (EXP1), a loop test or condition (EXP2), and a counting expression (EXP3).
for (( EXP1; EXP2; EXP3 ))
do
    command1
    command2
    command3
done

A representative three-expression example in bash is as follows:

#!/bin/bash
for (( c=1; c<=5; c++ ))
do
    echo "Welcome $c times"
done
... ... ...

Jadu Saikia, November 2, 2008, 3:37 pm
Nice one. All the examples are explained well, thanks Vivek.

Andi Reinbrech, November 18, 2010, 7:42 pm

seq 1 2 20

The same output can also be produced using jot:

jot - 1 20 2
The infinite loops, as everyone knows, have the following alternatives:

while true

or

while :

– Jadu
I know this is an ancient thread, but thought this trick might be helpful to someone:

Peko, July 16, 2009, 6:11 pm

For the above example with all the cuts, simply do
set `echo $line`
This will split line into positional parameters, and after the set you can simply say
F1=$1; F2=$2; F3=$3
I used this a lot many years ago on solaris with "set `date`", it neatly splits the whole date string into variables and saves lots of messy cutting :-)
… no, you can't change the FS, if it's not space, you can't use this method
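A quick illustration of the date trick mentioned above (the exact field positions depend on your date format and locale):

set `date`
# with output like "Tue Jan 12 19:18:15 IST 1999":
echo $1   # Tue  (weekday)
echo $2   # Jan  (month)
echo $3   # 12   (day of month)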
Michal Kaut July 22, 2009, 6:12 am

Hi Vivek,

Thanks for this useful topic. IMNSHO, there may be something to modify here:
=======================
Latest bash version 3.0+ has inbuilt support for setting up a step value:

#!/bin/bash
for i in {1..5}
=======================
1) The increment feature seems to belong to the version 4 of bash.
Reference: http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
Accordingly, my bash v3.2 does not include this feature. BTW, where did you read that it was 3.0+?
(I ask because you may know some good website of interest on the subject.)

2) The syntax is {from..to..step} where from, to, step are 3 integers.
Your code is missing the increment.

Note that the GNU Bash documentation may be bugged at this time, because in the GNU Bash manual you will find the syntax {x..y[incr]}, which may be a typo (missing the second ".." between y and the increment).
see http://www.gnu.org/software/bash/manual/bashref.html#Brace-Expansion

The Bash Hackers page
again, see http://bash-hackers.org/wiki/doku.php/syntax/expansion/brace
seems to be more accurate, but who knows? Anyway, at least one of them may be right… ;-)

Keep up the good work,
Thanks a million.

- Peko
Hello,

Is there a simple way to control the number formatting? I use several computers, some of which have non-US settings with comma as a decimal point. This means that

for x in $(seq 0 0.1 1)

gives 0 0.1 0.2 … 1 on some machines and 0 0,1 0,2 … 1 on others.

Is there a way to force the first variant, regardless of the language settings? Can I, for example, set the locale to US inside the script? Or perhaps some alternative to $x that would convert commas to points?
(I am sending these as parameters to another code and it won't accept numbers with commas…)

The best thing I could think of is adding

x=`echo $x | sed s/,/./`

as a first line inside the loop, but there should be a better solution? (Interestingly, the sed command does not seem to be upset by me rewriting its variable.)

Thanks,
Michal

Peko July 22, 2009, 7:27 am
To Michal Kaut:
Hi Michal,
Such output format is configured through LOCALE settings.
I tried :
export LC_CTYPE="en_EN.UTF-8"; seq 0 0.1 1
and it works as desired.
You just have to find the exact value for LC_CTYPE that fits to your systems and your needs.
Peko
Peko July 22, 2009, 2:29 pm
To Michal Kaus [2]
Ooops – ;-)

Instead of LC_CTYPE, LC_NUMERIC should be more appropriate.
(Although LC_CTYPE actually yields the same result – I tested both.)

By the way, Vivek has already documented the matter: http://www.cyberciti.biz/tips/linux-find-supportable-character-sets.html
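Building on Peko's tip, a minimal sketch that scopes the override to a single command, so the rest of the script keeps the user's locale (assumes GNU seq, which honors LC_NUMERIC):

LC_NUMERIC=C seq 0 0.1 1   # prints 0 0.1 0.2 … 1 regardless of the system locale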
Philippe Petrinko October 30, 2009, 8:35 am
To Vivek:
Regarding your last example, that is, running a loop through arguments given to the script on the command line, there is a simpler way of doing this:
# instead of:
# FILES="$@"
# for f in $FILES

# use the following syntax:
for arg
do
    # whatever you need here – try: echo "$arg"
done

Of course, you can use any variable name, not only "arg".
Philippe Petrinko November 11, 2009, 11:25 am
To tdurden:
Why wouldn't you use

1) either a [for] loop:

for old in * ; do mv ${old} ${old}.new; done

2) or the [rename] command?
Excerpt from "man rename":

RENAME(1)          Perl Programmers Reference Guide          RENAME(1)

NAME
       rename – renames multiple files

SYNOPSIS
       rename [ -v ] [ -n ] [ -f ] perlexpr [ files ]

DESCRIPTION
       "rename" renames the filenames supplied according to the rule specified
       as the first argument. The perlexpr argument is a Perl expression
       which is expected to modify the $_ string in Perl for at least some of
       the filenames specified. If a given filename is not modified by the
       expression, it will not be renamed. If no filenames are given on the
       command line, filenames will be read via standard input.

       For example, to rename all files matching "*.bak" to strip the
       extension, you might say

               rename 's/\.bak$//' *.bak
To translate uppercase names to lower, you'd use
rename 'y/A-Z/a-z/' *
- Philippe
Philippe Petrinko November 11, 2009, 9:27 pm
If you set the shell option extglob, Bash understands some more powerful patterns. Here, a pattern-list is one or more patterns, separated by the pipe symbol (|).
?() Matches zero or one occurrence of the given patterns
*() Matches zero or more occurrences of the given patterns
+() Matches one or more occurrences of the given patterns
@() Matches one of the given patterns
!() Matches anything except one of the given patterns

source: http://www.bash-hackers.org/wiki/doku.php/syntax/pattern
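A short sketch of these patterns in action (the file names are made up for illustration):

shopt -s extglob       # enable extended globbing first

ls *.@(jpg|png)        # names ending in .jpg or .png
ls !(*.txt)            # everything except names ending in .txt
ls +(ab)*              # names beginning with one or more "ab" sequences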
Philippe Petrinko November 12, 2009, 3:44 pm
To Sean:
Right, the sharper a knife is, the easier it can cut your fingers…

I mean: there are side-effects to the use of file globbing (like in [ for f in * ]): when the globbing expression matches nothing, the globbing expression is not substituted.

Then you might want to consider using the [ nullglob ] shell extension to prevent this.
see: http://www.bash-hackers.org/wiki/doku.php/syntax/expansion/globs#customization

The devil hides in detail ;-)
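A minimal sketch of the failure mode and the nullglob fix:

for f in *.nosuchext; do
    echo "got: $f"         # without nullglob: prints the literal pattern once
done

shopt -s nullglob          # unmatched globs now expand to nothing
for f in *.nosuchext; do
    echo "got: $f"         # loop body is never entered
done
shopt -u nullglob          # restore the default behavior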
Dominic January 14, 2010, 10:04 am
There is an interesting difference between the exit value for two different for looping structures (hope this comes out right):
for (( c=1; c<=2; c++ )) do echo -n "inside (( )) loop c is $c, "; done; echo "done (( )) loop c is $c"
for c in {1..2}; do echo -n "inside { } loop c is $c, "; done; echo "done { } loop c is $c"
You see that the first structure does a final increment of c, the second does not. The first is more useful IMO because if you have a conditional break in the for loop, then you can subsequently test the value of $c to see if the for loop was broken or not; with the second structure you can't know whether the loop was broken on the last iteration or continued to completion.

Dominic January 14, 2010, 10:09 am
sorry, my previous post would have been clearer if I had shown the output of my code snippet, which is:
inside (( )) loop c is 1, inside (( )) loop c is 2, done (( )) loop c is 3
inside { } loop c is 1, inside { } loop c is 2, done { } loop c is 2

Philippe Petrinko March 9, 2010, 2:34 pm
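A hedged sketch of the break-detection idea Dominic describes (the early-exit condition here is invented purely for illustration):

for (( c=1; c<=10; c++ )); do
    [ -e "chunk_$c" ] || break     # hypothetical condition that may stop early
done
if (( c <= 10 )); then
    echo "loop was broken at c=$c"
else
    echo "loop ran to completion"  # c is 11 here, one past the limit
fi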
@Dmitry
And, again, as stated many times up there, using [seq] is counter productive, because it requires a call to an external program, when you should Keep It Short and Simple, using only bash internals functions:
for ((c=1; c<21; c+=2)); do echo "Welcome $c times" ; done
(and I wonder why Vivek is sticking to that old solution which should be presented only for historical reasons when there was no way of using bash internals.
By the way, this historical recall should be placed only at topic end, and not on top of the topic, which makes newbies stick to the not-up-to-date technique ;-) )

Sean March 9, 2010, 11:15 pm
I have a comment to add about using the builtin for (( … )) syntax. I would agree the builtin method is cleaner, but from what I've noticed with other builtin functionality, I had to check the speed advantage for myself. I wrote the following files:
builtin_count.sh:
#!/bin/bash
for ((i=1;i<=1000000;i++))
do
echo "Output $i"
done
seq_count.sh:
#!/bin/bash
for i in $(seq 1 1000000)
do
echo "Output $i"
done
And here were the results that I got:
time ./builtin_count.sh

real 0m22.122s
user 0m18.329s
sys  0m3.166s

time ./seq_count.sh

real 0m19.590s
user 0m15.326s
sys  0m2.503s

The performance increase isn't too significant, especially when you are probably going to be doing something a little more interesting inside of the for loop, but it does show that builtin commands are not necessarily faster.

Andi Reinbrech November 18, 2010, 8:35 pm
The reason why the external seq is faster, is because it is executed only once, and returns a huge splurb of space separated integers which need no further processing, apart from the for loop advancing to the next one for the variable substitution.
The internal loop is a nice and clean/readable construct, but it has a lot of overhead. The check expression is re-evaluated on every iteration, and a variable on the interpreter's heap gets incremented, possibly checked for overflow etc. etc.
Note that the check expression cannot be simplified or internally optimised by the interpreter because the value may change inside the loop's body (yes, there are cases where you'd want to do this, however rare and stupid they may seem), hence the variables are volatile and get re-evaluted.
I.e. bottom line, the internal one has more overhead; the "seq" version is equivalent to either having 1000000 integers inside the script (hard coded), or reading them once from a text file with a cat. Point being that it gets executed only once and becomes static.
OK, blah blah fishpaste, past my bed time :-)
Cheers,
Andi

Anthony Thyssen June 4, 2010, 6:53 am

The {1..10} syntax is pretty useful as you can use a variable with it!

limit=10
echo {1..${limit}}
{1..10}

You need to eval it to get it to work!

limit=10
eval "echo {1..${limit}}"
1 2 3 4 5 6 7 8 9 10

'seq' is not available on ALL systems (MacOSX for example), and BASH is not available on all systems either. You are better off using the old while-expr method for maximum compatibility!

limit=10; n=1; while [ $n -le 10 ]; do echo $n; n=`expr $n + 1`; done

Alternatively, use a seq() function replacement…

# seq_count 10
seq_count() {
    i=1
    while [ $i -le $1 ]; do echo $i; i=`expr $i + 1`; done
}

# simple_seq 1 2 10
simple_seq() {
    i=$1
    while [ $i -le $3 ]; do echo $i; i=`expr $i + $2`; done
}

seq_integer() {
    if [ "X$1" = "X-f" ]
    then format="$2"; shift; shift
    else format="%d"
    fi
    case $# in
        1) i=1  inc=1  end=$1 ;;
        2) i=$1 inc=1  end=$2 ;;
        *) i=$1 inc=$2 end=$3 ;;
    esac
    while [ $i -le $end ]
    do
        printf "$format\n" $i
        i=`expr $i + $inc`
    done
}

TheBonsai June 4, 2010, 9:57 am
The Bash C-style for loop was taken from KSH93, thus I guess it's at least portable towards Korn and Z.
The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX.
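A sketch of TheBonsai's POSIX-arithmetic variant of the seq_count function above (same behavior, no expr child processes):

seq_count() {
    i=1
    while [ "$i" -le "$1" ]; do
        echo "$i"
        i=$((i + 1))    # POSIX arithmetic expansion replaces `expr`
    done
}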
Philippe Petrinko June 4, 2010, 10:15 am
Right Bonsai,
( http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_06_04 )

But the C-style FOR does not seem to be POSIXLY-correct…

Read the on-line reference, issue 6/2004:

Top is here: http://www.opengroup.org/onlinepubs/009695399/mindex.html

and the Shell and Utilities volume (XCU) T.O.C. is here:
http://www.opengroup.org/onlinepubs/009695399/utilities/toc.html

doc is:
http://www.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap01.html

and the FOR command:
http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_09_04_03

Anthony Thyssen June 6, 2010, 7:18 am
TheBonsai wrote…. "The seq-function above could use i=$((i + inc)), if only POSIX matters. expr is obsolete for those things, even in POSIX."
I am not certain it is in Posix. It was NOT part of the original Bourne Shell, and on some machines, I deal with Bourne Shell. Not Ksh, Bash, or anything else.
Bourne Shell syntax works everywhere! But as 'expr' is a builtin in more modern shells, then it is not a big loss or slow down.
This is especially important if writing a replacement command, such as for "seq" where you want your "just-paste-it-in" function to work as widely as possible.
I have been shell programming pretty well all the time since 1988, so I know what I am talking about! Believe me.
MacOSX has in this regard been the worst, and a very big backward step in UNIX compatibility. 2 years after it came out, its shell still did not even understand most of the normal 'test' functions. A major pain to write shell scripts that need to also work on this system.
TheBonsai June 6, 2010, 12:35 pm

Yea, the question was if it's POSIX, not if it's 100% portable (which is a difference). The POSIX base more or less is a subset of the Korn features (88, 93), pure Bourne is something "else", I know. Real portability, which means a program can go wherever UNIX went, only in C ;)

Philippe Petrinko November 22, 2010, 8:23 am
And if you want to get rid of double-quotes, use:
one-liner code:
while read; do record=${REPLY}; echo ${record}|while read -d ","; do field="${REPLY#\"}"; field="${field%\"}"; echo ${field}; done; done<data
script code, added of some text to better see record and field breakdown:
#!/bin/bash
while read
do
echo "New record"
record=${REPLY}
echo ${record}|while read -d ,
do
field="${REPLY#\"}"
field="${field%\"}"
echo "Field is :${field}:"
done
done<data
Does it work with your data?
- PP
Philippe Petrinko November 22, 2010, 9:01 am
Of course, all the above code was assuming that your CSV file is named "data".
If you want to use anyname with the script, replace:
done<data
With:
done
And then use your script file (named for instance "myScript") with standard input redirection:
myScript < anyFileNameYouWant
Enjoy!
Philippe Petrinko November 22, 2010, 11:28 am
Well no, there is a bug: the last field of each record is not read – it needs a workaround, maybe an IFS modification! After all, that's what it was built for… :O)
Anthony Thyssen November 22, 2010, 11:31 pm
Another bug is that the inner loop is a pipeline, so you can't assign variables for use later in the script. But you can use '<<<' to break the pipeline and avoid the echo.
But this does not help when you have commas within the quotes! Which is why you needed quotes in the first place.
In any case it is a little off topic. Perhaps a new thread for reading CSV files in shell should be created.
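For what it's worth, a minimal sketch of the here-string idea Anthony mentions: feeding the inner loop with <<< keeps it in the current shell, so assignments survive (the sample record and the trailing-comma trick are illustrative):

record='"aaa","bbb","ccc"'
last=""
while read -d "," field; do
    field="${field#\"}"; field="${field%\"}"
    last=$field                  # survives the loop: no pipeline subshell
done <<< "$record,"              # trailing comma so the last field is read too
echo "last field: $last"         # last field: ccc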
Philippe Petrinko November 24, 2010, 6:29 pm
Anthony,
Would you try this one-liner script on your CSV file?

This one-liner assumes that the CSV file named [data] has __every__ field double-quoted.
while read; do r="${REPLY#\"}";echo "${r//\",\"/\"}"|while read -d \";do echo "Field is :${REPLY}:";done;done<data
Here is the same code, but for a script file, not a one-liner tweak.
#!/bin/bash
# script csv01.sh
#
# 1) Usage
# This script reads from standard input
# any CSV with double-quoted data fields
# and breaks down each field on standard output
#
# 2) Within each record (line), _every_ field MUST:
# - Be surrounded by double quotes,
# - and be separated from the preceding field by a comma
# (not the first field of course, no comma before the first field)
#
while read
do
echo "New record" # this is not mandatory-just for explanation
#
#
# store REPLY and remove opening double quote
record="${REPLY#\"}"
#
#
# replace every "," by a single double quote
record=${record//\",\"/\"}
#
#
echo ${record}|while read -d \"
do
# store REPLY into variable "field"
field="${REPLY}"
#
#
echo "Field is :${field}:" # just for explanation
done
done
This script, named here [csv01.sh], must be used so:

csv01.sh < my-csv-file-with-doublequotes
Philippe Petrinko November 24, 2010, 6:35 pm
@Anthony,

By the way, using [REPLY] in the outer loop _and_ the inner loop is not a bug.
As long as you know what you do, this is no problem; you just have to store the [REPLY] value conveniently, as this script shows.

TheBonsai March 8, 2011, 6:26 am

for ((i=1; i<=20; i++)); do printf "%02d\n" "$i"; done

nixCraft March 8, 2011, 6:37 am
+1 for printf due to portability, but you can use bashy .. syntax too
for i in {01..20}; do echo "$i"; done

TheBonsai March 8, 2011, 6:48 am
Well, it isn't portable per se, it makes it portable to pre-4 Bash versions.
I think a more or less "portable" (in terms of POSIX, at least) code would be
i=0
while [ "$((i >= 20))" -eq 0 ]; do
    printf "%02d\n" "$i"
    i=$((i+1))
done

Philip Ratzsch April 20, 2011, 5:53 am
I didn't see this in the article or any of the comments so I thought I'd share. While this is a contrived example, I find that nesting two groups can help squeeze a two-liner (once for each range) into a one-liner:
for num in {{1..10},{15..20}};do echo $num;done
Great reference article!
Philippe Petrinko April 20, 2011, 8:23 am
@Philip
Nice thing to think of, using brace nesting, thanks for sharing.

Philippe Petrinko May 6, 2011, 10:13 am

Hello Sanya,

That would be because brace expansion does not support variables. I have to check this.
Anyway, Keep It Short and Simple (KISS): here is a simple solution I already gave above:

xstart=1;xend=10;xstep=1
for (( x = $xstart; x <= $xend; x += $xstep)); do echo $x;done

Actually, POSIX compliance allows you to omit $ inside for (( )); as said before, you could also write:

xstart=1;xend=10;xstep=1
for (( x = xstart; x <= xend; x += xstep)); do echo $x;done

Philippe Petrinko May 6, 2011, 10:48 am
Sanya,
Actually brace expansion happens __before__ $ parameter expansion, so you cannot use it this way.
Nevertheless, you could overcome this as follows:
max=10; for i in $(eval echo {1..$max}); do echo $i; done
Sanya May 6, 2011, 11:42 am
Hello, Philippe
Thanks for your suggestions
You basically confirmed my findings, that bash constructions are not as simple as zsh ones.
But since I don't care about POSIX compliance, and want to keep my scripts "readable" for less experienced people, I would prefer to stick to zsh, where my simple for-loop works.

Cheers, Sanya
Philippe Petrinko May 6, 2011, 12:07 pm
Sanya,
First, you got it wrong: the solutions I gave are not related to POSIX; I just pointed out that POSIX allows you not to use $ in for (( )), which is just a little bit more readable – sort of.
Second, why do you see this less readable than your [zsh] [for loop]?
for (( x = start; x <= end; x += step)) do
    echo "Loop number ${x}"
done

It is clear that it is a loop; loop increments and limits are clear.
IMNSHO, if anyone cannot read this right, he should not be allowed to code. :-D
BFN
Anthony Thyssen May 8, 2011, 11:30 pm
If you are going to do… $(eval echo {1..$max});
you may as well use "seq" or one of the many other forms.
See all the other comments on doing for loops.

Tom P May 19, 2011, 12:16 pm
I am trying to use the variable I set in the for line to set another variable with a different extension. Couldn't get this to work and couldn't find it anywhere on the web… Can someone help?
Example:
FILE_TOKEN=`cat /tmp/All_Tokens.txt`
for token in $FILE_TOKEN
do
A1_$token=`grep $A1_token /file/path/file.txt | cut -d ":" -f2`

My goal is to take the values from the All_Tokens file and set a new variable with A1_ in front of it… This tells me that A1_ is not a command…
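A hedged sketch of one way to do what Tom asks, using eval (the technique the next section explains in detail); the file paths and the A1_ prefix come from his question, everything else is an assumption:

#!/bin/bash
for token in $(cat /tmp/All_Tokens.txt)
do
    value=$(grep "$token" /file/path/file.txt | cut -d ":" -f2)
    eval "A1_${token}=\"\$value\""    # creates a variable named A1_<token>
done
# if the token list contains "foo", $A1_foo now holds the grep result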
This section describes another of the more unusual commands in the shell: eval. Its format is as follows:
eval command-line

where command-line is a normal command line that you would type at the terminal. When you put eval in front of it, however, the net effect is that the shell scans the command line twice before executing it.[1] For the simple case, this really has no effect:
[1] Actually, what happens is that eval simply executes the command passed to it as arguments; so the shell processes the command line when passing the arguments to eval, and then once again when eval executes the command. The net result is that the command line is scanned twice by the shell.
$ eval echo hello
hello
$

But consider the following example without the use of eval:

$ pipe="|"
$ ls $pipe wc -l
|: No such file or directory
wc: No such file or directory
-l: No such file or directory
$

Those errors come from ls. The shell takes care of pipes and I/O redirection before variable substitution, so it never recognizes the pipe symbol inside pipe. The result is that the three arguments |, wc, and -l are passed to ls as arguments.
Putting eval in front of the command sequence gives the desired results:
$ eval ls $pipe wc -l
16
$

The first time the shell scans the command line, it substitutes | as the value of pipe. Then eval causes it to rescan the line, at which point the | is recognized by the shell as the pipe symbol.
The eval command is frequently used in shell programs that build up command lines inside one or more variables. If the variables contain any characters that must be seen by the shell directly on the command line (that is, not as the result of substitution), eval can be useful. Command terminator (;, |, &), I/O redirection (<, >), and quote characters are among the characters that must appear directly on the command line to have any special meaning to the shell.
For the next example, consider writing a program last whose sole purpose is to display the last argument passed to it. You needed to get at the last argument in the mycp program in Chapter 10, "Reading and Printing Data." There you did so by shifting all the arguments until the last one was left. You can also use eval to get at it as shown:
$ cat last
eval echo \$$#
$ last one two three four
four
$ last *                    Get the last file
zoo_report
$

The first time the shell scans

echo \$$#

the backslash tells it to ignore the $ that immediately follows. After that, it encounters the special parameter $#, so it substitutes its value on the command line. The command now looks like this:

echo $4

(the backslash is removed by the shell after the first scan). When the shell rescans this line, it substitutes the value of $4 and then executes echo.

This same technique could be used if you had a variable called arg that contained a digit, for example, and you wanted to display the positional parameter referenced by arg. You could simply write

eval echo \$$arg

The only problem is that just the first nine positional parameters can be accessed this way; to access positional parameters 10 and greater, you must use the ${n} construct:

eval echo \${$arg}

Here's how the eval command can be used to effectively create "pointers" to variables:

$ x=100
$ ptrx=x
$ eval echo \$$ptrx         Dereference ptrx
100
$ eval $ptrx=50             Store 50 in var that ptrx points to
$ echo $x                   See what happened
50
$
A common eval use is to build a dynamic string containing valid Unix commands and then use eval to execute the string. Why do we need eval? Often, you can build a command that doesn't require eval:

evalstr="myexecutable"
$evalstr    # execute the command string

However, chances are the above command won't work if "myexecutable" requires command-line arguments. That's where eval comes in.

Our man page says that the arguments to the eval command are "read as input to the shell and the resulting commands executed". What does that mean? Think of it as the eval command forcing a second pass so the string's arguments become the arguments of the spawned child shell.
In a previous column, we built a dynamic sed command that skipped 3 header lines, printed 5 lines, and skipped 3 more lines until the end of the file:
evalstr="sed -n '4,\${p;n;p;n;p;n;p;n;p;n;n;n;}' data.file" eval $evalstr # execute the command stringThis command fails without eval. When the sed command executes in the child shell, eval forces the remainder of the string to become arguments to the child.Possibly the coolest eval use is building dynamic Unix shell variables. The following stub script dynamically creates shell variables user1 and user2 setting them equal to the strings John and Ed, respectively:
COUNT=1
eval user${COUNT}=John
echo $user1

COUNT=2
eval user${COUNT}=Ed
echo $user2

Pasting Files with paste

Another novice asked how to line up three files line by line, sending the output to another file. Given the following:

file1: 1 2 3
file2: a b c
file3: 7 8 9

the output file should look like this:

1a7
2b8
3c9

The paste command is a ready-made solution:

paste file1 file2 file3

By default, the delimiter character between the columns is a tab key. The paste command provides a -d delimiter option. Everything after -d is treated as a list. For example, this paste rendition uses the pipe symbol and ampersand characters as a list:

paste -d"|&" file1 file2 file3

The command produces this output:

1|a&7
2|b&8
3|c&9

The pipe symbol character, |, is used between columns 1 and 2, while the ampersand, &, separates columns 2 and 3. If the list is completely used, and if the paste command contains more file arguments, then paste starts at the beginning of the list.

To satisfy our original requirement, paste provides a null character, \0, signifying no character. To prevent the shell from interpreting the character, it must also be quoted:
paste -d"\0" file1 file2 file3Process a String One Character at a TimeStill another user asked how to process a string in a shell script one character at a time. Certainly, advanced scripting languages such as Perl and Ruby can solve this problem, but the cut command's -b option, which specifies the byte position, is a simple alternative:
#!/bin/ksh mystring="teststring" length=${#mystring} count=0 until [ $count -eq $length ] do ((count+=1)) char=$(echo $mystring|cut -b"$count") echo $char doneIn the stub above, string mystring's length is determined using the advanced pattern-matching capabilities of the bash and ksh shells. Any number of external Unix commands can provide a string length, but probably the command with the smallest foot print is expr:length=$(expr "$mystring" : '.*')Also, the bash shell contains a substring expansion parameter:${parameter:offset:length}According to the bash man page, the substring expansion expands "up to length characters of parameter starting at the character specified offset". Note that the offset starts counting from zero:#!/bin/bash mystring="teststring" length=${#mystring} ol=1 offset=0 until [ $offset -eq $length ] do echo "${mystring:${offset}:${ol}}" ((offset+=1)) done # end scriptDeleting a File Named dashFinally, a novice inadvertently created a file named with the single character dash, and asked us how to delete the file. No matter how he escaped the dash in the rm command, it still was considered an rm option.
It's easy enough to create the file using the touch command:
touch -

To remove it, use a path to the file -- either full or relative. Assuming the dash file exists in the mydir directory, provide a full path to the file:

rm /pathto/mydir/-

Or if the file exists in the current directory, provide a relative path:

rm ./-

Of course, our old friend find can clobber that file everywhere:

find . -name "-" | xargs rm
developer.apple.com
Shell scripts can be powerful tools for writing software. Graphical interfaces notwithstanding, they are capable of performing nearly any task that could be performed with a more traditional language. This chapter describes several techniques that will help you write more complex software using shell scripts.
- "Using the eval Builtin for Data Structures, Arrays, and Indirection" describes how to create complex data structures in shell scripts.
- "Shell Text Formatting" tells how to do tabular layouts and use ANSI escape sequences to add color and styles to your terminal output.
- "Trapping Signals" tells how to write signal handlers in shell scripts.
- "Nonblocking I/O" and "Timing Loops" show one way to write complex interactive scripts such as games.
- "Background Jobs and Job Control" explains how to do complex tasks in the background while your script continues to execute, including how to perform some basic parallel computation. It also explains how to obtain the result codes from these jobs after they exit.
- "Application Scripting With osascript" describes how your script can interact with Mac OS X applications using AppleScript.
- "Scripting Interactive Tools Using File Descriptors" describes how you can make bidirectional connections to command-line tools.
- "Networking With Shell Scripts" describes how to use the
nc
tool (otherwise known as netcat) to write shell scripts that take advantage of TCP/IP sockets.
Once upon a time, Unix had only one shell, the Bourne shell, and when a script was written, the shell read the script and executed the commands. Then another shell appeared, and another. Each shell had its own syntax and some, like the C shell, were very different from the original. This meant that if a script took advantage of the features of one shell or another, it had to be run using that shell. Instead of typing:
doit
The user had to know to type:
/bin/ksh doit
or:
/bin/csh doit
To remedy this, a clever change was made to the Unix kernel -- now a script can be written beginning with a hash-bang (#!) combination on the first line, followed by a shell that executes the script. As an example, take a look at the following script, named doit:
#! /bin/ksh
#
# do some script here
#
In this example, the kernel reads in the script doit, sees the hash-bang, and continues reading the rest of the line, where it finds /bin/ksh. The kernel then starts the Korn shell with doit as an argument and feeds it the script, as if the following command had been issued:

/bin/ksh doit
When /bin/ksh begins reading in the script, it sees the hash-bang in the first line as a comment (because it starts with a hash) and ignores it. To be run, the full path to the shell is required, as the kernel does not search your PATH variable. The hash-bang handler in the kernel does more than just run an alternate shell; it actually takes the argument following the hash-bang and uses it as a command, then adds the name of the file as an argument to that command.

You could start a Perl script named doperl by using the hash-bang:

#! /bin/perl
# do some perl script here
If you begin by typing doperl, the kernel spots the hash-bang, extracts the /bin/perl command, then runs it as if you had typed:

/bin/perl doperl
There are two mechanisms in play that allow this to work. The first is the kernel interpretation of the hash-bang; the second is that Perl sees the first line as a comment and ignores it. This technique will not work for scripting languages that fail to treat lines starting with a hash as a comment; in those cases, it will most likely cause an error. You needn't limit your use of this method to running scripts either, although that is where it's most useful.
The following script, named helpme, types itself to the terminal when you enter the command helpme:
#! /bin/cat
vi unix editor
man manual pages
sh Bourne Shell
ksh Korn Shell
csh C Shell
bash Bourne Again Shell
This kernel trick will execute one argument after the name of the command. To hide the first line, change the file to use more by starting at line 2, but be sure to use the correct path:
#! /bin/more +2
vi unix editor
man manual pages
sh Bourne Shell
ksh Korn Shell
csh C Shell
bash Bourne Again Shell
Typing helpme as a command causes the kernel to convert this to:
/bin/more +2 helpme
Everything from line 2 onward is displayed:
helpme
vi unix editor
man manual pages
sh Bourne Shell
ksh Korn Shell
csh C Shell
bash Bourne Again Shell
etc.
You can also use this technique to create apparently useless scripts, such as a file that removes itself:
#! /bin/rm
If you named this file flagged, running it would cause the command to be issued as if you had typed:
/bin/rm flagged
You could use this in a script to indicate that you are running something, then execute the script to remove it:
#! /bin/ksh
# first refuse to run if the flagged file exists
if [ -f flagged ]
then
    exit
fi

# create the flag file
echo "#! /bin/rm" > flagged
chmod a+x flagged

# do some logic here

# unflag the process by executing the flag file
flagged
Before you begin building long commands with this technique, keep in mind that systems often have an upper limit (typically 32 characters) on the length of the code in the #! line.

Testing command line arguments and usage
When you write a shell script, arguments are commonly needed for it to function properly. In order to ensure that those arguments make sense, it's often necessary to validate them.
Testing for enough arguments is the easiest method of validation. For example, if you've created a shell script that requires two file names to operate, test for at least two arguments on the command line. To do this in the Bourne and Korn shells, check the value of $# -- a variable that contains the count of arguments, other than the command itself. It is also good practice to include a message detailing the reasons why the command failed; this is usually created in a usage function.

The script twofiles below tests for two arguments on the command line:
#! /bin/ksh
# twofile script handles two files named on the command line
# a usage function to display help for the hapless user
usage ()
{
echo "twofiles"
echo "usage: twofiles file1 file2"
echo "Processes two files"
}

# test if we have two arguments on the command line
if [ $# != 2 ]
then
usage
exit
fi

# we are ok at this point so continue processing here
A safer practice is to validate as much as you can before running your execution. The following version of twofiles checks the argument count and tests both files. If file 1 doesn't exist (if [ ! -f $1 ]), an error message is set up, a usage is displayed, and the program exits. The same is done for file 2:
#! /bin/ksh
# twofile script handles two files named on the command line
# a usage function to display help for the hapless user
# plus an additional error message if it has been filled in
usage ()
{
echo "twofiles"
echo "usage: twofiles file1 file2"
echo "Processes two files"
echo " "
echo $errmsg
}

# test if we have two arguments on the command line
if [ $# != 2 ]
then
usage
exit
fi

# test if file one exists and send an additional error message
# to usage if not found
if [ ! -f $1 ]
then
errmsg=${1}":File Not Found"
usage
exit
fi

# same for file two
if [ ! -f $2 ]
then
errmsg=${2}":File Not Found"
usage
exit
fi
# we are ok at this point so continue processing here
Note that in the Korn shell you can also use the double bracket test syntax, which is faster. The single bracket test actually calls a program named test to test the values, while the double bracket test is built into the Korn shell and does not have to call a separate program.

The double bracket test will not work in the Bourne shell:

if [[ $# != 2 ]]

or

if [[ ! -f $1 ]]

or

if [[ ! -f $2 ]]
This thorough validation can prevent later errors in the program logic when a file is suddenly found missing. Consider it good programming practice.
Searching the Past

Pressing Up lots (and lots) of times to dig out an old command gets tedious. Ctrl+R searches previous lines incrementally, but it has pitfalls of its own: Ctrl+R zip Esc doesn't find the last zip command - it also matches any line that copied, deleted, unzipped, or did anything else with a zip file! History expansion with a command name has the same problem: !gv opens gvim instead of gv.

Rather than pressing Up so many times, Up and Down can be bound to prefix history search. This is configured in .inputrc:

"\e[A": history-search-backward
"\e[B": history-search-forward

(Ctrl+P and Ctrl+N can be given the same bindings.) If these changes stop Left and Right from working, fix them like this:

"\e[C": forward-char
"\e[D": backward-char

To see what a history expansion such as !gv will actually run, press Space before Enter if necessary; this .inputrc fragment makes Space do the expansion in place:

$if Bash
Space: magic-space
$endif

Finally, Meta+O can be made to load the previous command and position the cursor for typing an option. In .inputrc:

"\M-o": "\C-p\C-a\M-f "

This macro strings together Ctrl+P (previous line), Ctrl+A (start of line), Meta+F (forward a word, past the command), and ␣ (insert a space); the sequences use the Ctrl or Meta modifiers.
modifiersI can give one practical purpose for this error redirection which I use on a regular basis. When I am searching for a file in the whole hard disk as a normal user, I get a lot of errors such as :find: /file/path: Permission deniedIn such situations I use the error redirection to weed out these error messages as follows:
# find / -iname \* 2> /dev/nullNow all the error messages are redirected to /dev/null device and I get only the actual find results on the screen.
Note: /dev/null is a special kind of file in that its size is always zero. Whatever you write to that file will just disappear. The opposite of this file is /dev/zero, which acts as an infinite source. For example, you can use /dev/zero to create a file of any size - for instance, when creating a swap file.
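As an illustration of that /dev/zero point, a minimal sketch (GNU dd syntax; size and path are arbitrary):

dd if=/dev/zero of=/tmp/swapfile bs=1M count=64   # a 64 MB file of zero bytes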
I've been using this grep invocation for years to trim comments out of config files. Comments are great but can get in your way if you just want to see the currently running configuration. I've found files hundreds of lines long which had fewer than ten active configuration lines, it's really hard to get an overview of what's going on when you have to wade through hundreds of lines of comments.
$ grep ^[^#] /etc/ntp.conf

The regex ^[^#] matches the first character of any line, as long as that character is not a #. Because blank lines don't have a first character they're not matched either, resulting in a nice compact output of just the active configuration lines.
INDEX 6) Small tricks, aliases and other bit 'n' pieces
This is a list of small ``tricks'' that can be incorporated into your own .cshrc/.login startup files.

i) Show only new MOTD (messages of the day) on login
if (-f /etc/motd ) then
cmp -s /etc/motd ~/.hushlogin
if ($status) tee ~/.hushlogin < /etc/motd
endif

ii) Changing the prompt to reflect the current directory
alias setprompt 'set prompt = "`pwd` > "'
alias cd 'chdir \!* && setprompt'
alias pushd 'pushd \!* && setprompt'
alias popd 'popd \!* && setprompt'
setprompt

iii) Searching for a particular process (given as argument)
WARNING: this is for a SunOS environment and may be different for other OS's.

alias pf 'ps auxgww|awk '\''/(^| |\(|\/)\!:1( |\)|$)/'\''|cut -c1-15,36-99'
iv) Multiline prompt
alias setprompt 'set prompt="\\
${hostname:h}:${cwd}\\
\! % "'v) Log remote (rsh) non-interactive commands executed in this account.
add something like the following to your .cshrc (non-interactive part)if ( ! $?prompt ) then
# Record the command being executed
set column = "`ps ww1 | head -1`" # figure out column from ps header
set column = `expr "$column" : '\(.*\)COMMAND' : '.*' + 1`
ps ww$$ | tail -1 | cut -c${column}- >> ~/command.log
exit
endif

vi) Csh Function Scripts.
Scripts which are executed by the current shell as if internal. This allows more complex setprompt scripts, and for scripts to change the prompt, set environment variables, or change the current directory.

# Csh function scripts
alias function 'set argv=(\!*); shift; source \!:1'

# Specific Csh function
alias setprompt function ~/bin/scripts/setprompt

# Directory of Csh functions (initialization)
foreach i (~/bin/csh.functions/*)
alias $i:t function $i
end

vii) File/Directory mailing Aliases
Mail files, binaries, and directories to other people easily
Usage: mailfile address file

alias a alias
a mailfile 'cat ~/lib/line-cut \!:2 ~/lib/line-cut |\\
/usr/ucb/mail -s "file \!:2" \!:1'
a mailuu 'uuencode \!:2 \!:2 | cat ~/lib/line-cut - ~/lib/line-cut |\\
/usr/ucb/mail -s "file \!:2.uu" \!:1'
a maildir 'tar cvf - "\!:2" | compress | uuencode "\!:2.tar.Z" |\\
cat ~/lib/line-cut - ~/lib/line-cut |\\
/usr/ucb/mail -s "file \!:2.tar.Z.uu" \!:1'# Multiple file "tar mail"
# Usage: tarmail address "subject" file...
a tarmail 'tar cvf - \!:3* | compress | uuencode tarmail.tar.Z |\\
cat ~anthony/lib/line-cut - ~anthony/lib/line-cut |\\
/usr/ucb/mail -s "\!:2" \!:1'-- miscellaneous sources
------------------------------------------------------------------------------
INDEX 7) Disclaimer: Csh Script Programming Considered Harmful

There are plenty of reasons not to use csh for script writing.
See Csh Programming Considered Harmful
ftp://convex.com/pub/csh.whynot
http://www.cit.gu.edu.au/~anthony/info/shell/csh.whynot-1.4
also
http://www.cit.gu.edu.au/~anthony/info/shell/csh.whynot.extra

This file is an attempt to explain how to make it easier and more convenient
to use it interactively. It does NOT provide a guide to using csh as a
general script writing language, and the authors recommend that it not be
used for that purpose.

But why use csh interactively?
The aliases and history list alone make it worthwhile; extra features such
as file completion, tilde expansion, and job control make it even more useful.
The tcsh command line editing and other interactive enhancements make it one
of the best interactive shells around.

There are arguably `better' shells available that can be used, but I have
found many of them lacking in some important aspect or, generally not
installed on most systems. A delivered vanilla machine however, is almost
certain to have csh. A .cshrc and .login setup can then be easily copied
and is available immediately.Faced with the choice between plain sh and bad old csh I'll take csh any
day.-- Paul Davey ([email protected])
-- Anthony Thyssen ([email protected])
When writing a shell program, you often come across some special situation that you'd like to handle automatically. This tutorial includes examples of such situations from small Bourne shell scripts. These situations include base conversion from one string to another (decimal to hex, hex to decimal, decimal to octal, and so on), reading the keyboard while in a piped loop, subshell execution, inline input, executing a command once for each file in a directory, and multiple ways to construct a continuous loop.
Part 4 of this series wraps up with a collection of shell one-liners that perform useful functions.
The dirs command output isn't that readable with many long pathnames. To make it more readable,
you could just use the -p option on dirs:
alias dirs="dirs -p"
the last argument
The arguments to a script (or function) are $1, $2, ... and can be referred to as a group by $* (or $@). But is there an easy way to refer to the last argument in the list? Try ${!#} as in:

echo ${!#}
LAST=${!#}
a=1
echo $a       # 1
a+=5          # Won't work under versions of Bash earlier than 3.1.
echo $a       # 15
a+=Hello
echo $a       # 15Hello
Here, += functions as a string concatenation operator. Note that its behavior in this particular context is different than within a let construct.
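A small sketch contrasting the two contexts:

a=1
a+=5          # string append: a is now "15"
let a+=5      # arithmetic inside let: a is now 20
echo $a       # 20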
#!/bin/bash

variable="This is a fine mess."
echo "$variable"

if [[ "$variable" =~ "T*fin*es*" ]]
# Regex matching with =~ operator within [[ double brackets ]].
then
    echo "match found"    # match found
fi
Alternatively, the script can test for the presence of i in the $- flag.
case $- in
*i*)   # interactive script
;;
*)     # non-interactive script
;;
esac
# (Thanks to "UNIX F.A.Q.", 1993)
Scripts may be forced to run in interactive mode with the -i option or with a #!/bin/bash -i header. Be aware that this may cause erratic script behavior or show error messages where no error is present.
# Append (>>) following to end of save file.
date >> $SAVE_FILE     # Date and time.
echo $0 >> $SAVE_FILE  # Script name.
echo >> $SAVE_FILE     # Blank line as separator.
# Of course, SAVE_FILE is defined and exported as an environmental variable in ~/.bashrc
# (something like ~/.scripts-run)
IFS Specifies internal field separators (normally space, tab, and new line) used to separate command words that result from command or parameter substitution and for separating words with the regular built-in command read. The first character of the IFS parameter is used to separate arguments for the $* substitution.
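As a quick illustration of read splitting on a custom IFS (the sample line is made up, and <<< is a bashism):

line="root:x:0:0:root:/root:/bin/bash"
IFS=: read user pass uid gid gecos home shell <<< "$line"
echo "$user uses $shell"     # root uses /bin/bash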
... ... ...
basename strips off the path leaving only the final component of the name, which is assumed to be the file name. If you specify suffix and the remaining portion of name contains a suffix which matches suffix, basename removes that suffix. For example
basename src/dos/printf.c .c

produces

printf

dirname returns the directory part of the full path+name combination.

This can also be done directly in bash:
basename=${file##*/}
dirname=${file%/*}
fname=${file%.rbl}
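A quick sketch of those expansions with a sample path (the .rbl suffix is just the one used above):

file=/src/dos/printf.rbl
echo "${file##*/}"    # printf.rbl       (like basename)
echo "${file%/*}"     # /src/dos         (like dirname)
echo "${file%.rbl}"   # /src/dos/printf  (suffix stripped)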
Date: Tue, 12 Jan 1999 19:18:15 +0200
From: Reuben Sumner, [email protected]

Here is a two cent tip that I have been meaning to submit for a long long time now.
If you have a large stack of CD-ROMS, finding where a particular file lies can be a time consuming task. My solution uses the locate program and associated utilities to build up a database of the CDs' contents that allows for rapid searching.
First we need to create the database, the following script does the trick nicely.
#!/bin/bash
onedisk() {
    mount /mnt/cdrom
    find /mnt/cdrom -maxdepth 7 -print | sed "s;^/mnt/cdrom;$1;" > $1.find
    eject -u cdrom
}

echo Enter name of disk in device:
read diskname
while [ -n "$diskname" ]; do
    onedisk $diskname
    echo Enter name of next disk or Enter if done:
    read diskname
done
echo OK, preparing cds.db
cat *.find | sort -f | /usr/lib/findutils/frcode > cds.db
echo Done...

Start with no CD mounted. Run the script. It will ask for a label for the CD; a short name like "sunsite1" is best. It will then quickly scan the CD, eject it and prompt for another. When you have exhausted your collection just hit enter at the prompt. A file called cds.db will be created. To make it simple to use, copy cds.db to /var/lib (or anywhere else; that is where locatedb is on my system). Now create an alias like

alias cdlocate="locate -d /var/lib/cds.db"

Now if I type "cdlocate lyx" I get

debian20_contrib/debian/hamm/contrib/binary-i386/text/lyx_0.12.0.final-0.1.deb
debian20_contrib/debian/hamm/contrib/binary-m68k/text/lyx_0.12.0.final-0.1.deb
debian20_contrib/debian/hamm/contrib/source/text/lyx_0.12.0.final-0.1.diff.gz
debian20_contrib/debian/hamm/contrib/source/text/lyx_0.12.0.final-0.1.dsc
debian20_contrib/debian/hamm/contrib/source/text/lyx_0.12.0.final.orig.tar.gz
lsa3/apps/wp/lyx-0.12.0-linux-elf-x86-libc5-bin.tar.gz
lsa3/apps/wp/lyx-0.12.0.lsm
lsa3/apps/wp/lyx-0.12.0.tar.gz
lsa4/docs/french/www.linux-france.com/lgazette/issue-28/gx/lyx
lsa4/powertools/i386/lyx-0.12.0-1.i386.rpm
lsa4/powertools/SRPMS/lyx-0.12.0-1.src.rpm
openlinux12/col/install/RPMS/lyx-0.11.32-1.i386.rpm
openlinux12/col/sources/SRPMS/lyx-0.11.32-1.src.rpm
suse53/suse/contents/lyx

In order to prevent locate from warning you that the database is old, try touch -t 010100002020 /var/lib/cds.db to set the modification date to January 1 2020.

--
Reuben
Ever wondered what's inside some of those binary files on your system (binary executables or binary data)? Several times I've gotten error messages from some command in the Solaris system, but I couldn't tell where the error was coming from because it was buried in some binary executable file.
The Solaris "strings" command lets you look at the ASCII text buried inside of executable files, and can often help you troubleshoot problems. For instance, one time I was seeing error messages like this when a user was trying to log in:
Could not set ULIMIT
I finally traced the problem down to the /bin/login command by running the "strings" command like this:
root> strings /bin/login | more
The strings command lists ASCII character sequences in binary files, and helped me determine that the "Could not set ULIMIT" error was coming from this file. Once I determined that the error message I was seeing was coming from this file, solving the problem became a simple matter.
If you're like many Solaris users and administrators, you spend a lot of time moving back and forth between directories in similar locations. For instance, you might often work in your home directory (such as "/home/al"), the /usr/local directories, web page directories, or other user's home directories in /home.
If you're often moving back-and-forth between the same directories, and you use the Bourne shell (sh) or Korn shell (ksh) as your login shell, you can use the CDPATH shell variable to save yourself a lot of typing, and quickly move between directories.
Here's a quick demo. First move to the root directory:
cd /
Next, if it's not set already, set your CDPATH shell variable as follows:
CDPATH=/usr/spool
Then, type this cd command:
cd cron
What happens? Type this and see what happened:
pwd
The result should be "/usr/spool/cron".
When you typed "cd cron", the shell looked in your local directory for a sub-directory named "cron". When it didn't find one, it searched the CDPATH variable, and looked for a "cron" sub-directory. When it found a sub-directory named cron in the /usr/spool directory, it moved you there.
You can set your CDPATH variable just like your normal PATH variable:
CDPATH=/home/al:/usr/local:/usr/spool:/home
Have you ever needed to run a series of commands, and pipe the output of all of those commands into yet another command?
For instance, what if you wanted to run the "sar", "date", "who", and "ps -ef" commands, and wanted to pipe the output of all three of those commands into the "more" command? If you tried this:
sar -u 1 5; date; who; ps -ef | more
you'll quickly find that it won't work. Only the output of the "ps -ef" command gets piped through the "more" command, and the rest of the output scrolls off the screen.
Instead, group the commands together with a pair of parentheses (and throw in a few echo statements for readability) to get the output of all these commands to pipe into the more command:
(sar -u 1 5; echo; who; echo; ps -ef; echo; date; echo) | more
Many times it's necessary to schedule programs to run at a later time. For instance, if your computer system is very busy during the day, you may need to run jobs late at night when nobody is logged on the system.

Solaris makes this very easy with the "at" command. You can use the "at" command to run a job at almost any time--later today, early tomorrow...whenever.
Suppose you want to run the program "my_2_hour_program" at ten o'clock tonight. Simply tell the at command to run the job at 10 p.m. (2200):
/home/al> at 2200
at> my_2_hour_program > /tmp/2hour.out
at> <CTRL><D>
warning: commands will be executed using /bin/ksh
job 890193600.a at Tue Mar 17 22:00:00 1998

Or suppose you'd like to run a find command at five o'clock tomorrow morning:
/home/al> at 0500 tomorrow
at> find /home > /tmp/find.out
at> <CTRL><D>
warning: commands will be executed using /bin/ksh
job 890215200.a at Wed Mar 18 05:00:00 1998

When you're at the "at" prompt, just type the command you want to run. Try a few tests with the at command until you become comfortable with the way it works.
Question: How often do you create a new directory and then move into that directory in your next command? Answer: Almost always.
I realized this trend in my own work habits, so I created a simple shell function to do the hard work for me.
md () {
mkdir -p $1 && cd $1
}

This is a Bourne shell function named "md" that works for Bourne and Korn shell users. It can be easily adapted for C shell users.
Taking advantage of the -p option of the mkdir command, the function easily creates multi-level subdirectories, and moves you into the lowest level of the directory structure. You can use the command to create one subdirectory like this:
/home/al> md docs
/home/al/docs> _

or you can create an entire directory tree and move right into the new directory like this:
/home/al> md docs/memos/internal/solaris8
/home/al/docs/memos/internal/solaris8>
Date: Fri, 15 Jan 1999 19:55:51 +0100 (CET)
From: JL Hopital, [email protected]

My English is terrible, so feel free to correct it if you decide to publish...

Hello, I am a French linuxer and here is my two cent tip. If you have many CD-ROMs and want to retrieve this_file_I'm_sure_i_have_but_can't_remember_where, it can help.
It consists of 2 small scripts using GNU utilities: updatedb and locate. Normally 'updatedb' runs every night, creating a database for all the mounted file systems, and 'locate' is used to query this system-wide database. But you can tell them where the files to index are and where to put the database. That's what my scripts do:

The first script (addcd.sh) creates a database for the CD currently mounted. You must run it once for every cdrom.

The second (cdlocate.sh) searches the databases created by addcd.sh and displays the cdname and full path of the files matching the pattern you give as parameter. So you can search for unmounted files!
To use:
Beware that locate's regular expressions have some peculiarities, 'man locate' will explain.
- create a directory and copy the 2 scripts into it:

mkdir /home/cdroms
cp addcd.sh cdlocate.sh /home/cdroms

- mount the first cdrom you want to index:

mount /mnt/cdrom

(if your mount point is different, you must adapt the script)

- run addcd.sh with a fully descriptive name for this cdrom as parameter (this description will be used as part of the database name, don't use spaces):

./addcd.sh Linux.Toolkit.Disk1.Oct.1996

It will take updatedb some time to create the database, especially if the cdrom contains many files.

- umount the cdrom and go to step 2 for all the cdroms you want, or every time you've got a new one (I have more than 70 databases created this way).

- you can now use cdlocate.sh to retrieve files:

./cdlocate.sh '*gimp*rpm'

Hope this helps, and happy linuxing!
---Cut here------------------------------
# addcd.sh
# Author: [email protected]
# Create a filename's database in $DATABASEHOME for the cd mounted
# at $MOUNTPOINT
# Example usage: addcd.sh Linux.Toolkit.Disk3.Oct.1996
# to search the databases use cdlocate.sh

CDNAME=$1
test "$CDNAME" = "" && { echo Usage:$0 name_of_cdrom ; exit 1 ; }

# the mount point for the cd-ROM
MOUNTPOINT=/mnt/cdrom

# where to put the database
DATABASEHOME=/home/cdroms

updatedb --localpaths=$MOUNTPOINT --output=$DATABASEHOME/$CDNAME.updatedb && \
echo Database added for $CDNAME
---Cut here--------------------------------
# cdlocate.sh
# Author : [email protected]
# Usage $0 pattern
# search the regular expression in $1 in the databases found in $DATABASEHOME
# to add a database for a new cd-rom , use addcd.sh

test "$*" = "" && { echo Usage:$0 pattern ; exit 1 ; }

DATABASEHOME=/home/cdroms
cd $DATABASEHOME

# get rid of locate warning: more than 8 days old
touch *.updatedb

CDROMLIST=`ls *.updatedb`
for CDROM in $CDROMLIST
do
    CDROMNAME=`basename $CDROM .updatedb`
    locate --database=$DATABASEHOME/$CDROM $@ | sed 's/^/'$CDROMNAME:'/'
done
Please visit Heiner Steven's SHELLdorado, the best shell scripting site on the Internet |
SHELLdorado - Shell Tips & Tricks
SHELLdorado - Newsletter Archive -- a lot of very useful tips
bash Cookbook Reader - Contributions browse
10 Essential UNIX-Linux Command Cheat Sheets TECH SOURCE FROM BOHOL
Top 10 Best Cheat Sheets and Tutorials for Linux - UNIX Commands
My 10 UNIX Command Line Mistakes
Solaris IAOQ (Infrequently Asked and Obscure Questions)
String expansion
#!/bin/bash

# String expansion.
# Introduced in version 2 of bash.

# Strings of the form $'xxx'
# have the standard escaped characters interpreted.

echo $'Ringing bell 3 times \a \a \a'
echo $'Three form feeds \f \f \f'
echo $'10 newlines \n\n\n\n\n\n\n\n\n\n'

exit
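Beyond demonstrating escapes, $'...' quoting is handy for putting tabs or newlines into variables without relying on echo -e; a minimal sketch:

#!/bin/bash
# Assumes bash version 2 or later, as above.
TAB=$'\t'
printf "name${TAB}size\n"   # prints a literal tab between the columns
IFS=$'\n'                   # a common idiom: make word splitting newline-only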
Indirect variable references - the new way
#!/bin/bash

# Indirect variable referencing.
# This has a few of the attributes of references in C++.

a=letter_of_alphabet
letter_of_alphabet=z

# Direct reference.
echo "a = $a"

# Indirect reference.
echo "Now a = ${!a}"
# The ${!variable} notation is greatly superior to the old "eval var1=\$$var2"

echo

t=table_cell_3
table_cell_3=24
echo "t = ${!t}"
table_cell_3=387
echo "Value of t changed to ${!t}"
# Useful for referencing members of an array or table,
# or for simulating a multi-dimensional array.
# An indexing option would have been nice (sigh).

exit 0
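For comparison, here is the "old way" mentioned in the comment above, side by side with the nameref syntax that bash 4.3 later added; a minimal sketch, assuming bash 4.3+ for the declare -n part:

#!/bin/bash
a=letter_of_alphabet
letter_of_alphabet=z

# Old eval-based indirection (works in any Bourne-compatible shell):
eval value=\$$a
echo "eval gives: $value"      # z

# Nameref (bash 4.3+ only): ref becomes an alias for letter_of_alphabet.
declare -n ref=$a
echo "nameref gives: $ref"     # z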
Using arrays and other miscellaneous trickery to deal four random hands from a deck of cards
#!/bin/bash2
# Must specify version 2 of bash, else might not work.

# Cards:
# deals four random hands from a deck of cards.

UNPICKED=0
PICKED=1

DUPE_CARD=99

LOWER_LIMIT=0
UPPER_LIMIT=51
CARDS_IN_SUITE=13
CARDS=52

declare -a Deck
declare -a Suites
declare -a Cards
# It would have been easier and more intuitive
# with a single, 3-dimensional array. Maybe
# a future version of bash will support
# multidimensional arrays.


initialize_Deck ()
{
i=$LOWER_LIMIT
until [ $i -gt $UPPER_LIMIT ]
do
  Deck[i]=$UNPICKED
  let "i += 1"
done
# Set each card of "Deck" as unpicked.
echo
}

initialize_Suites ()
{
Suites[0]=C #Clubs
Suites[1]=D #Diamonds
Suites[2]=H #Hearts
Suites[3]=S #Spades
}

initialize_Cards ()
{
Cards=(2 3 4 5 6 7 8 9 10 J Q K A)
# Alternate method of initializing array.
}

pick_a_card ()
{
card_number=$RANDOM
let "card_number %= $CARDS"
if [ ${Deck[card_number]} -eq $UNPICKED ]
then
  Deck[card_number]=$PICKED
  return $card_number
else
  return $DUPE_CARD
fi
}

parse_card ()
{
number=$1
let "suite_number = number / CARDS_IN_SUITE"
suite=${Suites[suite_number]}
echo -n "$suite-"
let "card_no = number % CARDS_IN_SUITE"
Card=${Cards[card_no]}
printf %-4s $Card
# Print cards in neat columns.
}

seed_random ()
{
# Seed random number generator.
seed=`eval date +%s`
let "seed %= 32766"
RANDOM=$seed
}

deal_cards ()
{
echo

cards_picked=0
while [ $cards_picked -le $UPPER_LIMIT ]
do
  pick_a_card
  t=$?

  if [ $t -ne $DUPE_CARD ]
  then
    parse_card $t

    u=$cards_picked+1
    # Change back to 1-based indexing (temporarily).
    let "u %= $CARDS_IN_SUITE"
    if [ $u -eq 0 ]
    then
      echo
      echo
    fi
    # Separate hands.

    let "cards_picked += 1"
  fi
done

echo

return 0
}


# Structured programming:
# entire program logic modularized in functions.

#================
seed_random
initialize_Deck
initialize_Suites
initialize_Cards
deal_cards

exit 0
#================


# Exercise 1:
# Add comments to thoroughly document this script.

# Exercise 2:
# Revise the script to print out each hand sorted in suites.
# You may add other bells and whistles if you like.

# Exercise 3:
# Simplify
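One detail of the script deserves a caveat: pick_a_card hands the card number back through the function's exit status ($?), which only works because card numbers fit in the 0-255 range of exit codes. A more general idiom (a sketch, not part of the original script) is to write the value to stdout and capture it with command substitution:

# Sketch: returning a value via stdout instead of the exit status.
pick_a_card ()
{
  echo $(( RANDOM % 52 ))   # works for any value, not just 0-255
}

card=$(pick_a_card)
echo "Picked card number $card"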