Linux Blog

Shell Script Sundays

Filed under: Shell Script Sundays,The Linux Blog News — at 4:30 am on Sunday, October 10, 2010

I’ve been trying to keep this blog up for a number of years now. The problem is, I think I over-committed when I created the Shell Script Sundays column. I’ve been keeping it going as best I can, and will continue to do so, but I’m not going to be able to do it every week. As you may have noticed, I haven’t been doing it on a regular basis anyway. I do have some queued up for every other week, but I don’t know how long I’ll be able to keep that up. There is also only so much you can write about shell scripts. If anyone wants to step up and write some shell scripting articles, you’re more than welcome to do so.

This will be my 70th article since I started. I doubt anyone will even notice, but if you’d like to see the column make a comeback, let me know.

That is all.

Bash Scripting Dry Run

Filed under: Quick Linux Tutorials,Shell Script Sundays — at 1:37 am on Sunday, September 12, 2010

When modifying or creating a script, it is often desirable not to actually perform its actions yet. In that case it is nice to be able to do a dry run, similar to --dry-run for make.

When I’m making a script I’ve learned the hard way to:

  1. Make backups beforehand
  2. Make backups while performing operations
  3. Perform mock dry runs by using echo liberally.

So next time you’re trying to do something complex and don’t want to do something goofy, stick an echo in there before running it and save yourself some time.
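The echo trick can even be wired into a script as a toggle. This is just a sketch; the DRY_RUN variable and the run helper are names I made up, not a standard:

```shell
#!/bin/sh
# When DRY_RUN is 1 (the default here), destructive commands are echoed
# instead of executed; set DRY_RUN=0 to run them for real.
DRY_RUN=${DRY_RUN:-1}

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "DRY RUN: $*"
    else
        "$@"
    fi
}

# Example invocations: with DRY_RUN=1 these only print what they would do
run rm -f /tmp/important-file
run cp config config.bak
```

Once the printed commands look right, flip DRY_RUN to 0 and run it again.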


Remove lines that are in another file

Filed under: Shell Script Sundays — at 6:52 pm on Sunday, February 7, 2010

I had an issue this week where I needed to remove lines from one file if they existed in another file. Looking back it was frustrating, as such a task should be simple.

I tried all sorts of things: diffing the two files, using grep to grab the lines I wanted, and so on. Whatever I tried just did not produce the expected results. Thanks to a buddy I found the solution, which ended up being to sort the two files before using diff.

Assume two files, File_1 and File_2: File_1 contains the lines a, b, c and d, and File_2 contains b and d. If we want to remove b and d from File_1 because they exist in File_2, you could use something like this:

owen@linuxblog:~$ cat File_1.txt
owen@linuxblog:~$ cat File_2.txt
owen@linuxblog:~$ diff File_1.txt File_2.txt | grep \< | cut -d \  -f 2

That’s all fine and dandy until File_2.txt contains the same lines in a different order. Running the same command produces different results. See Below:

owen@linuxblog:~$ cat File_2.txt
owen@linuxblog:~$ diff File_1.txt File_2.txt | grep \< | cut -d \  -f 2

The solution, as noted above, is to sort both files first and then diff them:

owen@linuxblog:~$ sort File_1.txt > File_1-sorted; sort File_2.txt > File_2-sorted;
owen@linuxblog:~$ diff File_1-sorted File_2-sorted | grep \< | cut -d \  -f 2

Obviously the example has been simplified; when dealing with thousands of lines the sort could take a while. With that said, I’m sure there are more efficient ways to achieve the same results, and I wouldn’t doubt there being a command better suited to the job. Have at it in the comments.
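For what it’s worth, two candidates for that better-suited command are comm, which compares sorted files directly, and grep with a pattern file. A sketch using the a/b/c/d example from above (the sample files are created inline so you can try it):

```shell
# Recreate the example files: File_1 has a,b,c,d; File_2 has b,d
printf 'a\nb\nc\nd\n' > File_1.txt
printf 'b\nd\n' > File_2.txt

# comm -23 prints lines unique to the first file; both inputs must be
# sorted, which process substitution (a bashism) handles inline
comm -23 <(sort File_1.txt) <(sort File_2.txt)

# Alternative with no sorting at all: treat File_2 as a list of
# fixed-string (-F), whole-line (-x) patterns and invert the match (-v)
grep -vxFf File_2.txt File_1.txt
```

Both print a and c, i.e. File_1 minus the lines in File_2, and neither cares what order File_2 happens to be in.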

BackTrack Persistent “Press Enter Fix”

Filed under: General Linux,Shell Script Sundays — at 5:37 pm on Sunday, November 1, 2009

One of the things that has irritated me about the persistent USB thumb drive installs is the “Press Enter” to continue prompt on shutdown. Luckily, if you have persistent working correctly, the fix for this is easy.

Open up /etc/rc0.d/S89casper and search for ENTER. You are looking for the shutdown string:

echo "Please remove the disc and close the tray (if any) then press ENTER: " > /dev/console

For me, this was on line 89. Comment out that line and the lines below it, down to the ending bracket of the do_stop() routine.

Reboot and see if you still get that annoying message. If not, you’re good to go.
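If your line numbers differ, grep -n will find them for you. Here is the same edit rehearsed on a scratch copy; the demo file below just mimics the string quoted above, and the 2,3 range is whatever grep reports on your system:

```shell
# Build a scratch stand-in for /etc/rc0.d/S89casper
cat > /tmp/S89casper.demo <<'EOF'
do_stop() {
    echo "Please remove the disc and close the tray (if any) then press ENTER: " > /dev/console
    read x < /dev/console
}
EOF

# Find the line number of the prompt
grep -n 'press ENTER' /tmp/S89casper.demo

# Comment out the prompt and the read below it, keeping a .bak copy
sed -i.bak '2,3 s/^/#/' /tmp/S89casper.demo
```

Run the same grep and sed against the real S89casper once the numbers check out.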

Remotely Changing Windows Volume

Filed under: Shell Script Sundays — at 2:41 pm on Sunday, October 18, 2009

This is not really “shell scripting”, but the end result is some more bash scripts in my bin directory, so what the hell? It’s going in the shell script section because it’s Sunday. So what?

I like to listen to music on my Windows box while I work on my Linux box. Online radio and other sounds just get in the way too much. One of the things I wanted to do for a while was remotely control my volume, so I didn’t have to use my KVM to switch over to change the volume whenever anyone came into my office.

It’s actually pretty easy to control your Windows volume from Linux.

At first I thought I’d create a dummy audio device and somehow map it over. Then I figured that was overkill and I’d try something a bit easier. I have SSH via Cygwin, so all I needed was a way to control the volume locally, and I could execute the command over SSH. Since no volume utilities jumped out at me when I looked through the Cygwin repositories, I went to look for something else.

NirCmd is an awesome utility, giving me and other Windows users the ability to do things that Linux users may take for granted; you can read about it here. After installing it, and making sure that my corporate AV didn’t throw a hissy fit, it was just a matter of dumping some scripts in my bin directory and chmodding them so they would run.

Here is what they look like:

Volume Down Script: ssh windowsbox -l owen -C "nir changesysvolume -2000"

Volume Up Script: ssh windowsbox -l owen -C "nir changesysvolume 2000"

Mute: ssh windowsbox -l owen -C "nir mutesysvolume 1"

Unmute: ssh windowsbox -l owen -C "nir mutesysvolume 0"

Real simple, and the mute/unmute really comes in handy when someone walks into my office.
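The four one-liners could also be folded into a single wrapper. This is just a sketch of my own; the windowsbox host, owen user, and nir commands come straight from the scripts above:

```shell
# winvol - control the Windows box volume over SSH.
# Usage: winvol up|down|mute|unmute
winvol() {
    host=windowsbox
    user=owen
    case "$1" in
        up)     cmd="nir changesysvolume 2000" ;;
        down)   cmd="nir changesysvolume -2000" ;;
        mute)   cmd="nir mutesysvolume 1" ;;
        unmute) cmd="nir mutesysvolume 0" ;;
        *)      echo "usage: winvol up|down|mute|unmute" >&2; return 1 ;;
    esac
    ssh "$host" -l "$user" -C "$cmd"
}
```

Then `winvol mute` when someone walks in, instead of remembering four separate scripts.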

Searching for multiple strings with grep

Filed under: Shell Script Sundays — at 9:36 pm on Sunday, August 23, 2009

Sometimes when using grep it’s nice to be able to search for more than one string in a file. It’s actually pretty easy to do, if you can remember the simple syntax. Basically, you put the terms in double quotes separated by a pipe, and escape the pipe.

grep "gnome\|kde" install.log

The above example will search for gnome or kde. It works for more than two terms as well, and also works with the invert match -v option, to exclude lines that contain gnome or kde.
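If escaping the pipe is hard to remember, grep has two equivalent spellings that avoid it entirely (the sample install.log below is made up for the demo):

```shell
# Stand-in for the post's install.log
printf 'gnome-shell\nxterm\nkde-base\n' > install.log

# -E switches to extended regular expressions, where | is an
# alternation operator with no escaping needed
grep -E 'gnome|kde' install.log

# Or give each pattern its own -e flag; grep matches any of them
grep -e gnome -e kde install.log
```

Both print the gnome-shell and kde-base lines and skip xterm, exactly like the escaped-pipe version.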

Today’s article is extra short, and late, but better late and short than none at all, I guess.

Sequences with seq

Filed under: Quick Linux Tutorials,Shell Script Sundays — at 9:28 pm on Sunday, July 19, 2009

I’m going to keep this one short, since I’ve had a serious lack of time recently. If you need to free up some time by putting repetitive commands into loops, you can use the command seq.

Basically, you can use any generic loop such as the while or for loop. Here are two examples:

for i in `seq 1 10`; do echo $i; done;
seq 1 10 | while read i; do echo $i; done;

These both count and echo from 1 to 10. Replace the echo $i; with whatever it is you want to do. You can use sleep to pause for a number of seconds, and add any other logic you wish here.
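seq also takes an increment as its middle argument, which pairs nicely with sleep for simple countdowns. The countdown itself is my example, not from the post:

```shell
# seq FIRST INCREMENT LAST: a negative increment counts down
for i in $(seq 5 -1 1); do
    echo "$i"
    # sleep 1   # uncomment for a real one-second countdown
done
echo "liftoff"
```

Swap the echo for any command and you have a poor man’s timer loop.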

A Manpage Template for your Scripts

Filed under: Shell Script Sundays — at 11:44 am on Sunday, June 21, 2009

So, you just finished that killer script and the first thing you want to do is write the man page for it, right? Not exactly? What do you mean?
Writing a man page isn’t that hard, and it will give your users an idea of how to use your script.

Here is a basic template for creating your own man pages:

.\" This is a comment
.\" Contact
.TH man 1 "21 June 2009" ".1" "Man Page Example"
.SH NAME
Man \- Example Man Page
.SH SYNOPSIS
Example Man Page
.SH DESCRIPTION
This is an example of a man page.
.SH OPTIONS
No options for this man page
.SH SEE ALSO
man(1), groff(1), man.config(5)
.SH BUGS
No known bugs for this template, except you might want to replace the quotes if you copy from my blog.
.SH AUTHOR
Owen (

When you’re done with the gruelling task of writing documentation (which your users won’t read or thank you for), just plop it in the correct man section under /usr/share/man.
That’s all there is to it!
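If you want to proofread the page before plopping it anywhere, groff can render the source straight from your working directory. The example.1 here is a throwaway file built along the lines of the template above:

```shell
# A minimal man source to preview
cat > ./example.1 <<'EOF'
.TH example 1 "21 June 2009" "1.0" "Man Page Example"
.SH NAME
example \- an example man page
.SH DESCRIPTION
This is an example of a man page.
EOF

# Render it without installing; groff writes the formatted page to stdout
# (pipe through less for anything longer than a screen)
groff -man -Tascii ./example.1

# Many man implementations will also render a path directly:
# man ./example.1
```

Once it reads right, copy it into the man1 directory and run man on it the normal way.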

Coppermine Photo gallery Upload Script

Filed under: Shell Script Sundays — at 9:15 pm on Sunday, May 31, 2009

This week I bring you a script that I helped Kaleb (who has written posts here before) write. Well, I got him started with it using curl, and he rolled with it and finished it up. Here is the script:

#!/bin/bash
# Script to Upload to
# Written by Kaleb Porter May 23 2009
# with help of
# email:
# if you wish to use this code for something else please give me credit
# Note: URL must point at your gallery's upload page and AL at the target
# album number; both values below are placeholders.
URL="http://example.com/coppermine/upload.php"
AL=1
DA=`date '+%d%b%y-%N'`
# If the user does not specify a file or url, take a screenshot instead
if [ -z "$1" ]; then
echo "You did not give a file to upload"
echo "Taking a screenshot in 3 seconds..."
sleep 3
scrot $DA.png
IMAGE=$DA.png
else
IMAGE=$1
fi
# FI is non-empty when the image is a URL rather than a local file
FI=`echo "$IMAGE" | grep '^[a-z]*://'`
#Get the title for the image from the user and change all the spaces to "%20"
echo "Enter a title for the image"
read TITLE1
TITLE=`echo $TITLE1 | sed 's/ /\%20/g'`
#Get the description for the image from the user and change the spaces to "%20"
echo "Enter a description"
read DES1
DES=`echo $DES1 | sed 's/ /\%20/g'`
#Get the keywords for the image from the user and change the spaces to "%20"
echo "Enter keywords (separated by spaces)"
read KEY1
KEY=`echo $KEY1 | sed 's/ /\%20/g'`
# If the image is a local file
if [ -z "$FI" ]; then
UNIQUE_ID=`curl -s -F control=phase_1 -F blaa=continue -F file_upload_array[]=@$IMAGE $URL | grep unique_ID | awk -F\" '{print $6}'`
#echo "Choose the NUMBER value for the album you want"
#curl -s -F control=phase_2 -F unique_ID=$UNIQUE_ID $URL | awk '/name="album"/{disp=1} {if (disp==1) print} /<\/select>/{disp=0}' | grep 'value="[0-9]"' | sed 's/<option//' | sed 's/<\/option>//' | sed 's/>//'
#read AL
curl -o /dev/null -d "control=phase_2&unique_ID=$UNIQUE_ID&album=$AL&title=$TITLE&caption=$DES&keywords=$KEY&blaa=continue" $URL
# If the image is from a URL
else
UNIQUE_ID=`curl -s -F control=phase_1 -F blaa=continue -F URI_array[]=$IMAGE $URL | grep unique_ID | awk -F\" '{print $6}'`
#echo "Choose the NUMBER value for the album you want"
#curl -s -F control=phase_2 -F unique_ID=$UNIQUE_ID $URL | awk '/name="album"/{disp=1} {if (disp==1) print} /<\/select>/{disp=0}' | grep 'value="[0-9]"' | sed 's/<option//' | sed 's/<\/option>//' | sed 's/>//'
#read AL
curl -o /dev/null -d "control=phase_2&unique_ID=$UNIQUE_ID&album=$AL&title=$TITLE&caption=$DES&keywords=$KEY&blaa=continue" $URL
fi
exit 0

If there are any questions, you can pretty much read through the Shell Script Sundays column and figure out everything you need to know. Now that the upload script works and tries to take a screenshot with scrot, the next step is a check to see if scrot exists; if it doesn’t, a check for import; and if that’s missing too, an error message.
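That scrot-or-import fallback is easy to sketch with command -v; the function name and error message here are mine, not part of Kaleb’s script:

```shell
# Take a screenshot with scrot if available, fall back to ImageMagick's
# import, and error out if neither tool is installed.
take_shot() {
    out=$1
    if command -v scrot >/dev/null 2>&1; then
        scrot "$out"
    elif command -v import >/dev/null 2>&1; then
        import -window root "$out"
    else
        echo "error: neither scrot nor import is installed" >&2
        return 1
    fi
}
```

In the upload script you would just replace the bare `scrot $DA.png` call with `take_shot $DA.png`.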

It really does amaze me what the shell is capable of, especially how mashable it is and how you can combine it with pretty much anything. This script is a great example of combining the power of the shell with the intrawebs. Well, I hope you learned something, and as always, if you have any questions, you know where the comment box is.

– Owen.

Last 50 Characters of Each line

Filed under: Shell Script Sundays — at 4:18 pm on Sunday, May 24, 2009

I got a question from a user called Bastiaan. He had found my site while searching for ‘cut from end of line Linux’ and landed on the Using cut – shellscript string manipulation article. I haven’t received a lot of feedback on it, but am happy with the feedback I have received and the number of visits it gets. As I’ve said before, even if no one else reads The Linux Blog I still use it as a reference, so I am glad people are finding it useful. Anyways, Bastiaan’s problem was that he works at a university and has a file with A LOT of DNA records in it. He needed to grab the last 50 characters of each line, regardless of the line length. After some correspondence we came up with a solution.

I have experience in doing this sort of thing in other languages such as PHP but not bash. Here is what I came up with for bash:

cat file.txt | while read i; do echo "$i" | \
cut -b $((`echo "$i" | wc -c` - 50))-; done;

While this was really quick to write, it is not the most efficient way in the world: it has to read each line, echo it out, calculate the length of the line, and subtract 50 from it. Again, it does the job, but not very gracefully.

Bastiaan then told me he reversed the whole file and then processed that with cut. I had heard of tac, which reverses the lines of an entire file, but had never heard of rev, which reverses the characters within each line. Using rev, I assumed that he was running something like the following:

rev file.txt > rev_file.txt
cat rev_file.txt | cut -c -50 | rev

That will get you the last 50 characters from each line (well, really the first 50 of each reversed line). That works pretty well, so the final solution was to streamline it a little so that it could be done in one step.

rev file.txt | cut -c -50 | rev > out.txt

So there you have it: if you’re looking to use cut to “cut” characters from the end of the line, the above will cut 50 characters off of the end. Obviously you can remove the trailing “> out.txt” to get the output on the screen.
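For the record, if you’re in bash anyway, parameter expansion can grab the tail of each line without spawning rev or cut at all. One caveat: ${line: -50} expands to an empty string when the line is shorter than 50 characters, hence the guard. The helper name and sample file are mine:

```shell
# Sample input: one short line and one 60-character line
printf 'shortline\n' > file.txt
printf 'A%.0s' {1..60} >> file.txt
printf '\n' >> file.txt

# Print the last 50 characters of each line on stdin using only
# bash built-ins; lines of 50 characters or fewer pass through whole.
last50() {
    while IFS= read -r line; do
        if (( ${#line} > 50 )); then
            echo "${line: -50}"
        else
            echo "$line"
        fi
    done
}

last50 < file.txt
```

For a huge DNA file this avoids three processes and two passes over the data, at the cost of being bash-only.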

Hope this helps someone, and thanks to Bastiaan for the question!

Bulk Editing Text Files

Filed under: Shell Script Sundays — at 1:00 am on Sunday, May 10, 2009

A co-worker wanted to edit a number of files in a directory that contained a lot of files. Each file that needed to be edited contained a function that needed to be replaced. Since it was production data, and we did not want to make a backup, run a sed find-and-replace across all files, and risk screwing something up, we decided to use vi to edit just the list of matching files. Here is what I came up with to do that:

vi `grep function_name * -n | cut -d : -f 1 | uniq`

If it were just me, I would not have wanted to type the sed find-and-replaces by hand and would have done something like this, because I’m lazy and I like to live on the edge:

 grep function_name * -n | cut -d : -f 1 | uniq | while read i; do cp $i $i-bak; sed 's/function_name/new_function_name/g' $i-bak > $i; done;

Rather than editing them with vi, it makes a -bak file and uses sed to replace function_name with new_function_name, writing from the -bak file back into the original. Some may think it’s kind of scary not making a full backup, but I figure the -bak files should be enough, depending on the operation. Make a backup if you value your data, though.
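These days sed’s -i flag can do the backup-and-replace in one pass, with the same safety net and fewer moving parts. The scratch files below are made up for the demo:

```shell
# Demo in a scratch dir so nothing real is touched: one file mentions
# function_name, one doesn't
mkdir -p /tmp/bulkedit
printf 'call function_name here\n' > /tmp/bulkedit/a.txt
printf 'nothing relevant\n'        > /tmp/bulkedit/b.txt
cd /tmp/bulkedit

# grep -l lists only the files that match; sed -i.bak edits each one
# in place and leaves a .bak copy behind as the safety net
sed -i.bak 's/function_name/new_function_name/g' $(grep -l function_name *)
```

Only a.txt gets touched, and a.txt.bak keeps the original content, so the rollback story is the same as the loop above.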
