Linux Blog

Recursive MD5 Sum Script

Filed under: Shell Script Sundays — TheLinuxBlog.com at 12:08 am on Sunday, December 9, 2007

This week I wrote a shell script that searches one level deep and MD5s all of the files. I did this because I had multiple images and wanted to see which ones were the same so that I could merge them together. It's a pretty simple script, and the output is the same as md5summing a file, except that more than one sum is generated.

#MD5 Files in the directories
md5Dir () {
    echo "$directory"
    for x in "$directory"/*; do
        md5sum "$x"
    done
}
#Lists The Directories
for i in *active*; do
    [ -d "$i" ] || continue
    directory=$i
    md5Dir
done

It only goes one level deep, but that's good enough for now. I am going to make it search recursively to a depth given by the user. I would also like to make it display files that are identical at the end.
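As a sketch of that future recursive version: find can hand every file at any depth to md5sum, and sort with uniq -d can flag any sum that appears more than once (this assumes GNU md5sum; the directory argument is up to you):

```shell
#!/bin/sh
# Recursively md5sum every file under a directory (defaults to the
# current one), then list sums that occur more than once -- these
# are candidate duplicate files. A sketch, not the script above.
DIR="${1:-.}"

# md5sum every regular file, however deeply it is nested
find "$DIR" -type f -exec md5sum {} \;

echo "Duplicate sums:"
find "$DIR" -type f -exec md5sum {} \; |
    awk '{print $1}' | sort | uniq -d
```
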

It gets the job done for small directories, but if I wanted to run it on multiple large directories with lots of files in them, I would definitely redirect the output to a file, because it can be quite overwhelming. To run it, just copy the code into a file and do the following:

sh [filename]

I hope this helps someone who is trying to MD5 multiple files in different directories!

Comments Are Back!

Filed under: The Linux Blog News — TheLinuxBlog.com at 4:03 am on Saturday, December 8, 2007

Ok, I’ve decided to add comments back to The Linux Blog. The idea behind comments is that people post them for help, advice, questions, feedback on the article, or just to be nice. That was not happening, so I turned them off. Now I’ve re-enabled them, hoping that people will actually comment.

The spam problem has been fixed and we should not see any spammy comments since posters now have to be approved.

I’ll leave them on for a while and see how it does. In other news I’ve been writing like mad, trying to get some good articles written.

On the list of things to write are a couple of Shell Scripting articles, plus pieces on IP soft phones for Linux, battery life and optimization, and virtualization. I also have some tutorials I would really like to write to help people out with WordPress and other web applications that run on open source software, such as MediaWiki.

If you have any questions, or would like to request something, you can now actually just comment. So go ahead, leave a comment!

My Linux Box has a new video card!

Filed under: General Linux,Linux Hardware — TheLinuxBlog.com at 12:10 am on Friday, December 7, 2007

NVIDIA GeForce 4 Ti 4200 AGP 8X driver issue (screenshot). I’ve got a new-to-me video card to temporarily serve as my video card until I can get a cooling kit for my GeForce FX 5200. It’s an older GeForce 4 Ti 4200 AGP 8X. I thought it would be as simple as plugging it into my AGP slot, turning the computer on, and re-installing the NVIDIA driver module, but I was wrong.

The problem is, since this is an older card I have to use a legacy driver:

WARNING: The NVIDIA GeForce4 Ti 4200 with AGP8X GPU installed in this system
is supported through the NVIDIA 1.0-96xx legacy Linux graphics
drivers. Please visit http://www.nvidia.com/object/unix.html for
more information. The 100.14.11 NVIDIA Linux graphics driver will
ignore this GPU.

If you can’t see the screenshot, click it. It’s basically a prettier version of the above error message, telling me that I need to use the “NVIDIA 1.0-96xx legacy Linux Driver”. Here is the download page for the driver if you’re running into the same problem: http://www.nvidia.com/object/linux_display_x86_96.43.01.html

You can temporarily use the “nv” driver in your X.Org configuration, but be warned: it is not accelerated, so you should just use it to download the legacy driver, quit X, and then install the accelerated one. Unfortunately I could not get links or lynx to download from NVIDIA’s site because of some strange JavaScript code. I find it ironic that the Unix drivers page isn’t even compatible with the basic Unix browsers.
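For reference, switching to the unaccelerated fallback is just a one-word change in the Device section of xorg.conf (a sketch; the Identifier is whatever your existing config already uses):

```
Section "Device"
    Identifier "Card0"
    # "nv" is the unaccelerated fallback; change back to "nvidia"
    # once the legacy driver is installed
    Driver     "nv"
EndSection
```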

Slackware-Current Xfce 4.4.2 Updates

Filed under: General Linux,Linux Software — TheLinuxBlog.com at 1:33 pm on Wednesday, December 5, 2007

Slackware 12-current, Firefox, Xfce 4.4.2 screenshot. I upgraded my Slackware box to Slackware-current yesterday, and there are many updated packages. Here is the Slackware-current ChangeLog.

Included is Xfce 4.4.2, the newest Xfce release, with many bug fixes and a minor security fix in Terminal. For the entire list, you can look at the Xfce 4.4.2 changelog.

Firefox 2.0.0.11 is in this release which is good because sometimes the current repository contains older versions of Firefox.

The screenshot is just of the GetFireFox page. You will also notice the nice transparency feature of Xfce. Click it for the larger version.

Slackware MPlayer Setup

Filed under: General Linux,Linux Software,The Linux Blog News — TheLinuxBlog.com at 12:03 am on Tuesday, December 4, 2007

I’ve updated the MPlayer Installer and uploaded it.

Instructions:

1. Download The MPlayer Installer that I made (Right click Save Link As).

2. Change to the root user and run the MPlayer Installer, OR run it with sudo, as follows:

bash-3.1$ su root
 
Password:
 
bash-3.1# sh mplayer_setup.sh

OR

sudo sh mplayer_setup.sh

Changes to this version:
mplayer-1.0rc1try3-i486-2kjz.tgz is now being used, as mplayer-1.0rc1-i486-1goa.tgz is no longer available.
Instead of the essential codecs pack, I am now using the all-20071007.tar.bz2 package. It includes many more codecs; even apple.com trailers work for me now.
A number of dependencies / libraries are now downloaded and installed. Here is the list:

a52dec-0.7.4-i486-1kjz.tgz
libmpcdec-1.2.6-i486-1gds.tgz
libmpeg3-1.7-i486-1gds.tgz
avifile-20071003-i486-1gds.tgz
mpeg2dec-0.4.1-i486-1gds.tgz
faac-1.25-i486-1gds.tgz
faad2-2.6.1-i486-1wim.tgz
speex-1.2beta2-i486-1kjz.tgz
ffmpeg-20070622-i486-1kjz.tgz
twolame-0.3.10-i486-1kjz.tgz
jack-0.103.0-i486-1kjz.tgz
x264-20070722-i686-1mfb.tgz
lame-3.97-i486-1kjz.tgz
xmms-1.2.11-i486-1McD.tgz
libdv-1.0.0-i486-1gds.tgz
xvidcore-1.1.3-i486-1kjz.tgz
mplayerplug-in-3.45-i486-1kjz.tgz
libdc1394-1.2.2-i486-1gds.tgz

Notes:
The following error appears because Samba is not installed:

mplayer: error while loading shared libraries: libsmbclient.so.0: cannot open shared object file: No such file or directory
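When you hit an error like that, ldd will show which shared libraries a binary needs and which are missing; anything unresolved is reported as “not found” (a sketch; the path to the mplayer binary may differ on your system):

```shell
# List the shared libraries a binary links against; libraries the
# loader cannot find are printed as "not found"
# (/usr/bin/mplayer is an assumed install path)
ldd /usr/bin/mplayer | grep "not found"
```
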

If you run into any other problems, as always you can e-mail me (owen @ thelinuxblog.com) for help.
Because of all the changes, the line-by-line breakdown of the installer from the Slackware 12 – MPlayer Setup in 3 Easy Steps post is no longer accurate.

MPlayer Installer

Filed under: The Linux Blog News — TheLinuxBlog.com at 7:54 am on Monday, December 3, 2007

The current MPlayer Installer script that I uploaded to this site had a slight problem:

mplayer_setup.sh: line 9: rm-rf: command not found

I have fixed the spacing issue and re-uploaded the file.
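The error comes from the missing space: the shell reads rm-rf as a single command name instead of rm with the -rf flags. You can see the same failure from any shell (a harmless demonstration; no files are touched):

```shell
# "rm-rf" is one word, so the shell looks up a command literally
# named rm-rf and fails; "rm -rf" is rm plus its flags.
rm-rf /tmp/nonexistent 2>&1 | grep "not found"
```
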

Once I had done this, I tried reinstalling again just to make sure it works, and found out that it doesn’t work any more. The reason: the MPlayer package has been removed.

That being said, I am going to be working on a new version of the Slackware MPlayer Installer, but the first priority is to make the installer work. I will be swapping the essential codec pack for the all-codec pack bundle. Then I will work on making the installer more user friendly.

Hang tight, I should have it fixed soon. Once it is, I’ll make another post.

Fetching Online Data From Command Line

Filed under: Shell Script Sundays — TheLinuxBlog.com at 6:12 pm on Sunday, December 2, 2007

Shell Scripts can come in handy for processing or re-formatting data that is available from the web. There are lots of tools available to automate the fetching of pages instead of downloading each page individually.

The first two programs I’m demonstrating for fetching are links and lynx. They are both shell browsers, meaning that they need no graphical user interface to operate.

Curl is a program used to transfer data to or from a server. It supports many protocols, but for the purpose of this article I will only be showing HTTP.

The last method (shown in other blog posts) is wget, which also fetches files over many protocols. The difference between curl and wget is that curl by default dumps the data to stdout, whereas wget by default writes it to the remote filename.

Essentially the following do the exact same thing:

owen@linux-blog-:~$ lynx http://www.thelinuxblog.com -source > lynx-source.html
owen@linux-blog-:~$ links http://www.thelinuxblog.com -source > links-source.html
owen@linux-blog-:~$ curl http://www.thelinuxblog.com > curl.html

Apart from the shell browser interface, links and lynx also have some differences that may not be visible to the end user.
Both lynx and links can render the received HTML as formatted plain text with the -dump option. They each format it differently, so I would recommend using whichever one is easier for you to parse. Take the following:

owen@linux-blog-:~$ lynx -dump http://www.thelinuxblog.com > lynx-dump.html
owen@linux-blog-:~$ links -dump http://www.thelinuxblog.com > links-dump.html
owen@linux-blog-:~$ md5sum links-dump.html
8685d0beeb68c3b25fba20ca4209645e  links-dump.html
owen@linux-blog-:~$ md5sum lynx-dump.html
beb4f9042a236c6b773a1cd8027fe252  lynx-dump.html

The md5 sums show that the dumped output differs between the two browsers.

wget does the same thing (as curl, links -source and lynx -source), but will create a local file with the remote filename, like so:

owen@linux-blog-:~$ wget http://www.thelinuxblog.com
--17:51:21--  http://www.thelinuxblog.com/
=> `index.html'
Resolving www.thelinuxblog.com... 72.9.151.51
Connecting to www.thelinuxblog.com|72.9.151.51|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
[  <=>                                ] 41,045       162.48K/s
 
17:51:22 (162.33 KB/s) - `index.html' saved [41045]
 
owen@linux-blog-:~$ ls
index.html

Here is the result of md5sum on all of the files in the directory:

owen@linux-blog-:~$ for i in $(ls); do md5sum $i; done;
a791a9baff48dfda6eb85e0e6200f80f  curl.html
a791a9baff48dfda6eb85e0e6200f80f  index.html
8685d0beeb68c3b25fba20ca4209645e  links-dump.html
a791a9baff48dfda6eb85e0e6200f80f  links-source.html
beb4f9042a236c6b773a1cd8027fe252  lynx-dump.html
a791a9baff48dfda6eb85e0e6200f80f  lynx-source.html

Note: index.html is wget’s output.
Wherever the sums match, the output is the same.

What do I like to use?
Although all of the methods (excluding -dump) produce the same results, I personally like to use curl because I am familiar with the syntax. It handles variables, cookies, encryption and compression extremely well, and the user agent is easy to change. The final winning point for me is that it has a PHP extension, which is nice for avoiding system calls to the other tools.
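Changing the user agent, for instance, is just the -A flag (the user-agent string below is purely illustrative):

```shell
# Fetch a page while identifying as Firefox instead of curl
# (the user-agent string is just an example)
curl -A "Mozilla/5.0 (X11; Linux i686; rv:2.0.0.11) Firefox/2.0.0.11" \
    http://www.thelinuxblog.com > ua.html
```
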
