ftp directory checker



Guest SpaceCadet

does anyone know of an ftp client that logs onto a certain directory at given intervals, like every 10 mins, to check whether any changes have been made to the dir? I need to know because we sit here and throughout the day we always have to be around to log onto the ftp site and check whether any new files have been uploaded. It would be great if there were some kind of daemon that could do it automatically and then e-mail me a notification saying that files/folders have been added to the specified dir. Thanks!

Link to comment
Share on other sites

Some ideas:

 

http://isp-lists.isp-planet.com/isp-linux/...4/msg00848.html

http://www.linuxquestions.org/questions/ar...2003/07/3/73950

http://www.experts-exchange.com/Operating_...Q_20647308.html

 

The ftp script could be something like this (made-up hostname), fed through a here-document so the date actually expands:

 

ftp -n ftp.somewhere.com <<EOF > log.`date +%m%d%Y`
user username password
prompt
ls
quit
EOF

 

Not sure if it works, though.

 

The rest of the cron job could compare the old log with the new one and warn you when something changes. You could just sleep while waiting for a 'beep' .. lucky you. I wanna job like that!!!
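
Something like this in the cron job might do it (hypothetical file names and mail address, untested):

diff log.old log.new > /dev/null || echo "FTP dir changed" | mail -s "ftp change" spacecadet@example.com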

 

well .. this is just an idea :lol:

 

MOttS

Link to comment
Share on other sites

It could be very easy: just developing MOttS's idea a bit further, you could write this simple script (written on the fly, not tested, but I hope it is commented well enough to show how it would work):

 

#! /bin/bash

# checks changes on remote ftp dir.



FTPDIR=ftp://ftp.somewhere.com/path/to/dir



# File where previous listing was stored:

PREV_LIST=~/prevlist.log



# Temporary list

TEMP_LIST=~/.list.tmp



# function to retrieve the remote listing

getRemoteListing () {

   # dumps remote ftp listing to stdout.

   # If you want a particular output you'll need to process it before dumping

   # it out.

   FTP=${1} # Parameter 1 must be the ftp url.

   dumbstring='aaabbbccc_ddd*' # needs the '*'

   list=".listing"

   # we'll use wget; see its man page for how to provide user/passwd params.

   # URL quoted so the local shell doesn't expand the '*' (newer wget spells this option --no-remove-listing)
   wget -q --dont-remove-listing "${FTP}/${dumbstring}"

   if [ -f ${list} ]; then

       cat ${list} && rm ${list}

   else 

       echo "Error, I can't retrieve the listing"

       return 1 # error

   fi

}



compareListings () {

   # Parameter 1 must be the old listing

   # Parameter 2 is the new listing

   # Modify this function if you want more information (sizes, files removed...)

   newfiles=$(diff $1 $2 | sed -n '/^>/ {s|^> ||p;}')

   removedfiles=$(diff $1 $2 | sed -n '/^</ {s|^< ||p;}')

   # A bit of formated output

   if [ "${newfiles}" != "" ]; then

       echo "Files uploaded since last log:"

       echo "$newfiles" | awk '{printf "t"$NF"n"}'

   elif [ "${removedfiles}" != "" ]; then

       echo "Files deleted since last check:"

       echo "$removedfiles" | awk '{printf "t"$NF"n"}'

   else

       echo "No changes in remote dir since last log."

   fi

}



# Check whether this is the first run of the script: if no previous listing is

# logged, we'll create it and exit to wait until the next run of the script:

if [ ! -f ${PREV_LIST} ]; then

   getRemoteListing $FTPDIR > $PREV_LIST || exit 1

   echo "First run of the script, the listing has been stored on $PREV_LIST; exiting" >&2

   exit 0

fi



# Now let's proceed, first downloading the listing and then comparing and

# presenting the results.



getRemoteListing $FTPDIR > $TEMP_LIST || exit 1

compareListings $PREV_LIST $TEMP_LIST 



# let's save the listing for the next run of the script:

cp $TEMP_LIST $PREV_LIST && rm $TEMP_LIST

 

Just tell cron to run it every ten minutes; the output will be automatically sent to the cron job's owner via mail (check man cron for more info).
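
For instance, a crontab entry like this would do it (assuming the script is saved as /home/spacecadet/ftpcheck.sh and made executable):

*/10 * * * * /home/spacecadet/ftpcheck.sh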

 

HTH

Link to comment
Share on other sites

ARU ... You are the BA$H MASTER!!!

 

Your script works like crazy, man! I didn't know wget was able to list directories on a remote FTP site; I thought it was only for HTTP stuff. This is why I suggested the 'ftpscript.txt' thing. Whatever, it works!!!

 

Lucky SpaceCadet.

 

MOttS

Link to comment
Share on other sites

thanks guys! but that was just simple stuff ;)

 

About the wget trick: it's something I discovered a long time ago (bvc, you might remember my attempt to satisfy rpm dependencies from rpmfind.net on the old board). If you query wget with a non-matching string plus a '*', wget will retrieve the full ftp listing before attempting to download matches of the query (nothing else will be downloaded since wget won't find any match). The trick then is to save that listing for further use :mrgreen:
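
To see the trick in isolation, something like this (same made-up host as in the script) leaves the full listing behind in a .listing file:

wget -q --dont-remove-listing 'ftp://ftp.somewhere.com/path/to/dir/aaabbbccc_ddd*'
cat .listing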

Link to comment
Share on other sites

Guest SpaceCadet

Aru u da MAN! The only problem tho is that I don't know how to use a script.. LOL If you can package that into some kind of program for me I might be able to pay you though :) It would have to be able to check a certain DIR + all its sub-dirs for any traffic. It would need a basic interface, like:

 

+++++++++++++++++++

FTP ADDY: | | +

UserName: | | +

Password: | | +

DIR: | | +

E-Mail Addy: | | +

Scheduler: +

o Per Minute +

o Per Hour +

+++++++++++++++++++

 

Don't go nuts making it just yet.. we would have to discuss how much it would cost and I would have to get it authorized, but soon when I get paid I might even buy it myself. Anyway, toodles!

Link to comment
Share on other sites

SpaceCadet, on your Linux box, put Aru's script in your home directory (let's call it /home/SpaceCadet/Aruscript) and make sure it is executable (as specified by ab2ms; see the chmod line at the end of this post). Now open a console, become root (type 'su' and enter your password) and configure the system so that Aru's script is run every 10 minutes. To do so, edit the crontab (type 'crontab -e') and enter the following line into it:

*/10 * * * * /home/SpaceCadet/Aruscript

Now save it (press Esc and type ':wq'). Make sure crond is running by typing

service crond status

The output should be obvious if it is running. Now the only thing you have to do is to modify the script a little so that it emails you something if any new files have been uploaded. But you know Aru now .... :lol:
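
And in case the executable bit is the missing piece, that part is just (assuming the path above):

chmod +x /home/SpaceCadet/Aruscript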

 

HTH

 

MOttS

Link to comment
Share on other sites

Hey, I have a simpler idea. Instead of sending you an email when there is a new file (or one was deleted), why not just open Konqueror (or Mozilla or whatever) on the FTP site? This way the user would be warned in real time! No need to check mails!

 

So instead of

    if [ "${newfiles}" != "" ]; then 

        echo "Files uploaded since last log:" 

        echo "$newfiles" | awk '{printf "t"$NF"n"}' 

    elif [ "${removedfiles}" != "" ]; then 

        echo "Files deleted since last check:" 

        echo "$removedfiles" | awk '{printf "t"$NF"n"}' 

    else 

        echo "No changes in remote dir since last log." 

    fi

 

What about

    if [ "${newfiles}" != "" ]; then 

	 konqueror $FTPDIR &

    elif [ "${removedfiles}" != "" ]; then 

	 konqueror $FTPDIR &

    fi

 

The browser (Konq in my case) could even be put in a variable at the top of the script.
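
Something like this (untested):

BROWSER=konqueror   # or mozilla, etc., near the top of the script

and then the two konqueror lines above become

    $BROWSER "$FTPDIR" &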

 

.. my 2 cents since I'm unable to modify the script so that it sends mails automatically :-(

 

MOttS

Link to comment
Share on other sites

... I'm unable to modify the script so that it sends mails automatically :-(

 

Man, that's another tricky part ;) Since the script is run through cron, EVERY bit of output (stdout/stderr) will be sent to you via email (to the email address of the user who set up the cron job); just check it yourself, put the following entry in your crontab:

 

* * * * * (echo 'hello'; date)

 

It should email you the output every minute.

 

If it doesn't work the way it should, try putting a 'MAILTO=username' line at the top of your crontab.
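
For example (assuming your local username is spacecadet and the script path from MOttS's post):

MAILTO=spacecadet
*/10 * * * * /home/SpaceCadet/Aruscript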

 

SpaceCadet, you'll find more info about how cron works in the man pages: crontab(1), crontab(5), and cron(8).

 

Another way to send all the output to a mail address could be:

 

#! /bin/bash

temp_mail=~/.tempmail

mailaddress=aru@somewhere.org

# redirecting both stdout and stderr to the tempmail file

exec &> $temp_mail



[...PUT MY SCRIPT HERE...]



# after the script ends, send the result to $mailaddress

cat $temp_mail | mail -s "$0 output" $mailaddress

rm $temp_mail

 

or even better, trap all exits with a call to mail, so every single event will be notified to you... but it is silly to do it this way, since cron will do the job for you w/o further complications.
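
For completeness, the trap version would be something like this near the top of the script (untested, same hypothetical address), replacing the final cat/rm lines:

trap 'mail -s "$0 output" aru@somewhere.org < $temp_mail; rm -f $temp_mail' EXIT

It mails whatever ended up in $temp_mail no matter how the script exits.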

 

HTH

Link to comment
Share on other sites

Guest SpaceCadet

OK I will follow your instructions. Just so you know, I am not a programmer, so all that code means nothing to me except for the step-by-step instructions you have given so far :D One question though: what kind of file do I put that code into? Is it just a simple text file? So I copy it from the website and place it into what? And then what do I name the file? I dunno if I'm assuming that this kind of thing works like Windows, where in order for a script to be run it needs to be inside some kind of file, or if it's some setting somewhere saved in the terminal or what lol. As soon as I figure that part out I'm good :)

Link to comment
Share on other sites

OK I will follow your instructions. Just so you know, I am not a programmer, so all that code means nothing to me except for the step-by-step instructions you have given so far :D

I'm not a programmer either :lol: , bash scripting will become natural to you after a while of using the terminal ;)

 

One question though: what kind of file do I put that code into? Is it just a simple text file?
Yep, in Linux almost everything goes into a plain text file. It doesn't even need an extension, since it includes the shebang (the call to the interpreter in the first line of the script); but if you wish you can name it "somename.sh"

 

So I copy it from the website and place it into what?
Just copy and paste it into some text editor of your choice; I personally like vim.

 

And then what do I name the file?
Whatever name you wish, e.g. ftpchecker

 

Follow the instructions above from ab2m and MOttS on how to make it executable and how to set cron to run it at whatever interval you wish.
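
Assuming you name it ftpchecker and keep it in your home directory, the whole setup boils down to something like:

chmod +x /home/SpaceCadet/ftpchecker
crontab -e   # and add the line:  */10 * * * * /home/SpaceCadet/ftpchecker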

Link to comment
Share on other sites
