
scheduled downloads


Recommended Posts

Hello!

 

I have a remote webserver and a home computer.

On request, my webserver generates a sql.gz backup file for each individual database on it.

How can I set something up in Mandrake (software or a script) so that, at scheduled times, my home computer logs in to my webserver with a username and password and downloads the sql.gz file? (The URL is the same every time for a given database.)

It would be good if the program automatically renamed the sql file by adding a date to it, or at least overwrote the existing backup file.

 

Is there a program I can use to do this?

 

 

thanx!

 

p.s. I'm not a programmer so I don't know how to code it myself.

 

[moved from Software by spinynorman]

Link to comment
Share on other sites

So, you need to perform the following steps:

 

1. Log in to the server (This is the missing part)

2. Download the file

3. Rename the file, adding the current date.

4. Do this at a set interval

 

To tackle these in reverse order, step 4 can be accomplished using cron - you add a line to the crontab file that says what to run and when. See http://www.scrounge.org/linux/cron.html for more info. What you would run is a bash script, containing steps 1 to 3.
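As a concrete sketch, the crontab entry below would run a backup script every night at 3 a.m. (the path /home/you/backup.sh is hypothetical; edit your personal crontab with crontab -e):

```
# min  hour  day-of-month  month  day-of-week  command
0      3     *             *      *            /home/you/backup.sh
```

The five time fields come first, then the command to run as your own user.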

 

Step 3 is certainly possible using Linux's shell scripts. You can get the current date and time as a variable and rename a file. In a bash script, you can do something like this...

 

FILENAME="yourfile"    # your file's local filename
DATE=$(date +%Y%m%d)   # e.g. 20050217 (%Y is the calendar year; %G is the ISO week-year and can differ around New Year)
mv "$FILENAME" "$FILENAME$DATE"

 

The cryptic DATE syntax is something I cribbed off the web and returns the current date in this format: 20050217 (year 2005, month 02, day 17). So if the original filename is "foo", then the renamed file will be "foo20050217". http://archive.lug.boulder.co.us/Week-of-M...011/028455.html
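To see the rename step in action, here is a self-contained sketch that uses a dummy file (the name "foo" is purely illustrative):

```shell
#!/bin/sh
# Create a dummy file standing in for the downloaded backup
touch foo
# Build a date stamp like 20050217
DATE=$(date +%Y%m%d)
# Rename the file, appending the date stamp
mv foo "foo$DATE"
ls "foo$DATE"
```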

 

For step 2, the program "wget" allows you to get a file from an HTTP server. See http://www.linuxforum.com/man/wget.1.php which is basically the man page.

 

However, I can't immediately see a way to perform step 1 - log in to a web server. Proxy server, yes, wget handles that nicely, but I presume you mean something more like the "login" page here at MandrakeUsers.org.

 

If you can get at the same files via ftp instead, then wget can submit a username and password to an ftp server easily too.
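For example, wget accepts FTP credentials embedded in the URL, so steps 1 to 3 could be combined in one script. This is only a sketch: the server name, path, and credentials are placeholders, and a password stored in a script should be protected (make the file readable by you alone, e.g. chmod 700).

```shell
#!/bin/bash
# All values below are placeholders - substitute your own
URL="ftp://username:password@www.example.com/backups/mydatabase.sql.gz"
FILENAME="mydatabase.sql.gz"
DATE=$(date +%Y%m%d)

# Download the backup, then date-stamp it only if the download succeeded
wget "$URL" && mv "$FILENAME" "$FILENAME$DATE"
```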

 

Anyone else have any ideas?

 

(Disclaimer: All of this is just off the top of my head and untested.)

Link to comment
Share on other sites

If you're not bothered about changing the file name and simply want to overwrite the existing backup, rsync is the simple answer.

rsync -au -e ssh remote_server:/directory/ /destination_directory

Set it up as a scheduled cron job and that's it done!

Link to comment
Share on other sites

I would use cron (/etc/crontab) to schedule the running of wget, followed by the running of mv.

I wouldn't do it like that - how long after wget would you schedule the mv for? Since wget is a network operation, it is subject to the vagaries of bandwidth and could take a long time to complete.

 

I suppose you could "mv" the OLD version, to rename it, then start wget afterwards. That would be less error-prone.
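That "rename the old copy first" ordering can be sketched like this, with the download step stubbed out so the example stays self-contained (the file name backup.sql.gz is hypothetical):

```shell
#!/bin/sh
BACKUP=backup.sql.gz
touch "$BACKUP"   # stand-in for an existing backup from last time

# Date-stamp the OLD copy first...
if [ -f "$BACKUP" ]; then
    mv "$BACKUP" "$BACKUP.$(date +%Y%m%d)"
fi
# ...then the wget download of the new copy would run here
```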

 

I'd still go with a shell script though. With a shell script, you can do the following:

 

- Know that all the items will execute in sequence

- Keep it in your own folder, owned by you, for easy editing.

- Have access to all the shell scripting goodness for when you want to do something more sophisticated with it.

 

Just IMHO, of course...

Link to comment
Share on other sites


I meant like this:

 wget ftp://server/file && mv file file$(date +%Y%m%d)

That way it won't move until the download is finished ;)
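That is the key property of &&: the right-hand command runs only if the left-hand one exits successfully. A quick demonstration with stand-in commands:

```shell
#!/bin/sh
# 'false' fails, so the command after && never runs
false && echo "never printed"
# 'true' succeeds, so this one does run
true && echo "download ok, renaming now"
```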

 

But I now agree with anon that rsync is the way to go.

Link to comment
Share on other sites

How about ncftpget?

 

#!/bin/bash

FILENAME=<<remotefile>>
DATE=$(date +%Y%m%d)
# ncftpget takes the remote host, then the local directory, then the remote file
ncftpget -u <<username>> -p <<password>> <<remotehost>> /home/solarian "$FILENAME" && mv "/home/solarian/$FILENAME" "/home/solarian/$FILENAME$DATE"

 

If you want to delete the remote file after a successful download:

ncftpget -DD <<rest of the stuff>>

Link to comment
Share on other sites
