
Help with a script?



So here is how I want it to work... using BASH

 

Start the script...

 

1. fetch a file using wget

2. run another program to process the file I got from wget

3. run sftp to upload the result

 

Any suggestions and pointers?

 

I know how to run programs through BASH... What I want is: How to make the script wait until wget finishes, how to make it wait until the processing program finishes and how to get sftp to work with it. I know there is delay until the server asks for a password, and there are local and remote commands.
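A minimal sketch of those three steps, with every name (URL, filenames, host) a placeholder, not your real setup. Bash runs each foreground command to completion before starting the next, so no explicit waiting is needed:

```shell
#!/bin/bash
# All names here are placeholders: substitute your real URL, files, and host.

fetch()  { wget -O data.txt "http://example.com/data.txt"; }
munge()  { ./process-program data.txt result.txt; }
upload() { sftp user@host <<'SFTP'
put result.txt
quit
SFTP
}

# Guarded so this sketch can be run or sourced without touching the network;
# set RUN_PIPELINE=1 when you actually want the three steps to execute.
if [ -n "${RUN_PIPELINE:-}" ]; then
    fetch     # step 1: blocks until the download finishes (or fails)
    munge     # step 2: blocks until processing finishes
    upload    # step 3: sftp reads its commands from the here-document
fi
```

Each function body is a single foreground command, so calling them one after another already gives the "wait until finished" behaviour you asked about.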

 

Any help welcomed and appreciated


If you run all your programs (wget, the processing program, and sftp) in the foreground, there will be no problem at all: the script won't run anything else until the current process is finished.
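To see that sequencing in action, here is a tiny self-contained demonstration, with sleep standing in for a long-running program:

```shell
#!/bin/bash
# Foreground commands run strictly one after another: each line
# must finish before the next one starts. No '&', no 'wait'.
out=""
out="${out}first;"
sleep 0.2            # stands in for wget downloading
out="${out}second;"
sleep 0.2            # stands in for the processing program
out="${out}third"
echo "$out"          # prints: first;second;third
```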

 

If you give us more details on what you are trying to do, we'll be able to give more precise help :)


Moonchild, putting & at the end of each command sends it into the background, which is AGAINST my advice.

 

aru knows much more about it than me, but if I were doing it, I'd end up using &&, but that might be because I'm braindead.

 

I would also do it using && (with a backslash for line continuation) at the end of each command, which means: if and only if the previous command succeeded (return status 0), run the following command.
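An illustration of the && idea, using true and false as stand-ins for the real commands:

```shell
#!/bin/bash
# 'cmd1 && cmd2' runs cmd2 only if cmd1 exits with status 0.
# The trailing backslash merely continues the command onto the next line,
# so the chain reads like the script's step list.
# 'true' and 'false' stand in for wget, the processor, and sftp.

result=""
true && \
true && \
result="all steps ran"

false && \
result="this never happens"

echo "$result"   # prints: all steps ran
```

The second chain shows the payoff: once one step fails, everything after the && is skipped automatically.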

 

Note about wget: wget returns 0 when it downloads something and non-zero when it is unable to download, but that 'something' could be an error page. So I would run checks on the downloaded document to make sure it is right, instead of relying on a simple "&&".
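One way to sketch that check, assuming a hypothetical validation rule (non-empty, and not an HTML error page); adjust it to whatever your real data looks like:

```shell
#!/bin/bash
# wget may save *something* even when the server really sent an error page,
# so test the file's content, not just wget's exit status.

looks_valid() {
    # Accept the download only if it is non-empty and does not
    # look like an HTML error page.
    [ -s "$1" ] && ! grep -qi '<html' "$1"
}

# In the real script it would slot into the && chain (placeholder names):
#   wget -O data.txt "$URL" && looks_valid data.txt && ./process data.txt

# Local demonstration with throwaway files:
printf 'some,real,data\n'   > good.tmp
printf '<html>404</html>\n' > bad.tmp
looks_valid good.tmp && echo "good.tmp accepted"
looks_valid bad.tmp  || echo "bad.tmp rejected"
rm -f good.tmp bad.tmp
```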

 

The script I posted here does things pretty close to the ones you want, so maybe you'd like to take a look at it to get some ideas ;)


aru...

 

thanks for your reply... I will do the && trick...

 

I managed to get the script up and running (finally!!!)

 

Of course, I only have minimal error checking for all now but here is the idea...

 

1. Perl script on the web collects data...

 

2. A bash script:

a. Runs wget to retrieve data

b. If new data are found, back up the old marker file and create a new one

c. Render a new image

d. Run an expect script to connect with sftp to the server:

i. Use an SFTP batch file to update necessary things

e. exit

 

3. A cron job to run the bash script every 2 hours, every day.
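For step 3, a crontab entry along these lines would do it (script path and log file are hypothetical; edit with crontab -e):

```shell
# m  h   dom mon dow  command
0  */2  *   *   *    /home/user/bin/update-map.sh >> /home/user/update-map.log 2>&1
```

As an aside on step 2d: if you set up key-based SSH authentication, sftp can read its commands non-interactively from a batch file with `sftp -b batchfile user@host`, which sidesteps the expect/password dance entirely.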

 

Wow me! I don't believe I did this!

 

Anyone want to check what it does?

 

http://students.washington.edu/michalis/ge...aphia/guestMap/

 

This is an early alpha version of the website, so please don't do any stress testing.

