satelliteuser083 Posted January 20, 2008

I'd like to download an entire website for offline perusal. I have, in fact, found many apps for doing this (e.g. Web Dumper, A1 Website Download), but unfortunately they are all for Windoze. :sad: Can anyone help me out? Thanks.
skyhawk Posted January 20, 2008

Taken from the wget man page: Wget can follow links in HTML and XHTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded HTML files to the local files for offline viewing.
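For example, a command along these lines should fetch a site recursively and rewrite its links for offline viewing (http://example.com/ is just a placeholder for whatever site you're after):

wget --recursive --page-requisites --convert-links --no-parent http://example.com/

--page-requisites also pulls in the images and stylesheets each page needs, --convert-links rewrites the links so they work locally, and --no-parent keeps wget from wandering above the starting directory.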
scarecrow Posted January 20, 2008

There are MANY spider software packages for Linux... an old and trusty one (it also has a Firefox plugin and a KDE frontend) is httrack.
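If you go that route, a typical invocation looks something like this (the URL and output directory are placeholders):

httrack http://example.com/ -O /home/user/mirror

The -O option tells httrack where to put the mirrored copy, its logs, and its cache.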
Greg2 Posted January 20, 2008

I also suggest installing and using httrack, with or without the GUI (khttrack), from the repos. wget with the mirror option will also work; however, I would suggest using the limit-rate option with it so you don't overtax the server's bandwidth (that's not very friendly to do). It would be:

wget -m --limit-rate='amount' url-of-website

The 'amount' can be given in kilobytes (k) or megabytes (m).
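For instance, something like this (the rate and URL here are only illustrative):

wget -m --limit-rate=50k http://example.com/

That mirrors the site while keeping the transfer to roughly 50 kilobytes per second, which is much kinder to the server.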