solarian Posted April 28, 2006
Hi! I want to download a whole website so that I can browse it offline later. Is there a wget command that does this, or perhaps some other software? Thanks!
ianw1974 Posted April 28, 2006
I think this might work:

wget -r http://url_of_website/

I got this from an example in the wget manpage. I'm testing it now to see whether it works, and it seems to be.
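For offline browsing specifically, the wget manpage also documents -k (--convert-links), which rewrites links in the downloaded pages so they point at the local copies, and -p (--page-requisites), which fetches the images, stylesheets, and other files a page needs to display. A rough sketch combining them, using a placeholder URL:

wget -r -k -p http://example.com/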
solarian Posted April 28, 2006
How did I manage to miss that?! Thanks, Ian! :)
jboy Posted April 28, 2006
Terrific tip, Ian. It even downloaded the associated CSS, JavaScript, and image files, and with the --convert-links option it converted the links in the HTML files to local links for offline viewing. Great tip!
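For larger sites, the manpage also lists options for limiting how deep the recursion goes and for going easy on the server. A sketch with a placeholder URL and arbitrarily chosen values:

wget -r -l 3 -k -p -np --wait=1 --limit-rate=200k http://example.com/

Here -l 3 caps the recursion depth, -np (--no-parent) stops wget from climbing above the starting directory, and --wait / --limit-rate space out and throttle the requests.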
solarian Posted April 28, 2006
"it converted the links in the HTML files to local links for offline viewing."
Nah, it didn't. The links on that site were simply written the way any good web designer writes them: as paths only, i.e. /texts/articles/song.htm rather than http://website.com/texts/articles/song.htm.
ianw1974 Posted April 28, 2006
Glad it worked! :phew: