How to make a dump of a website

From www.ReeltoReel.nl Wiki
Revision as of 18:28, 9 September 2017

If you want to make a dump of a website, you can do:

/usr/bin/wget -q --convert-links -E -m http://website.nu.com.mp3
wget --progress=bar --convert-links -E -m ftp://ftp.studer.ch/Public/

-q quiet, suppress output

-m mirror: shorthand for -r -N -l inf --no-remove-listing

--convert-links convert links so the copy is viewable locally

-E adjust extension, save HTML/CSS files with a matching suffix


To download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to -p:

wget -E -H -k -K -p http://<site>/<document>

-E adjust extension, save HTML/CSS files with a matching .html/.css suffix

-H also download from foreign hosts (span hosts)

-k convert links so the copy is viewable locally

-K keep a backup of each file before it is converted (saved with a .orig suffix)

-p download all page requisites
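The single-page recipe above can be wrapped in a small helper. This is a sketch: the function name and the DRY_RUN switch are illustrative additions, not part of the original recipe; the flags are exactly those listed above.

```shell
# save_page: sketch of a wrapper around the single-page wget recipe above.
# -E adjust extension, -H span hosts, -k convert links,
# -K keep .orig backups, -p fetch all page requisites.
save_page() {
    if [ -n "$DRY_RUN" ]; then
        # print the command instead of running it, so it can be inspected
        echo wget -E -H -k -K -p "$1"
    else
        wget -E -H -k -K -p "$1"
    fi
}

# example (dry run): show the invocation without fetching anything
DRY_RUN=1 save_page http://example.com
# prints: wget -E -H -k -K -p http://example.com
```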

PvdM version:

wget -r -k -K -p http://

-r recursive download

-k convert links so the copy is viewable locally

-K keep a backup of each file before it is converted (saved with a .orig suffix)

-p download all page requisites
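The PvdM version can likewise be sketched as a helper. The function name, the default destination directory, and the use of wget's -P (--directory-prefix) option are illustrative additions; the core flags are the ones listed above.

```shell
# mirror_site: sketch of a wrapper around the "PvdM version" above.
# -r recursive, -k convert links, -K keep .orig backups, -p page requisites.
mirror_site() {
    url="$1"
    dest="${2:-./mirror}"   # -P puts the download under this directory
    if [ -n "$DRY_RUN" ]; then
        # print the command instead of running it
        echo wget -r -k -K -p -P "$dest" "$url"
    else
        wget -r -k -K -p -P "$dest" "$url"
    fi
}

# example (dry run): show the invocation without fetching anything
DRY_RUN=1 mirror_site ftp://ftp.studer.ch/Public/
# prints: wget -r -k -K -p -P ./mirror ftp://ftp.studer.ch/Public/
```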