How to make a dump of a website

From www.ReeltoReel.nl Wiki

Update 2021-06-11, best version:

wget -m https://
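Here -m is the --mirror shorthand: per the GNU Wget manual it turns on recursion and timestamping, sets infinite recursion depth, and keeps FTP directory listings. A dry-run sketch that only prints the equivalent long form (the URL is left incomplete as in the original; substitute the site you want to mirror):

```shell
# -m / --mirror is currently equivalent to these options (GNU Wget manual)
mirror_opts="-r -N -l inf --no-remove-listing"
# Dry run: print the expanded command instead of fetching anything
echo "wget $mirror_opts https://"
```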

Original version:

If you want to make a dump of a website, you can do:

/usr/bin/wget -q --convert-links -E -m http://website.nu.com.mp3
wget --progress=bar --convert-links -E -m ftp://ftp.studer.ch/Public/

Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to -p:

wget -E -H -k -K -p http://<site>/<document>

-E adjust extension: save downloaded HTML pages with a .html suffix

-H span hosts: also download requisites from foreign hosts

-k convert links in the downloaded pages for local viewing

-K keep the original of every converted file as a .orig backup

-p download all page requisites (images, stylesheets, and so on)
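The same command spelled out with long options may be easier to read later; a dry-run sketch (`<site>` and `<document>` are placeholders, as above):

```shell
# Long-option equivalents of -E -H -k -K -p (GNU Wget)
opts="--adjust-extension --span-hosts --convert-links --backup-converted --page-requisites"
# Dry run: show the full command; substitute a real URL before running
echo "wget $opts http://<site>/<document>"
```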

PvdM version:

wget -r -k -K -p http://

-r recursive download

-k convert links for local viewing

-K keep the original of every converted file as a .orig backup

-p download all page requisites
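None of the recipes above throttle the download, which can hammer a live server. GNU Wget supports pausing between requests (--wait) and capping bandwidth (--limit-rate); a sketch combining the PvdM options with throttling, where the wait and rate values are illustrative choices, not from the original:

```shell
# PvdM options plus server-friendly throttling (values are illustrative)
polite_opts="-r -k -K -p --wait=1 --limit-rate=200k"
# Dry run: print the command; the URL is left incomplete as in the original
echo "wget $polite_opts http://"
```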