How to make a dump of a website
Latest revision as of 15:28, 11 June 2021
Update 2021-06-11, best version:
wget -m https://
Original
If you want to make a dump of a website, you can do:
/usr/bin/wget -q --convert-links -E -m http://website.nu.com.mp3
wget --progress=bar --convert-links -E -m ftp://ftp.studer.ch/Public/
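For reference, the -m (--mirror) switch in the commands above is shorthand: per the wget manual it turns on recursion and time-stamping with infinite depth. A minimal sketch of the equivalent long form, printed as a dry run against a placeholder URL (not taken from this note):

```shell
# -m / --mirror is equivalent to: -r -N -l inf --no-remove-listing
# (recursive, time-stamping, infinite depth, keep FTP listings).
# https://www.example.com/ is a placeholder target, not a real one.
url="https://www.example.com/"
cmd="wget -r -N -l inf --no-remove-listing --convert-links -E $url"
# Print instead of running, so nothing hits the network.
echo "$cmd"
```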
Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to -p:
wget -E -H -k -K -p http://<site>/<document>
-E adjust extension (save files with a matching .html/.css suffix)
-H also download from foreign hosts
-k convert links to point at the local copies
-K keep a .orig backup of each file before converting it
-p download all page requisites
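With hypothetical values substituted for the &lt;site&gt; and &lt;document&gt; placeholders, the full invocation can be sketched as a dry run:

```shell
# Placeholders for <site> and <document>; substitute real values.
site="www.example.com"
doc="index.html"
# -E fix extensions, -H span foreign hosts, -k convert links,
# -K keep .orig backups, -p grab all page requisites.
cmd="wget -E -H -k -K -p http://$site/$doc"
# Dry run: print the command instead of hitting the network.
echo "$cmd"
```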
PvdM version:
wget -r -k -K -p http://
-r recursive retrieval
-k convert links for local viewing
-K back up each file as .orig before converting it
-p download all page requisites
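The PvdM variant leaves the URL blank; with a placeholder filled in, and printed rather than run, it can be sketched as:

```shell
# Placeholder URL; the original line leaves the target blank.
url="https://www.example.com/"
# -r recurse, -k convert links, -K back up originals, -p page requisites.
cmd="wget -r -k -K -p $url"
# Dry run: show the command that would be executed.
echo "$cmd"
```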