How to make a dump of a website (updated 2021-06-11)

Best version:

  wget -m https:// original

If you want to make a dump of a website, you can do:

  /usr/bin/wget -q --convert-links -E -m http://website.nu.com.mp3
  wget --progress=bar --convert-links -E -m ftp://ftp.studer.ch/Public/

Actually, to download a single page and all its requisites (even if they live
on separate hosts), and make sure the lot displays properly locally, this
author likes to use a few options in addition to -p:

  wget -E -H -k -K -p http:// /

  -E  adjust extensions (save files with a suffix matching their type, e.g. .html)
  -H  also download from foreign hosts
  -k  convert links so the local copy browses correctly
  -K  keep a backup of each file before its links are converted
  -p  download all page requisites (images, CSS, scripts)

PvdM version:

  wget -r -k -K -p http://

  -r  recursive download
  -k  convert links
  -K  back up files before converting them
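As a sketch, the mirror invocation above can be wrapped in a small helper script. The URL and output directory below are placeholders (not from this note), and the script only prints the command rather than running it, so it is safe to try anywhere:

```shell
#!/bin/sh
# Placeholder target and destination -- substitute your own.
url="https://example.com/"
outdir="site-dump"

# -m      mirror: recursion, timestamping, infinite depth
# -E      save files with an extension matching their content type
# -k      convert links so the copy displays properly locally
# -K      keep .orig backups of files before link conversion
# -P DIR  write the whole tree under DIR
cmd="wget -m -E -k -K -P $outdir $url"

# Echo instead of executing, so the sketch has no network side effects.
echo "$cmd"
```

To actually run the mirror, replace the final `echo "$cmd"` with `$cmd` (or invoke wget directly with those flags).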