Download an entire page (including CSS, JS, images) for offline reading, archiving… using wget

If you ever need to download an entire website, perhaps for offline viewing, wget can do the job. For example:

```sh
wget \
    --recursive \
    --no-clobber \
    --page-requisites \
    --html-extension \
    --convert-links \
    --restrict-file-names=windows \
    --domains website.org \
    --no-parent \
    www.website.org/tutorials/html/
```

This command downloads the www.website.org/tutorials/html/ section of the site into a local directory tree that mirrors its structure.
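For quick one-offs, the same command can be written with wget's short options; this is equivalent to the long form above (`--restrict-file-names` has no short form):

```sh
# Short-option equivalent:
# -r = --recursive, -nc = --no-clobber, -p = --page-requisites,
# -E = --html-extension, -k = --convert-links, -D = --domains, -np = --no-parent
wget -r -nc -p -E -k --restrict-file-names=windows -D website.org -np www.website.org/tutorials/html/
```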

The options are:

- `--recursive`: follow links and download the entire website, not just the starting page
- `--domains website.org`: don't follow links outside website.org
- `--no-parent`: don't follow links above the starting directory, tutorials/html/
- `--page-requisites`: get all the elements that compose each page (images, CSS, and so on)
- `--html-extension`: save files with the .html extension (newer wget releases call this `--adjust-extension`)
- `--convert-links`: rewrite links so that they work locally, offline
- `--restrict-file-names=windows`: modify filenames so that they also work on Windows
- `--no-clobber`: don't overwrite existing files (useful when an interrupted download is resumed)
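If you want to crawl politely and avoid hammering the server, wget's throttling options combine well with the flags above. A minimal sketch; the specific wait time and rate cap here are illustrative choices, not part of the original recipe:

```sh
# Politer variant: pause between requests and cap bandwidth.
# --wait=2 sleeps 2 seconds between retrievals, --random-wait varies that
# delay, and --limit-rate=200k caps download speed at ~200 KB/s.
wget --recursive --no-clobber --page-requisites --html-extension \
     --convert-links --restrict-file-names=windows \
     --domains website.org --no-parent \
     --wait=2 --random-wait --limit-rate=200k \
     www.website.org/tutorials/html/
```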

Source: http://www.linuxjournal.com/content/downloading-entire-web-site-wget
