wget:
Pros:
- Simple to use for downloading files
- Can download files recursively from a directory
- Supports resuming interrupted downloads
- Can download files in the background
- Supports HTTP, HTTPS, and FTP protocols
Cons:
- Limited to file downloading tasks
- No support for uploading files
- Less flexible compared to curl
curl:
Pros:
- More versatile than wget
- Supports a wide range of protocols (HTTP, HTTPS, FTP, SFTP, SCP, LDAP, etc.)
- Can upload and download files
- Supports other HTTP methods (POST, PUT, DELETE, etc.)
- Can be used for API testing and interaction
- Supports cookies, authentication, and custom headers
Cons:
- More complex command-line options
- No built-in support for downloading files recursively
curl examples:
- GET:
curl https://api.example.com/data
- POST:
curl -X POST -H "Content-Type: application/json" -d '{"key": "value"}' https://api.example.com/data
- Save output to file:
curl -o output.txt https://api.example.com/data
- Basic authentication:
curl -u username:password https://api.example.com/secure-data
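The GET and save-to-file forms above can be tried without a real server by pointing curl at a local file:// URL (the /tmp paths below are throwaway stand-ins for a remote resource):

```shell
# create a small local file to stand in for a remote resource
printf 'hello\n' > /tmp/curl-demo.txt
# plain GET: print the body to stdout
curl -s file:///tmp/curl-demo.txt
# -o saves the body to a file instead of printing it
curl -s -o /tmp/curl-demo-out.txt file:///tmp/curl-demo.txt
cat /tmp/curl-demo-out.txt
```

Swapping the file:// URL for an https:// one gives the exact commands shown above.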
- Download and run a script:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
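Piping straight into a shell executes code sight unseen; a safer pattern is to download to a file, inspect it, then run it. A minimal sketch, using a locally written stand-in script (the file names are made up) so it runs offline:

```shell
# write a stand-in for the remote install script (so this sketch runs offline)
printf 'echo hello from script\n' > /tmp/fake-install.sh
# step 1: download to a file instead of piping into a shell
curl -fsSL -o /tmp/install.sh file:///tmp/fake-install.sh
# step 2: inspect it (e.g. with less /tmp/install.sh), then run it
sh /tmp/install.sh
```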
- Download every PDF linked from a page:
curl https://example.com/documents | grep -oP 'https://example.com/documents/\K[^"]+\.pdf' | xargs -I {} curl -O https://example.com/documents/{}
Keep in mind that this approach may not work for every website, especially sites that load content with JavaScript or use complex URL structures. In those cases, wget's recursive mode or a web scraping library in a language like Python or JavaScript is a better fit.
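The link-extraction step of that pipeline can be tried on its own against a made-up HTML snippet (this is what the grep stage sees after curl fetches the page; \K requires GNU grep's -P mode):

```shell
# a stand-in for the page HTML that curl would fetch
html='<a href="https://example.com/documents/a.pdf">A</a>
<a href="https://example.com/documents/b.pdf">B</a>'
# \K drops the matched prefix, leaving only the file names for xargs
printf '%s\n' "$html" | grep -oP 'https://example.com/documents/\K[^"]+\.pdf'
```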
wget examples:
- GET:
wget https://api.example.com/data
- Save output to file:
wget -O newfile.txt https://example.com/file.txt
- Save output to dir:
wget -P /path/to/directory https://example.com/file.txt
- Basic authentication:
wget --user=username --password=password https://example.com/secure-file.txt
- To download multiple files:
wget -i urls.txt
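The urls.txt file is just one URL per line. A sketch (the URLs are placeholders, so the wget call itself is left as a comment):

```shell
# one URL per line
cat > urls.txt <<'EOF'
https://example.com/file1.txt
https://example.com/file2.txt
EOF
wc -l < urls.txt
# then fetch them all: wget -i urls.txt
```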
- To download all files of a specific type from a website, combine -r (recursive download) with -A (accept only the given file suffix):
wget -r -A .pdf https://example.com/documents