Using the wget command - Linux

Download just one file
wget https://www.site.com/images/my_image.jpg


Specify the folder where the downloaded file should be saved
wget -P /home/user https://www.site.com/images/my_image.jpg


Save the file with a different name
wget -O another-name.jpg https://www.site.com/images/my_image.jpg


Ignore SSL certificate errors:

wget --no-check-certificate https://www.site.com

This disables certificate verification entirely, so use it only for hosts you already trust.


Limit the bandwidth
wget --limit-rate=20k http://www.site.com


Limit the rate and run it in the background
wget -cb --limit-rate=25K http://www.site.com/name.of.file
-c continues any interrupted download from where it left off. -b starts wget in the background (the PID is printed at startup, and output is written to wget-log in the current directory). --limit-rate limits the speed or bandwidth available to wget.


Resume interrupted download
wget -c https://www.site.com/images/my_image.jpg


Download files in a batch (from a file that lists all the URLs)
wget -i files-to-download.txt
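The list file is just plain text with one URL per line. A minimal sketch (the file name matches the command above; the URLs are placeholders):

```shell
# Create a URL list, one entry per line (placeholder URLs)
cat > files-to-download.txt <<'EOF'
https://www.site.com/images/img1.jpg
https://www.site.com/images/img2.jpg
EOF

# Then fetch everything in the list in one run:
# wget -i files-to-download.txt
```

Blank lines in the list are ignored, and -i can be combined with the other options shown here, such as -c or --limit-rate.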


Create a local copy of a site for offline browsing
wget -m -k -E -p -np https://www.site.com/

-m Makes a mirror copy of the site
-k Converts all links to relative ones, so they work locally
-E Adds extensions to extensionless files; sites with "pretty" URIs often serve pages without a file extension
-p Downloads all page requisites such as JS and CSS files, without which the site will not render properly
-np No parent: only descends into subfolders and never follows links to parent folders, which is handy when you want to download just a portion of the site

