Downloading files from the Internet is a common, everyday task on any computer. Users who stick to the basics download files through a graphical interface, usually the one built into the browser, while slightly more experienced users turn to download managers, still with a graphical interface. But what if we are working on a server? Or what if we simply prefer the terminal? Can we download a file from the command line? The answer is yes, and that is why we will explain the wget command to you.
GNU wget is a CLI (command-line interface) utility that lets you download files from the Internet as long as you know their links. It is very light on system resources, so its use goes practically unnoticed. In addition, it allows you to quickly rename the files it downloads.
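For example, renaming a file as it is downloaded is done with the -O option. Here is a minimal sketch; the file name and URL below are only placeholders:
:~$ wget -O renamed-file.zip https://osradar.com/file1.zip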
Although wget runs in the terminal, it is quite a complete program. Here is a list of the application's main features:
- Wget supports downloads through proxies (see the example after this list).
- IPv6 is fully supported by wget.
- It allows limiting the bandwidth used for downloads.
- Wget works with SSL/TLS to secure downloads.
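As a quick, hedged sketch of the proxy support: wget honors the standard http_proxy and https_proxy environment variables, so a download through a proxy can look like this (the proxy address and URL are only placeholders):
:~$ https_proxy=http://proxy.example.com:8080 wget https://osradar.com/file1.zip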
Finally, wget is available for many UNIX-like systems and also for Windows. And since it is developed by the GNU project, we are obviously talking about an open-source application.
Using the Wget Command
Now we will explain its basic use. Certainly, wget is very simple to use. The first thing to know is that almost all Linux distributions ship it by default. If yours does not, just install it with your distribution's package manager; it will be in the official repositories.
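For instance, you can quickly check whether it is already there, and install it if it is not. The installation line below assumes a Debian- or Ubuntu-based distribution; adapt it to your package manager:
:~$ wget --version
:~$ sudo apt install wget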
The most basic way to use the command is as follows:
:~$ wget [file_link]
This will download the file into the directory we are currently in.
A huge advantage of wget is that it can download multiple files whose links are listed in a text file. For example:
:~$ nano files.txt
https://osradar.com/file1.zip
https://osradar.com/file2.tar
https://osradar.com/file3.mp3
To make wget download them all, just add the -i option and indicate the path of the text file with the links; if the file is in the current directory, the name alone is enough. Wget will fetch the files one after the other.
:~$ wget -i [text_file_path]
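With the files.txt we created above, the call is simply:
:~$ wget -i files.txt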
More Options for the Wget Command
The -c option is quite useful. If the download is interrupted for any reason, we can resume it from the point it had reached. For example, if we are downloading a 100 MB file and the download is interrupted at 80 MB, we do not have to start over; we just run the command again with -c.
:~$ wget -c [Link]
As I said before, wget can limit bandwidth usage. We define this with the --limit-rate option and assign it a value. For example:
:~$ wget --limit-rate=700K [link]
With this, the download speed will not exceed 700 KB per second.
If the download requires a user name and password, we can also specify them:
:~$ wget --http-user=[user] --http-password=[password] http://osradar.com/hello.mp4
Obviously, hello.mp4 is just an example; replace it with the link to your file.
By default, wget makes 20 attempts to establish the connection and start the download. If your Internet connection is unreliable, you can increase that number with the -t option.
:~$ wget -t 60 [link]
Wget also allows downloads via FTP with the same syntax as above. If a user name and password are required, you can specify them as follows:
:~$ wget --ftp-user=[user] --ftp-password=[password] ftp://osradar.com/example.tar
Finally, wget can also work in the background. For this, there is the -b option.
:~$ wget -b [link]
This is useful when downloading large files. It should be noted that all the options mentioned in this post can be combined.
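As a hedged sketch of combining them, the following command resumes an interrupted download, caps the speed, and sends the job to the background in one call (the URL is a placeholder). When running with -b, wget writes its progress to a wget-log file in the current directory, which you can follow with tail:
:~$ wget -b -c --limit-rate=700K https://osradar.com/file1.zip
:~$ tail -f wget-log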
And that is it.
Conclusion
Using servers requires knowledge of many different commands. Today you have learned the basics of the wget command, which lets you download files to a computer from the terminal. It is also useful if you simply prefer the terminal or work on low-resource machines.