When diving into the world of Linux, one tool you will likely encounter is wget. This small yet powerful command-line tool is indispensable for anyone looking to perform non-interactive downloading of files from the internet. In this blog post, we will unravel the mysteries of wget, showcasing what it does and providing a few practical examples of its usage.
What is wget?
wget stands for "World Wide Web get". It's a free utility available for most UNIX-like systems (including Linux and macOS) that allows users to download files from the internet. Unlike web browsers, which require a graphical user interface, wget can download files in the background, even when no user is logged in. This makes it particularly useful for downloading large files, mirroring websites, and scripting automated downloads.
Key Features of wget
- Non-Interactive Downloading: As mentioned, wget can download files without user intervention.
- Support for Various Protocols: wget supports HTTP, HTTPS, and FTP protocols.
- Recursive Download: With the right options, wget can follow links and download entire websites.
- Resume Downloads: If a download gets interrupted, wget can pick up where it left off.
- Limit Bandwidth: You can set the download rate, ensuring you don't hog all the bandwidth.
- Proxy Support: If you're behind a proxy server, wget can still get the job done.
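To expand on the proxy support mentioned above: wget honors the standard proxy environment variables, so setting them is enough to route its requests through a proxy. A minimal sketch, where the proxy address is a placeholder:

```shell
# Point wget at a proxy via the standard environment variables
# (proxy host and port here are placeholders):
export http_proxy="http://proxy.example.com:3128"
export https_proxy="$http_proxy"
# With these set, an ordinary invocation goes through the proxy:
# wget https://www.example.com/file.pdf
echo "$https_proxy"
```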
Examples of wget in Action
Downloading a Single File
Downloading a single file is straightforward. For instance, to download a sample PDF file:
wget https://www.example.com/sample.pdf
Specify Output Filename
If you wish to save the downloaded file with a different name, use the -O option.
wget -O mydocument.pdf https://www.example.com/sample.pdf
Downloading Multiple Files
Create a text file containing multiple URLs, each on a new line. Use wget to download them all.
wget -i url-list.txt
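The list file is just plain text with one URL per line. A quick sketch of building one (the URLs are placeholders) and handing it to -i:

```shell
# Create a URL list, one address per line (these URLs are placeholders):
cat > url-list.txt <<'EOF'
https://www.example.com/file1.pdf
https://www.example.com/file2.pdf
EOF
# wget -i url-list.txt   # would download each listed URL in turn
wc -l < url-list.txt     # confirm the list has two entries
```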
Recursive Website Download
To download an entire website, use the -r (recursive) flag. However, do this responsibly to avoid overloading servers.
wget -r https://www.example-website.com/
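A bare -r will follow every link as fast as it can, which is exactly how servers get overloaded. A more considerate sketch, printed rather than run since the domain is a placeholder; every flag shown is a standard wget option:

```shell
# A politer recursive crawl:
#   -l 2      recurse at most two levels deep
#   --wait=1  pause one second between requests
#   -np       never ascend to the parent directory
cmd="wget -r -l 2 --wait=1 -np https://www.example-website.com/"
echo "$cmd"
```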
Limiting Bandwidth
If you don't want wget to use all available bandwidth, use the --limit-rate option.
wget --limit-rate=200k https://www.example.com/largefile.zip
Resuming Interrupted Downloads
If a download is interrupted, use the -c flag to continue.
wget -c https://www.example.com/largefile.zip
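Under the hood, -c compares the size of the partial local file with the remote file and requests only the missing bytes. A small simulation of the local side (the file name and contents are placeholders):

```shell
# Simulate a partially downloaded file:
printf 'first half' > largefile.zip
# wget -c https://www.example.com/largefile.zip   # would resume from here
wc -c < largefile.zip   # the byte offset wget -c would resume from
```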
The wget command is an invaluable tool in the Linux ecosystem for downloading files. Its non-interactive nature and wide array of features make it versatile for many scenarios. Whether you're just fetching a single file, mirroring an entire site, or managing downloads in scripts, wget has you covered.
For a deeper dive, you can always refer to the man page by typing man wget in the terminal. Remember, with great power comes great responsibility, so always use tools like wget in a respectful and ethical manner.