Problem: wget (version < 1.10) cannot download files larger than 2GB.
Background, context, limitations: I have a server at home running Debian Sarge with low bandwidth to the internet and a big hard disk. I want to use this computer as a downloading machine, since its being up 24/7 means that the low bandwidth does not really matter (I am not in a hurry).
The Sarge release of Debian has GNU wget version 1.9.1, which is not capable of downloading large (>2GB) files.
Possible solutions: upgrade wget to version 1.10, which can download large files. That can be done in at least two different ways:
1. add testing or unstable to the repositories that the server searches for installable packages. Drawback: wget might pull in newer versions of essential libraries, which have not been tested together with the binaries in Sarge. I want a stable server; that's why it runs stable.
2. make a custom backport by compiling wget 1.10 on the Sarge system (sketched below). Drawback: requires some manual installation of the packages needed for the compilation, and the compilation itself takes CPU time (the server is a 486 DX 33 with 40 MB RAM).
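For reference, a rough sketch of what the backport in option 2 could look like, assuming a deb-src line for a newer Debian suite has been added to /etc/apt/sources.list (the exact package filename in the last step depends on the version and architecture that get built):

$ apt-get update
$ apt-get build-dep wget       # install the packages needed to build wget on Sarge
$ apt-get -b source wget       # fetch the newer wget source and compile it
$ dpkg -i wget_1.10*_i386.deb  # install the resulting .deb

On the 486 that compilation is exactly the CPU-time cost mentioned above, which is why I looked for something less disruptive.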
Better, less disruptive solution: Let another box that also runs 24/7 and has wget 1.10 installed do the download, and save the file on my server. This solution makes use of sshfs and requires nothing of my server.
Implementation:
Mount a directory of the target server over sshfs on the computer that has wget 1.10 installed. Run wget on that box and instruct it to save into the sshfs-mounted directory.
$ sshfs code.cjb.net: mnt/
$ wget -c --limit-rate=28k -O mnt/debconf6-dvd-1.0.1-en-1-pal.iso -nd http://meetings-archive.debian.net/pub/debian-meetings/2006/debconf6/dvd/pal/debconf6-dvd-1.0.1-en-1-pal.iso
Here I include --limit-rate=28k since I didn't want the download to take up all the bandwidth.
If I want to temporarily pause the download, I can use kill -STOP and kill -CONT on the wget process. The "-c" option also makes it possible to terminate the wget process and continue the download at a later point.
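A minimal sketch of the pause/resume handling, assuming this wget is the only one running on the box (otherwise pidof will match more than one process):

$ kill -STOP $(pidof wget)  # suspend the download
$ kill -CONT $(pidof wget)  # resume it again
# If wget is terminated instead of just stopped, re-running the same
# "wget -c ..." command continues from the bytes already saved in mnt/.

When the download is finished, the mount can be released with fusermount -u mnt/.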