Monday, May 23, 2016

How-to: create a large file downloader with a resume feature

If you need to create a downloader program for a single large file with resume support, here is a solution:
WGET + WINRAR
This is the strategy: use wget to download the file, and use WinRAR to package wget and its dependencies into a self-extracting archive that runs wget after extraction.
  1. Download wget and its dependencies from here. GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
    This is THE tool if you need to download big files.
  2. Copy the files into a directory, e.g. “Install”.
  3. Install WinRAR.
  4. Create a new WinRAR archive with these files and the following settings:
  5. [screenshot of the archive settings]
  6. Click the SFX icon to create a self-extracting archive and click OK (a command-line equivalent is sketched after this list):
    [screenshot]
  7. Change the Setup command so that wget knows which file to download (see the example command after this list):
    [screenshot]
    1. the -c parameter continues getting a partially-downloaded file;
    2. the -O parameter (uppercase!) sets the path and the file name of the downloaded file.
  8. Run the program and the download will start:
    [screenshot]
  9. If you close and re-launch the program, the download will resume gracefully.
  10. Use WinRAR and not 7-Zip for the self-extracting executable, because 7-Zip doesn’t support Windows environment variables like “%USERPROFILE%”.
    This is important if you need to control where the file is downloaded.
  11. This solution works well if you need to download really huge files from the web, such as virtual machine images or big archives.
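
As a concrete example of steps 7 and 10 together, the command entered as the Setup command could look like the line below. The URL, the file name and the destination folder are placeholders to adapt; only the -c and -O options are the ones discussed above.

    wget.exe -c -O "%USERPROFILE%\Downloads\bigfile.iso" http://example.com/bigfile.iso

-c resumes a partially-downloaded file, -O writes it to the given path, and %USERPROFILE% makes the destination follow the profile folder of whoever runs the downloader.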
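
If you prefer to script steps 4-6 instead of clicking through the WinRAR GUI, a sketch like the following should also work; Downloader.exe and sfx-comment.txt are placeholder names, -sfx adds the self-extracting module, and -z takes the archive comment (which is where WinRAR stores the SFX script, including the Setup command) from a text file:

    WinRAR.exe a -sfx -z"sfx-comment.txt" Downloader.exe wget.exe *.dll

In sfx-comment.txt you would put the SFX commands, typically a Path= line for the extraction folder and a Setup= line containing the wget command shown above.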
Hope it helps!
