Wget stops downloading files at 2GB.


rudxai

New Around Here
Hey guys,
I've been experiencing a problem lately.
Whenever I try to download large files (e.g. 30GB) with wget to a USB-connected external HDD with an NTFS-formatted partition, the download always stops at 2GB. Only if I then restart the download will it continue downloading the rest of it.

For example:
1. cd /mnt/data/stuff/
2. wget -bqc http://blahblahdotcom/file.mkv
3. Notice after a while that wget is no longer running and a file of only 2GB is present (see the wget-log check after this list).
4. wget -bqc http://blahblahdotcom/file.mkv
5. File downloads completely.
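Since -b runs wget in the background and writes its output to wget-log in the working directory, the length wget believed it was downloading can be checked after the fact; a grep along these lines is one way to do it:
Rich (BB code):
# grep Length: /mnt/data/stuff/wget-log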

Using an AC86U running the latest Merlin image.

I didn't have this issue before. What happened!?
 
What filesystem is on that disk?
 
@RMerlin
@rudxai
Also.... USB doesn't equal NAS.

NTFS shouldn't have this sort of issue, but wasting time and bandwidth on 2GB chunks without ever getting the whole file seems like a waste of resources. I've used some creative means to grab files in the past, and wget wasn't one of them for large files; there are some scrapers out there that work better.
 
Much to my surprise, the OP is correct. He mentioned that he was using NTFS, but the problem is the same with ext4.

The issue seems to be that wget is mis-detecting the file size as 2147483647 bytes (0x7FFFFFFF), which is the maximum value of a signed 32-bit integer and so points to a 32-bit overflow somewhere.

Rich (BB code):
# wget -c https://speedtest.wtnet.de/files/5000mb.bin
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '/root/.wget-hsts'. HSTS will be disabled.
--2022-03-31 19:13:23--  https://speedtest.wtnet.de/files/5000mb.bin
Resolving speedtest.wtnet.de... 213.209.106.95, 2a02:2028:ff00::f9:2
Connecting to speedtest.wtnet.de|213.209.106.95|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2147483647 (2.0G) [application/octet-stream]
Saving to: '5000mb.bin'

5000mb.bin                              100%[=============================================================================>]   2.00G  9.16MB/s    in 4m 15s

2022-03-31 19:17:38 (8.04 MB/s) - '5000mb.bin' saved [2147483647/2147483647]

# ls -l 5000mb.bin
-rw-rw-rw-    1 admin    root     2147483647 Mar  8 02:12 5000mb.bin
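
The resumed request further down reports the real length of 5120000000 bytes, so the server is advertising the correct size; a header-only request such as this (assuming the server answers HEAD requests) is a quick way to confirm that the truncation happens in wget's parsing rather than on the server:
Rich (BB code):
# curl -sI https://speedtest.wtnet.de/files/5000mb.bin | grep -i content-length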
 
@ColinTaylor

Now that's interesting. Definitely a router issue.

 
Googling "2147483647 wget" returns some interesting comments, not that I fully understand them. It seems wget may need an update.
 
A quick Google suggests that this is/was a known issue with wget 1.21.1.

What is strange is that, as the OP said, when issuing the same command again it correctly detects the size.
Rich (BB code):
# wget -c https://speedtest.wtnet.de/files/5000mb.bin
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '/root/.wget-hsts'. HSTS will be disabled.
--2022-03-31 19:43:22--  https://speedtest.wtnet.de/files/5000mb.bin
Resolving speedtest.wtnet.de... 213.209.106.95, 2a02:2028:ff00::f9:2
Connecting to speedtest.wtnet.de|213.209.106.95|:443... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 5120000000 (4.8G), 2972516353 (2.8G) remaining [application/octet-stream]
Saving to: '5000mb.bin'

5000mb.bin                              100%[++++++++++++++++++++++++++++++++=============================================>]   4.77G  9.55MB/s    in 5m 0s

2022-03-31 19:48:22 (9.46 MB/s) - '5000mb.bin' saved [5120000000/5120000000]

EDIT: Using the --ignore-length parameter seems to work. It makes wget disregard the Content-Length header entirely and simply read until the server closes the connection.
Rich (BB code):
# rm 5000mb.bin
# wget --ignore-length https://speedtest.wtnet.de/files/5000mb.bin
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '/root/.wget-hsts'. HSTS will be disabled.
--2022-03-31 19:52:33--  https://speedtest.wtnet.de/files/5000mb.bin
Resolving speedtest.wtnet.de... 213.209.106.95, 2a02:2028:ff00::f9:2
Connecting to speedtest.wtnet.de|213.209.106.95|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: ignored [application/octet-stream]
Saving to: '5000mb.bin'

5000mb.bin                                  [                          <=>                                                 ]   4.77G  12.6MB/s    in 10m 6s

2022-03-31 20:02:39 (8.06 MB/s) - '5000mb.bin' saved [5120000000]
 
Yeah, I didn't have this issue a couple of firmware updates ago. With the exact same setup I have now, I used to flawlessly download 60GB single files.
I also found those comments about an outdated wget version, but I wanted to ask here regardless.
I will try the --ignore-length parameter and let you know.

Thanks
 
UPDATE:
OK, so I tried with curl and had no problem (which narrowed it down to a wget issue),
and then I tried wget with the --ignore-length option and that worked as well!
So I am covered.
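
For reference, the curl equivalent was along these lines (the exact flags are illustrative; -O keeps the remote filename and -C - resumes a partial download):
Rich (BB code):
# curl -O -C - http://blahblahdotcom/file.mkv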

Thank you all for helping.
 
Yeah, I didn't have this issue a couple of firmware updates ago.
That would indicate a new bug introduced in a wget update, in that case. I'd have to see if there's a newer version with that bug fixed.
 
There is an interesting comment back in wget 1.20.3:

/* Pick a strtol-compatible function that will work with wgint. The
choices are strtol, strtoll, or our own implementation of strtoll
in cmpt.c, activated with NEED_STRTOLL. */

In 1.21 and 1.21.1 it uses strtol.
In 1.21.2 and 1.21.3 it uses strtoll.

1.21.3 seems to be the current release as of Feb 26, 2022.

This was in the changelog between the 1.21.1 (Jan 9, 2021) and 1.21.2 (Sep 7, 2021) releases:
2021-04-11 Tim Rühsen <tim.ruehsen@gmx.de>

* src/wget.h: Use strtoll() for str_to_wgint
This fixes a regression reported at https://savannah.gnu.org/bugs/?60353.

Reported-by: Michal Ruprich

bug #60353: On 32-bit arm, wget is unable to download a file bigger than 2GiB
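
To make the failure mode concrete, here is a minimal C sketch (not wget's actual code) of what the str_to_wgint change is about: on a 32-bit build, where long is 32 bits, strtol() overflows on a Content-Length like 5120000000 and clamps it to LONG_MAX (2147483647), exactly the truncated size seen above, while strtoll() parses it correctly.
Rich (BB code):
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* The real Content-Length of 5000mb.bin from the earlier test. */
    const char *content_length = "5120000000";

    /* On 32-bit ARM, long is 32 bits, so strtol() overflows,
       sets errno to ERANGE and returns LONG_MAX = 2147483647. */
    long truncated = strtol(content_length, NULL, 10);

    /* long long is 64 bits, so strtoll() returns the real size. */
    long long correct = strtoll(content_length, NULL, 10);

    printf("strtol : %ld\n", truncated);
    printf("strtoll: %lld\n", correct);
    return 0;
}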
 
