> 1. I'm trying to fetch a 1GB data file through netburn, and after some time I see the error below: "Out of memory". Have you ever faced the same issue? Please confirm.

Not surprising. If you use a 1GB file, netburn has to fit the entire 1GB file in RAM (multiple times, if you're using concurrency). Netburn is not really a sensible tool for use with files that large.

> 2. I have limited my network speed to 58 Mbps and initialized netburn to fetch the data using the default data rate. Surprisingly, my throughput is shown as 85 Mbps in netburn stats. Is this expected behavior?

Yes, because individual files are fetched at full rate. The rate limiting works by sleeping in between fetches, so that the overall rate over the length of the test meets the specification.
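To make that concrete, here is a minimal sketch of duty-cycle rate limiting as described above. This is an illustration of the idea only, not netburn's actual source; the function name and parameters are hypothetical.

```python
import time

def rate_limited_fetches(fetch, payload_bytes, target_bps, n_fetches):
    """Illustrative duty-cycle rate limiting (assumed behavior, not
    netburn's real code): each fetch runs unthrottled at full link
    speed, then we sleep so the *average* rate over fetch + sleep
    works out to target_bps."""
    durations = []
    for _ in range(n_fetches):
        start = time.monotonic()
        fetch()                          # runs at full line rate
        elapsed = time.monotonic() - start
        durations.append(elapsed)
        # How long one fetch would take if it averaged target_bps:
        ideal = payload_bytes * 8 / target_bps
        if elapsed < ideal:
            time.sleep(ideal - elapsed)  # pad the gap between fetches
    return durations
```

This is why a bandwidth meter sampling during a fetch can read well above the configured rate: the instantaneous transfer is at full link speed, and only the average across the whole test matches the target.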
I'm not sure what you're trying to accomplish, so it's hard to say whether netburn is simply the wrong tool for your particular task, or whether you'd be better off using it differently.