Slow downloads. 6 Mbit connection; it should take 2 hours.

What can be done? The download starts fine but then loses speed.

I’m asking my friends to help me download the ISO :s

TY for your attention. BD


binarydepth

binarydepth’s Profile: http://forums.opensuse.org/member.php?userid=80573
View this thread: http://forums.opensuse.org/showthread.php?t=498991

Hi
Find a mirror close to your location from the following URL and use
that.
http://mirrors.opensuse.org/list/13.1.html
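If you are on the command line, the chosen mirror URL can be handed straight to wget; the mirror host below is only a placeholder for one picked from the list, and the ISO path is illustrative:

```shell
# -c lets wget resume the file if the connection drops mid-download
wget -c http://<mirror-from-the-list>/distribution/13.1/iso/openSUSE-13.1-DVD-x86_64.iso
```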


Cheers Malcolm °¿° SUSE Knowledge Partner (Linux Counter #276890)
openSUSE 13.1 (Bottle) (x86_64) GNOME 3.10.1
If you find this post helpful and are logged into the web interface,
please show your appreciation and click on the star below… Thanks!

malcolmlewis’s Profile: http://forums.opensuse.org/member.php?userid=740
View this thread: http://forums.opensuse.org/showthread.php?t=498991

Hi
Sorry, thought you were after the openSUSE DVD. So it’s an image you
created on SUSE Studio? How big is the file?


Cheers Malcolm °¿°

malcolmlewis;2650123 Wrote:

Hi
Sorry, thought you were after the openSUSE DVD. So it’s an image you
created on SUSE Studio? How big is the file?

2 GB. I really need an ISO with everything :confused:


binarydepth


binarydepth;2650125 Wrote:

2 GB. I really need an ISO with everything :/
Hi
Try wget rather than a browser, e.g.:

Code:

wget -c https://susestudio.com/download/<lots_of_numbers_and_characters>/<your_image>


Else I guess their site is just running slow…


Cheers Malcolm °¿°

malcolmlewis;2650126 Wrote:

Hi
Try wget rather than a browser, e.g.:

Code:

wget -c https://susestudio.com/download/<lots_of_numbers_and_characters>/<your_image>

Else I guess their site is just running slow…

Facepalm… Thanks :slight_smile: hahhahaa


binarydepth


Code:

wget -r 1000 -T 300 https://susestudio.com/download/

I suppose this results in 1000 tries with a 5 min interval; if it fails
more than that then it’s impossible.
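For what it’s worth, GNU wget reads those flags differently: `-r` turns on recursive retrieval (it takes no number), the retry count is `--tries` (or `-t`), and `-T` is a timeout in seconds rather than a retry interval; the wait between retries is controlled with `--waitretry`. A corrected sketch with the same intent (the URL placeholder is as before):

```shell
# -c resume, --tries retry count, -T timeout in seconds,
# --waitretry max seconds to wait between retries (linear backoff)
wget -c --tries=1000 -T 300 --waitretry=300 https://susestudio.com/download/<your_image>
```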


binarydepth


On Thu, 26 Jun 2014 19:26:02 +0000, binarydepth wrote:

Code:

wget -r 1000 -T 300 https://susestudio.com/download/...

I suppose this results in 1000 tries with a 5 min interval; if it fails
more than that then it’s impossible.

FWIW, downloads here are not too bad; when I’m having trouble with a
larger download, I try using aria2 instead and do a parallel download
(aria2 does a good job of segmenting a larger download and retrieving it
in multiple parts, assembling it into the original as it goes -
regardless of protocol, generally).

You might give that a try. Something like:

Code:

aria2c --max-connection-per-server=4 --min-split-size=1M <url>

That should cause it to do 4 simultaneous downloads, and download the
file in 1 MB chunks.

There are other options that may help as well, and it can also be used to
restart an aborted download (see ‘-c’ in the help for details) so you
don’t have to start over every time the download fails.
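That restart-with-`-c` idea can be wrapped in a small retry loop; this is only a sketch, and the URL is a placeholder for your SUSE Studio download link:

```shell
#!/bin/sh
# Resume-and-retry wrapper: each attempt continues (-c) where the last stopped.
URL="https://susestudio.com/download/<your_image>"   # placeholder

n=0
until aria2c -c --max-connection-per-server=4 --min-split-size=1M "$URL"; do
    n=$((n + 1))
    [ "$n" -ge 10 ] && { echo "giving up after $n attempts"; exit 1; }
    sleep 30   # brief pause before resuming
done
```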

Jim


Jim Henderson, CNA6, CDE, CNI, LPIC-1, CLA10, CLP10
Novell/SUSE/NetIQ Knowledge Partner

Jim Henderson;2650947 Wrote:

FWIW, when I’m having trouble with a larger download, I try using aria2
instead and do a parallel download. Something like:

aria2c --max-connection-per-server=4 --min-split-size=1M <url>

[…]

I changed the command to:
Code:

wget -c -T 150 --tries=1000

That’s a great program. It’s the way to go when downloading large files.
:smiley:

Thanks!


binarydepth


It’s done now. I didn’t take note of the “try” number :p, sorry if you
are curious. Will be testing tomorrow at the latest.
BD


binarydepth


I used 12 connections with 10 MB pieces, racing it against wget, and
aria2 won. :stuck_out_tongue:

Code:

aria2c --max-connection-per-server=12 --min-split-size=10M

(1024*2)/(4*16) = 32 MB per split.

(1024*A)/(4*B), where A = file size in GB and B = max connections
allowed.

What do you think of that model?

Code:

aria2c --max-connection-per-server=16 --min-split-size=32M
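The model above can be checked with a tiny shell helper; `split_mb` is just a name made up for this sketch:

```shell
#!/bin/sh
# Split-size model: (1024 * A) / (4 * B) MB per split,
# where A = file size in GB and B = max connections allowed.
split_mb() {
    A=$1    # file size in GB
    B=$2    # max connections allowed
    echo $(( (1024 * A) / (4 * B) ))
}

split_mb 2 16    # 2 GB file, 16 connections -> 32 (MB per split)
```

With the numbers from the command above (a 2 GB file, 16 connections) this reproduces the 32 MB split size.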

Cheers :slight_smile:


binarydepth


On Fri, 27 Jun 2014 20:06:01 +0000, binarydepth wrote:

What do you think of that model?

Ultimately, I think if the model maxes out your connection speed, it’s a
good model. :slight_smile:

Jim



Jim Henderson;2651105 Wrote:

Ultimately, I think if the model maxes out your connection speed, it’s a
good model. :slight_smile:

Aria should tell you that if it doesn’t, don’t you think?

Of course many CLI users have enough common sense. I’m biased toward
precision, but aria2 would stop if the download had already finished.


binarydepth


On Sat, 28 Jun 2014 03:56:02 +0000, binarydepth wrote:

Aria should tell you that if it doesn’t, don’t you think?

It’s difficult to judge how much bandwidth is available. TCP doesn’t
work like that - it has some built-in dynamic throttling based on whether
or not a request is sent and a response isn’t received, but there’s no
built-in way for the software to ask “how fast is my connection?”. That’s
why (for example) torrent speed throttling by the client is inexact; it
operates by denying the inbound data a response, so the sender will
throttle back on how fast the data is being sent in order to cut down on
retransmissions.

It also depends on how many concurrent connections the server is
configured to permit overall and per client. You might want to open 12
connections to the server, but if the server only permits 4 per client,
then you’re not going to get an optimal speed over 12 connections.
Similarly, if the server is configured to limit the amount of outbound
data being sent to an individual connection or to a specific client,
that’s also a factor.

As are the network links between you and the server. I guarantee you
that if you have a 30 Mbps connection (as I do) and you connect to a
server that’s got 10 Mbps or 100 Mbps worth of bandwidth available to it,
but there’s a high-latency 56 Kbps link between you and the server,
you’re not going to max out your connection. :wink:

The same is true even if you have a full 30 Mbps between you and the
server, unless you have a dedicated connection to that server, because
other people are using that bandwidth as well.

It’s not such a simple problem to solve, because networks aren’t simply
constructed. :slight_smile:

Of course many CLI users have enough common sense. I’m biased toward
precision, but aria2 would stop if the download had already finished.

Naturally it would stop if the download was done - there would be no more
data to send. :wink:

Jim

