
Fedora servers are that big

Download Your Way

Letters Yesterday's article on Fedora's FC6 download travails prompted a torrent of emails, not all abusive.

Much to our surprise we stirred up something of an ontological debate, as readers attempted to answer that age-old question: how big is big? Choice cuts below.

The Fedora project has apparently been subject to a sustained DDoS attack. That someone decided to attack on the scheduled release date is unlikely to be a coincidence, and was probably done in the hope of getting Fedora some bad press. Well done for falling for it and supporting hackers! Jonathan Larmour

Umm...always happy to oblige, Jonathan.


Well, they did say that BitTorrent was the preferred method of downloading FC6, and sure enough that method worked: the ISOs were sitting waiting for me this morning. FWIW, the site breakage also stopped most of the update tools, like yum and Yumex, from getting any Fedora updates at all; and, as ever, there is a whole collection of day-zero updates for FC6, thanks to the extended package freeze before the release date. Yours enthusiastically, Brian

The Red Hat/Fedora download sites were having issues before FC6 was announced. I was helping a friend with a dedicated server install early last week, adding some software he needed, and the tool that talks to the Fedora servers for lists of current software (yum) was timing out on roughly three out of four attempts in any given hour.

My conspiracy theory: a competitor bought some zombie boxes and is DDoSing fedora judiciously.

More realistic: I didn't check but I suspect that maybe the FC6 downloads were available (in some fashion) as early as last week ... and the enthusiasts were being so damned enthusiastic.

Jacob L E Blain Christen


To keep things even more interesting, existing users of FC5 have not been able to update their servers for 2 days now :)

Wayne


Red Hat's 'low availability OS' - hardly.

I hit the torrent yesterday, got it in a couple of seconds (the .torrent file is less than 100 KB), and the ISO came roaring down the line at 600 KB/sec, about as fast as it ever gets for me. There were over 5,000 users in the swarm I was part of.

Anyone (especially geeks) downloading a 3.25 GB file using HTTP needs a bitchslap, not a faster RedHat pipe. Torrents are very much the right way to do it.

6.5 TB an hour, by the way, is a *very* big chunk of bandwidth in anyone's book (2,000 × 3.25 GB) - it's 1.8 GB a second (14.4 Gbit/sec) before you consider any overheads (which are significant with HTTP), other page loads, resends for people with dodgier links, etc.

Cut them a little slack, please: your headlines are sometimes in danger of heading from friendly irreverence over the line into Daily Mail (spit) "how could this happen?" territory, and that would be a shame, as I usually like your style and the info I get from you.

Dan

Consider us bitchslapped. But the Daily Mail? We didn't mention immigrants once. I've double-checked.
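
For anyone reaching for a calculator, Dan's envelope maths does check out. Here's a quick sanity check in Python (his figures: 2,000 downloads of a 3.25 GB ISO completing within an hour; the HTTP overhead he mentions is ignored):

    # Dan's back-of-envelope: 2,000 copies of a 3.25 GB ISO shifted in one hour.
    downloads = 2000
    iso_gb = 3.25

    total_tb = downloads * iso_gb / 1000    # ~6.5 TB moved in the hour
    gb_per_s = downloads * iso_gb / 3600    # ~1.8 GB/s sustained
    gbit_per_s = gb_per_s * 8               # ~14.4 Gbit/s on the wire

    print(f"{total_tb:.1f} TB/hour = {gb_per_s:.1f} GB/s = {gbit_per_s:.1f} Gbit/s")
    # -> 6.5 TB/hour = 1.8 GB/s = 14.4 Gbit/s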


Don't be misinformed. Linux can handle unreal web loads on one machine. Red Hat just wasn't set up to scale downloads to that level. The world's busiest single machine/server is one (1) Linux box. :)

"1.4 The Numbers At the time of writing (May 2005), ftp.heanet.ie; can sustain at least

  • 27,000 simultaneous HTTP downloads
  • can saturate a gigabit interface at the same time
  • has roughly 3.2 million downloads per day
  • ships about 3.5 Terabytes of content per day
  • mirrors over 50,000 projects
  • as 3.7 Terabytes of unique content
  • has circa 6 million unique files ... (on IDE hard drives!)

    http://www.stdlib.net/~colmmacc/Apachecon-EU2005/scaling-apache-handout.pdf

    anon-o-mouse


Well, big is always just a matter of perspective. The thing to bear in mind is that it's 10,000 copies of (probably) a DVD ISO, 3.5 or 4 GB depending on what your favorite flavor is. So that's about 37 TB of data downloaded over the five hours. To manage such bandwidth, the other open source tool out this week, Firefox, would need 7.5 million downloads over the same time period; or the page your news story was served on would need to serve up 1.2 billion page views (at 30 KB a page - I realise there's CSS and images, but any devoted Reg reader probably has them cached ;-) )

To keep up with that kind of data rate they'd need, what, 17 Gbit of outgoing bandwidth, so even to have managed what they did they'd need more than a Unix box or two, I'd have thought, or at least more than two network adaptors. And all of this ignores their normal traffic, and the related traffic (people browsing the Fedora pages about the download, and then browsing other stuff on Red Hat's network while they wait for the download). Not that I disagree with your general feeling that they could have done better, but it's not exactly a small challenge.

Rob

NB: these numbers are fabrications of my brain and a piece of paper, and may be horribly inaccurate, or possibly even accurate. Take nothing for granted; I accept no responsibility for what these numbers may choose to do on their night off.
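
For the record, Rob's sums reconstruct like this in Python (3.75 GB splits the difference in his 3.5-to-4 GB guess; the ~5 MB Firefox installer is our assumption, inferred from his 7.5 million figure):

    # Rob's envelope maths: 10,000 DVD ISOs pushed out over five hours.
    copies = 10_000
    iso_gb = 3.75                              # between his 3.5 and 4 GB estimates

    total_tb = copies * iso_gb / 1000          # ~37.5 TB in total
    gbit_per_s = total_tb * 8000 / (5 * 3600)  # ~16.7 Gbit/s sustained

    # His equivalences: Firefox at ~5 MB a copy (our guess), a Reg page at ~30 KB.
    firefox = total_tb * 1e6 / 5               # ~7.5 million downloads
    pages = total_tb * 1e9 / 30                # ~1.25 billion page views

    print(f"{total_tb:.1f} TB, {gbit_per_s:.1f} Gbit/s, "
          f"{firefox/1e6:.1f}m Firefox pulls, {pages/1e9:.2f}bn page views")
    # -> 37.5 TB, 16.7 Gbit/s, 7.5m Firefox pulls, 1.25bn page views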

®
