Competition crowdsources blisteringly-fast software
TopCoder challenge helps immune system research
If you want a massive improvement in the software you use, the cheapest way to get it is to host a competition on TopCoder.
That seems to be at least one of the discoveries made when a group of research biologists staged a competition on the MIT-operated site. A two-week contest offering regular prizes of $US500 ended up costing the researchers just $US6,000, and yielded new – and hugely efficient and effective – software for analysing immune system genes.
The real-world problem presented by the researchers was to analyse the genes involved in producing antibodies and T-cell receptors – a decidedly non-trivial problem in genetic research. As Nature puts it:
“These genes are formed from dozens of modular DNA segments located throughout the genome, and they can be mixed and matched to yield trillions of unique proteins, each capable of recognizing a different pathogen or foreign molecule.”
With that kind of complexity, the problem is demanding on computing resources and software.
Hence the competition: the lead researcher, Eva Guinan (of the Dana-Farber Cancer Institute), and her collaborators asked TopCoder participants if they could do better: “The researchers offered TopCoder what they thought would be an impossible goal: to develop a predictive algorithm that was an order of magnitude better than either a custom solution developed by Arnaout [Ramy Arnaout of the Beth Israel Deaconess Medical Center] or the NIH’s standard approach (MegaBLAST)”.
The result was a huge success: 84 solutions were offered by entrants in the competition, 16 of them outperforming MegaBLAST.
The best-of-the-best was 970 times faster than either MegaBLAST or Arnaout’s software, which should go some way towards Guinan’s perfect world in which researchers could run this kind of analysis on their laptops instead of supercomputers.
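For scale, that 970-fold speedup can be restated as a percentage improvement. A minimal sketch of the arithmetic – the runtimes below are illustrative placeholders; only the 970× ratio comes from the article:

```python
# Illustrative runtimes: only their ratio (970x, from the article) matters.
megablast_time = 970.0  # relative runtime of MegaBLAST (arbitrary units)
winner_time = 1.0       # the winning TopCoder entry, same units

speedup = megablast_time / winner_time  # 970x faster
improvement_pct = (speedup - 1) * 100   # fractional gain as a percentage

print(f"{speedup:.0f}x faster, i.e. a {improvement_pct:,.0f}% improvement")
# → 970x faster, i.e. a 96,900% improvement
```

That 96,900 per cent figure, rounded, is where the "97,000%" in the comments below the story comes from.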
There were 733 participants in the competition, of whom 122 submitted code; 44 per cent of those were software professionals, and the rest were students at various levels.
For The Register, this is a killer observation: to make the problem accessible for the competition, “they had to first reframe the problem, translating it so that it could be accessible to individuals not trained in computational biology.”
In other words, if you ask the right question, you can get the right answer – remarkably cheaply. ®
Re: The arrogance
The competition is proof that they thought someone else could achieve what they had admittedly failed to do.
You'll also notice that what you quoted is a quote from the original article, not a quote from the scientist. The original article continues with a real quote from the scientist:
“This is a proof-of-concept demonstration that we can bring people together not only from different schools and different disciplines, but from entirely different economic sectors, to solve problems that are bigger than one person, department or institution.”
Re: All your IP rights are belong to...
The entire point is that you're creating an algorithm for someone else to 'exploit'. If you don't like the terms, don't invest your time.
Glad I don't work for the firm that developed the original software
Bit embarrassing to have your 'can't be improved' algorithm bested by 97,000 per cent, for $500 of code.
Where is the Alan Sugar 'You're Fired!' icon?