Scientists shun Web 2.0

Catch 22.0

SXSW Science publishers' efforts to have the research community sup the Web 2.0 Kool-Aid have failed, and scientists have given a resounding thumbs down to a gamut of crowd-tapping initiatives, showgoers at SXSW heard on Saturday.

A panel of science web publishers said scientists had consistently shunned wikis, tagging, and social networks, and have even proved reluctant to leave comments on web pages.

The refusenik stance presents a puzzle, because the arguments for Web 2.0 services are arguably more compelling for science than for trivia - the biggest Web 2.0 market to date. The science game gave the world peer review, after all, and scientists have often lauded and contributed to Wikipedia, despite its well-documented eccentricities and flaws.

BioMed Central boss Matt Cockerill invoked the example of the SWISS-PROT database to illustrate the value scientists could extract from greater online collaboration. The database is the hand-curated gold standard for protein sequence information, but at current rates the backlog of proteins constantly being turned up by automated research techniques would take SWISS-PROT's curators thousands of years to annotate. Convincing the research community to enter the information wiki-style, make the links to other proteins, and document their function would speed matters up considerably.

Digg-style bookmarking could work as a short cut to maximising the impact of scientists' work, too. The impact factor of research papers has hitherto been measured by how many later articles cite them; a painfully slow drip which takes years to build up.

The penetration problem seems to stem from the extremely competitive and rigorous funding process. Research projects have to justify every penny and minute spent by their scientists, presenting a catch-22 for Web 2.0 as a tool for science. Researchers won't use the tools until the tools have proved their worth, but the tools are worthless unless researchers use them.

It's a conundrum that makes science a notoriously conservative market for publishers. Nature's head of web publishing Timo Hannay confessed that of the firm's myriad Web 2.0 projects, only a couple bring in any revenue.

Perhaps publishers' experience with Web 2.0 is not destined to be so different after all. ®
