Google and the End of Science

Bringing it all back Hume

WiReD magazine's editor-in-chief Chris Anderson has just seen the end of scientific theories. And it is called Google.

This remarkable revelation was triggered by Google's research director Peter Norvig who, speaking at O'Reilly's Emerging Technology Conference in March, claimed: "All models are wrong, and increasingly you can succeed without them" - a reference to Google's success at linking web pages with users. Anderson has generalized that idea to science as a whole in a piece titled The End of Theory: The Data Deluge Makes the Scientific Method Obsolete:

"This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves."

Anderson contends that the same applies for all science - its models are inherently of limited value. Either they are wrong, for example they "caricature... a more complex underlying reality" (quantum mechanics), or we don't know how to prove them experimentally (string theories about the universe), or they raise more questions than they answer (epigenetics in biology).

Yet increasing computing power, both in hardware and in statistical analysis algorithms, can still bring forth useful correlations and interesting new discoveries. Anderson cites Craig Venter's DNA sequencing: having finished with individual organisms, "in 2005 he started sequencing the air. In the process, he discovered thousands of previously unknown species of bacteria and other life-forms."

"The opportunity is great", he adds, because "correlation supersedes causation, and science can advance even without coherent models, unified theories, or really any mechanistic explanation at all."

Over at Ars Technica, John Timmer evinces shock: "I can't possibly imagine how he comes to that conclusion."

He objects: "Correlations are a way of catching a scientist's attention, but the models and mechanisms that explain them are how we make the predictions that not only advance science, but generate practical applications."
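Timmer's distinction is easy to demonstrate. Two series with no causal or mechanistic connection at all can exhibit strong correlation, which is precisely why correlation alone only "catches a scientist's attention". A minimal sketch (not from either article, and assuming only NumPy) compares pairs of independent random walks, which correlate spuriously because both wander, against the plain independent noise they are built from, which does not:

```python
import numpy as np

rng = np.random.default_rng(0)

def abs_corr(x, y):
    """Absolute Pearson correlation between two series."""
    return abs(np.corrcoef(x, y)[0, 1])

n, trials = 500, 200
walk_corrs, noise_corrs = [], []
for _ in range(trials):
    # Two independent noise series: no causal link whatsoever.
    a, b = rng.normal(size=(2, n))
    noise_corrs.append(abs_corr(a, b))
    # Their cumulative sums are independent random walks, yet they
    # routinely show large sample correlations (spurious correlation).
    walk_corrs.append(abs_corr(np.cumsum(a), np.cumsum(b)))

print(f"mean |corr|, independent noise:        {np.mean(noise_corrs):.2f}")
print(f"mean |corr|, independent random walks: {np.mean(walk_corrs):.2f}")
```

Without a model of the data-generating process - here, the knowledge that both series are trending random walks - the strong correlations in the second case would look like a discovery. The "mechanism" Timmer appeals to is exactly what rules them out.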

The advancement of science is not itself at issue, but the actual examples Timmer counters with do not seem to convince even Timmer himself.

The royal road would be to demonstrate that models are crucial to science, which would be grounds for thinking that they are logically necessary. Timmer takes the short cut on pragmatic grounds: models have utility, regardless of their truth or falsity. Models, so to speak, make the scientific establishment go around.

"Would Anderson be willing to help test a drug that was based on a poorly understood correlation pulled out of a datamine?" Timmer challenges, apparently unembarrassed to be caught in flagrante making an ad hominem argument. Of course not, which is why we test on guinea pigs. (And why should Anderson be first?)

But if anything, this is a reason Anderson could use. With sufficiently good correlations, it might finally be possible to spare guinea pigs, chimpanzees, and rats the ordeal of laboratory testing.

The irony here is that eudemonic theories of ethics - those which hold that the good and right thing to do is to create happiness, such as hedonism (for me) or utilitarianism (for all of us) - are philosophically shakier than statistical inference. Anderson's contention is that technology is changing; and on inductive grounds - the same grounds that promise the sun will rise tomorrow - the outlook for models and mechanisms seems rather less sunny than that of tomorrow's dawn.

A closer shave with history

From an initial pass over the arguments for and against Anderson's "end of theory" claim, it seems that several theories about the justification of science might also have to be added to his hit-list. This is what makes Anderson's argument interesting - an analogue, perhaps, of the "end of history" claim made by Francis Fukuyama in his book of the same name.

How could it happen that Occam's Razor, the (ahem) eponymous principle that explanations should not rely on unnecessary entities, has grown so big that it now threatens to sever the hand that once so securely held it - the hand of scientific practice?

Before addressing that, we should be aware of a slippery complexity - semantics. It is not only Google that "washes" meanings, as The Register's Andrew Orlowski noticed.

The term "model" at one time connoted a physical representation, in both scientific and ordinary contexts - for example, of an atom. It seems now to be used in science to cover a wider range of things: not only the virtual counterpart of physical models (computer modelling and simulation), but any explanatory matrix in which two concepts are mediated by other concepts. Pushed this far, it can be difficult to draw the line between a model and an explanation. And between hypothesis, conjecture, theory, and mechanism. Hold that thought as you read on.
