Google search primed for 'Caffeine' injection

A shot in the back-end

Google has completed testing on "Caffeine," a semi-mysterious overhaul of its back-end search infrastructure, and it will soon roll the new platform out behind its live search engine.

In mid-August, Google unveiled an online sandbox where it invited world+dog to test the new infrastructure, but as noticed by Mashable.com, the sandbox has been replaced by a brief message from the Mountain View Chocolate Factory.

"Based on the success we've seen, we believe Caffeine is ready for a larger audience," Google's missive reads. "Soon we will activate Caffeine more widely, beginning with one data center. This sandbox is no longer necessary and has been retired, but we appreciate the testing and positive input that webmasters and publishers have given."

Previously, über-Googler Matt Cutts told The Reg that the new infrastructure was under test in a single data center - though he declined to say which one. A Google spokesman indicates that Caffeine will now be moved to a second data center for live deployment, adding that this will happen "over the next few months."

In typical Google fashion, the company has been coy about the design of Caffeine. But Matt Cutts acknowledged that it's built atop a complete revamp of the company's custom-built Google File System (GFS). Two years in the making, the new file system is known, at least informally, as GFS2.

"There are a lot of technologies that are under the hood within Caffeine, and one of the things that Caffeine relies on is next-generation storage," Cutts said. "Caffeine certainly does make use of the so-called GFS2." Caffeine includes other fresh additions to Google's famously distributed infrastructure, but Cutts declined to describe them.

Speaking with The Reg, Matt Cutts described Caffeine as an overhaul of Google's search indexing system. "Caffeine is a fundamental re-architecting of how our indexing system works," he said. "It's larger than a revamp. It's more along the lines of a rewrite. And it's really great. It gives us a lot more flexibility, a lot more power. The ability to index more documents. Indexing speed - that is, how quickly you can put a document through our indexing system and make it searchable - is much, much better."

Building a search index is an epic number-crunching exercise. Today, Google handles the task using its proprietary Google File System, which stores the data, in tandem with a distributed technology called MapReduce, which crunches it. But these tools aren't limited to indexing: they underpin myriad other Google services, YouTube included.
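The division of labor described above - storage on one side, number-crunching on the other - can be sketched in miniature. The following Python snippet is purely illustrative and bears no relation to Google's actual implementation: it shows the MapReduce programming model applied to the indexing problem, with a map step emitting (word, document) pairs, a shuffle step grouping them by word, and a reduce step producing the posting lists a search index is built from.

```python
from collections import defaultdict

def map_phase(doc_id, text):
    # Map: emit a (word, doc_id) pair for every word in the document.
    for word in text.lower().split():
        yield (word, doc_id)

def shuffle(pairs):
    # Shuffle: group all emitted pairs by their key (the word).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(word, doc_ids):
    # Reduce: collapse each group into a posting list -
    # the sorted set of documents containing the word.
    return (word, sorted(set(doc_ids)))

# Toy corpus standing in for crawled web pages (hypothetical data).
docs = {1: "google tests caffeine", 2: "caffeine revamps google search"}

pairs = [p for doc_id, text in docs.items() for p in map_phase(doc_id, text)]
index = dict(reduce_phase(w, ids) for w, ids in shuffle(pairs).items())
# index["caffeine"] -> [1, 2]: both documents mention "caffeine"
```

In the real system each phase runs in parallel across thousands of machines, with the file system holding the intermediate data between steps - which is why a file-system rewrite like GFS2 matters to indexing speed.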
