Google touts real-time RSS transplant

What's all the PubSubHubbub?

Web 2.0 NY Google is trumpeting a new messaging protocol it insists on calling PubSubHubbub.

Brett Slatkin - a software engineer on Google’s App Engine team - demonstrated the protocol this week at the Web 2.0 Expo in New York. It aims to turn RSS and Atom into real-time content delivery mechanisms - and maybe even revamp Google search.

In a traditional RSS or Atom implementation, there's a publisher, and there's a subscriber. The subscriber polls the publisher for updates, and if the publisher has an update, it pushes it down the pipe. As Slatkin explains, this can be costly for organizations posting content. It results in heavy traffic from subscribers polling for content, and unnecessary bandwidth is eaten up with each push: entire feeds are sent to subscribers, not just the changed entries.

And since the subscriber has the responsibility of requesting new content, RSS feeds aren't updated in real time. New content is only available once it's been requested. Say, for instance, a site has an RSS feed embedded on its homepage. New posts to that feed won't instantly appear on the site where the feed is embedded.
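The polling model Slatkin criticizes can be sketched in a few lines. This is a hypothetical illustration, not code from the talk: the subscriber fetches the entire feed every cycle, then hashes it just to discover whether anything changed at all - the bandwidth is spent either way.

```python
import hashlib

def feed_changed(feed_body, last_hash):
    """One polling cycle: the subscriber has already downloaded the
    whole feed body before it can even tell whether it's new.
    Returns (changed?, new_hash)."""
    digest = hashlib.sha256(feed_body.encode("utf-8")).hexdigest()
    return (digest != last_hash, digest)
```

A real client would at least honor ETag or Last-Modified headers to skip unchanged responses, but the structural problem stands: the subscriber asks, and asks again, and most answers are "nothing new".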

The drill

Simple in nature, PubSubHubbub relies on hubs and feed diffs to transform RSS and Atom feeds into real-time updates and to significantly reduce the amount of bandwidth used.

A publisher signs up with a hub provider, and the feed it sends to subscribers includes a declaration that points them to the hub's address, telling them the hub is a trusted entity. At that point, a subscriber has the option to register with the hub for real-time delivery.
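In the PubSubHubbub spec, the declaration is a link element with rel="hub" in the feed itself, and a subscriber registers by POSTing a form-encoded request - with hub.mode, hub.topic, and hub.callback parameters - to that hub. A minimal sketch, with hypothetical example.com URLs standing in for real endpoints:

```python
import re
from urllib.parse import urlencode

ATOM_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <link rel="hub" href="https://hub.example.com/"/>
  <link rel="self" href="https://publisher.example.com/feed.atom"/>
  <title>Example feed</title>
</feed>"""

def discover_hub(feed_xml):
    """Pull the hub URL out of the feed's <link rel="hub"> declaration.
    A regex keeps the sketch short; a real client would use an XML parser."""
    m = re.search(r'<link rel="hub" href="([^"]+)"', feed_xml)
    return m.group(1) if m else None

def subscription_request(topic_url, callback_url):
    """Build the form-encoded body a subscriber POSTs to the hub."""
    return urlencode({
        "hub.mode": "subscribe",
        "hub.topic": topic_url,       # the feed being subscribed to
        "hub.callback": callback_url, # where the hub will push updates
    })
```

The hub then verifies the callback before activating the subscription, so a subscriber can't point the firehose at someone else's server.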

Whenever the publisher adds new content to the feed, the feed is sent to the hub. The hub, in turn, looks for differences in the feed, removes the content that the subscriber has already received, and multicasts a partial feed that includes just the new content to subscribers.
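The hub's diff-and-fan-out step can be sketched as two small functions. The (entry_id, content) data model here is a hypothetical simplification - a real hub works on parsed Atom entries - but the logic is the one described above: drop what subscribers have already seen, push out only the remainder.

```python
def new_entries(previous_ids, feed_entries):
    """Diff step: keep only the entries the hub hasn't delivered yet.
    `feed_entries` is a list of (entry_id, content) pairs;
    `previous_ids` is the set of entry IDs already seen."""
    return [(eid, body) for eid, body in feed_entries
            if eid not in previous_ids]

def fan_out(entries, callbacks, post):
    """Multicast step: POST the partial feed - just the new entries -
    to every subscriber callback. The `post` function is injected so
    the sketch stays testable without a network."""
    for cb in callbacks:
        post(cb, entries)
```

The payoff is that bandwidth now scales with new content rather than with feed size times polling frequency.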

Because this is designed to work with RSS and Atom, publishers don’t have to implement new solutions. They can continue to use their existing feeds for instant syndication.

A publisher can also deploy its own hub. That way, Slatkin explained, a publisher can push content out to subscribers itself, in real time, the moment it's published.
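Whichever hub it uses, the publisher's side of the protocol is a lightweight ping: per the spec, it POSTs hub.mode=publish and the feed's URL to the hub, which then re-fetches the feed, diffs it, and fans the new entries out. A sketch of that ping body, with a hypothetical feed URL:

```python
from urllib.parse import urlencode

def publish_ping(feed_url):
    """Form-encoded body a publisher POSTs to its hub after adding
    content - no feed payload, just a pointer telling the hub to
    go fetch and diff the feed."""
    return urlencode({"hub.mode": "publish", "hub.url": feed_url})
```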
