Facebook's turbo-charged Instant Articles: Another brick in the wall
Want your news crud-free and open? There is hope
Facebook recently opened up its Instant Articles platform to all publishers after an early test run with an anointed few. In a nutshell, Instant Articles strips out everything a publisher has festooned across its site.
In its current form, it uses a specially crafted RSS feed to reformat articles for Facebook.
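The feed itself looks like ordinary RSS 2.0, with the cleaned-up article markup carried inside a content:encoded element. The sketch below shows the general shape only; the exact markup Facebook requires (head metadata, which elements are allowed, and so on) comes from its Instant Articles spec, not from this example.

```xml
<!-- Illustrative sketch of an Instant Articles-style feed item.
     Element names beyond standard RSS 2.0 are placeholders, not spec excerpts. -->
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Example Publisher</title>
    <link>http://example.com/</link>
    <item>
      <title>Article headline</title>
      <link>http://example.com/article</link>
      <guid>http://example.com/article</guid>
      <content:encoded>
        <![CDATA[
          <!doctype html>
          <html>
            <head><link rel="canonical" href="http://example.com/article"/></head>
            <body>
              <article><p>Stripped-down article body goes here.</p></article>
            </body>
          </html>
        ]]>
      </content:encoded>
    </item>
  </channel>
</rss>
```

Facebook's servers poll a feed like this and re-render each item natively inside its app, which is how the company guarantees the load time.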
The result is a cleaner reading experience that also loads much faster.
Google has a very similar publishing tool known as Google AMP, though - in true Google fashion - AMP eschews existing standards like RSS or JSON in favor of re-inventing the wheel. Apple's News tool is designed to do much the same thing, but it too uses an RSS-based approach.
The current situation - three competing formats for three competing publishing platforms - reeks of the XKCD favorite, Standards. It's also a lot of work for publishers who'd like to support all three.
Aside from a few platform-specific details - all three want media like images or video wrapped in different tags - the overall goal of these tools is the same. They're performance frameworks.
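To make that concrete, here's how a single image ends up wrapped differently on two of the platforms. The AMP form reflects AMP's documented amp-img element; the Instant Articles form is a hedged sketch of the general shape rather than an exact spec excerpt.

```html
<!-- Standard HTML -->
<img src="photo.jpg" width="640" height="480" alt="A photo">

<!-- Google AMP: a custom element with required dimensions,
     so the layout can be computed before the image loads -->
<amp-img src="photo.jpg" width="640" height="480" layout="responsive" alt="A photo"></amp-img>

<!-- Facebook Instant Articles: media sits inside figure elements (sketch) -->
<figure>
  <img src="photo.jpg" />
</figure>
```

Apple News, for its part, describes media as JSON components rather than markup at all - hence the re-tagging work for any publisher targeting all three.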
However, with a couple of exceptions, most notably the use of a content distribution network to cache pages, there's nothing in Google's AMP spec or Facebook's Instant Article feeds that publishers can't do for themselves.
Except that they aren't doing it for themselves, so Google and Facebook have taken matters into their own hands, offering the promise of their content hungry audiences in exchange for less encumbered content.
So what do publishers get out of reformatting their content to suit Google and Facebook? Nothing directly. What Instant Articles and Google AMP pages provide is verification for the platforms themselves. Google knows that an AMP page will be fast because - assuming it's validly formatted - it can't not be fast. Ditto properly formatted Instant Article content. Performance is the whole goal of both efforts, and performance is built into the requirements of the format.
The downside for publishers, and ultimately the rest of us, is that this verification comes from using a specific set of tools - tools that Google may decide it doesn't care about in two years, tools that Facebook may alter or kill at any time. The verification of speed that's inherent in these formats is inextricably tied to the platforms they're a part of and ultimately only helps those platforms.
Visit a slow site outside of Facebook's mobile app and it's still going to be slow, and it's still going to chew through your data plan thanks to its superfluous downloads.
What if the web itself had some kind of performance policy? What if there were a standardized way of publishing a subsection of content with the express goal of providing a stripped down, fast reading experience?
That's the goal behind a proposed web standard known as Content Performance Policy from the World Wide Web Consortium’s Web Platform Incubator Community Group.
Yes, folks, another standard to save us from a quagmire of competing quasi-standards. But, remember, none of the other offerings are actual, official standards.
To be fair, the Content Performance Policy proposal is a long way from being an actual standard, too. But the brainchild of web developers Tim Kadlec and Yoav Weiss aims to give developers a "standardized way to provide similar verification [to what AMP offers]. Something that would avoid forcing developers into the use of a specific tool and the taste of 'walled-garden' that comes with it."
The proposed spec lists a variety of optimizations sites could use to speed up their content, including delayed image loading, throttled CPU consumption, no auto-play of audio or video content, and no external scripts, among other things.
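One way such a policy could be declared is a response header in the spirit of Content-Security-Policy: the page promises a set of restrictions, and any consumer can verify it keeps them. The directive names below are hypothetical - the proposal is still at the incubator stage and has not settled on syntax.

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Content-Performance-Policy: no-external-scripts; no-autoplay-media; lazy-images
```

A user-agent that sees the header can then hold the page to it - or decline to treat the page as "fast" if it cheats.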
But since publishers and site owners aren't actually following those best practices, the standard helps define an alternative set of content. Which is to say, a publisher would do with a CPP-formatted page exactly what it does now with Google AMP pages: publish it alongside the regular, bloated content.
In short, it standardizes current best practices for creating high-performance websites. Google and Facebook would still get all the benefits of their homegrown offerings - namely, a guarantee that the page in question will load quickly - but so would every other content-consuming party on the web.
User-agents would then be able to choose which content to display. That's helpful for user-agents that pull content into other sites - Facebook when it includes articles in its timeline, say, or Google's carousel - but it would also allow a mobile browser on a limited-bandwidth connection to do the same thing.
Rather than the content blockers iOS currently supports, users could simply check a box to "prefer faster pages" when sites make a CPP page available. In fact, from a user's point of view it's hard to imagine a scenario in which the faster-loading, less cluttered content CPP envisions would not be preferable.
It's not just user-agents, though. CPP might also end up being the carrot that convinces publishers to come back around to building faster websites.
If the vast majority of traffic is going to the optimized version of a site while the festooned version sits virtually unused, perhaps publishers will finally come around.
Without a standardized way to build and verify fast pages, publishers, users - everybody - remains at the mercy of the walled gardens jockeying for de facto standard status. ®