EU urged to ignore net neutrality delusions, choose science instead
Establish facts before making broadband regulations? Might be an idea
A broken theory
The underlying theory of ‘net neutrality’ advocates a virtuous cycle of innovation. The more users there are, the more applications get written, which drives more users. The leap is then made to ‘neutrality’. This Utopian ideal (single class of service, ‘best effort’, users pay all performance costs) supposedly maximises the flywheel effect.
The presumptive basis is to minimise risk and cost to developers, and maximise choice for users. This theory is flawed in five key ways:
- It assumes applications get the predictable performance they need. We can be sure that many applications don’t exist today because the performance of ‘best effort’ is unpredictable, so by definition they aren’t written and don’t get traction.
- It assumes that all users and developers are internalising their costs. They are not. Many applications are effectively pollution of a shared resource, and protocols are aggressively fighting for finite resources.
- It assumes there is no cost of association. A flat global address space where everything is reachable may sound attractive, but it comes with non-zero security and routing costs.
- It assumes that developers are entitled to write distributed applications with no engineering costs for performance (e.g. issuing profiles to DPI vendors, marking traffic). This is delusional.
- It assumes there is a mechanism for users to configure performance directly when needed. Today, that is absent. Regulators that attempt to sustain today’s mispricing of performance will find their rules incentivise a mis-allocation of resources, open up market arbitrages, and repel capital from the telecoms industry.
What should Europe do? Ignore the lawgeneers, and be scientific
The FCC went ahead and made rules about ‘net neutrality’ without getting its technical house in order first. This was done at the behest of cohorts of well-funded lobbying lawyers.
As a result it has put at risk the FCC’s credibility, since those rules are in conflict with the technical and economic reality of broadband. The article cited here is merely an exemplar of a sizable body of academic literature on ‘net neutrality’.
This literature exists in a self-referential citation bubble disconnected from actual broadband network operation. A common failing is to call for ‘faster than math’ packet scheduling.
This does our industry and society a disservice, and harms the credibility of the institutions whose names are attached to these works. Their authors' misguided attempts to control the definition and direction of ISP services must be resisted.
I strongly urge European regulators to ignore these campaigning ‘lawgeneers’. They have no ‘skin in the game’, so suffer no consequences for their pronouncements based on false technical assumptions.
This is a form of ‘moral hazard’. At least ISPs have a stake in the long-term viability of their services. The way forward is for regulators to establish a solid body of scientific knowledge within which the necessary debates can occur.
This needs to be done by stochastics experts and computer scientists, not lawyers. The one (and only) thing that should be ‘neutral’ is the resulting framework in which a debate over justice and fairness is held.
In particular, broadband has performance and cost constraints. So what are they? We can then have a policy debate that sits within those constraints, just as spectrum policy respects the laws of physics and electromagnetism.
Ofcom has laudably made such a move to establish a basis of scientific fact from which to make broadband regulations. It has cleanly separated the science and policy issues. This process needs to continue and spread.
If you would like to join a movement for reality-based regulation, please do feel free to get in touch to discuss how this might be brought about.
A version of this article appeared at Martin's blog, and is reprinted with permission.