Boffins: Confusing distributed ledger tech definitions create 'unrealistic expectations' about what it can do

Report proposes tight conceptual DLT framework

Poorly defined and inconsistent terminology for distributed ledger technology systems has led to misconceptions and unrealistic expectations, academics have said.

In a published report, the University of Cambridge Judge Business School researchers propose a new formal definition – and say that using it would render a number of firms' systems "potential" rather than "pure" DLT systems.

The document (PDF), which runs to a whopping 109 pages, takes a considered look at the state of DLT systems.

It is pitched as an addition to the discussion about the technology, rather than an authoritative take on it, and takes aim at much of the hype around DLT systems.

The team, led by Michel Rauchs, detailed the concepts and properties of DLT systems, as well as the choices and trade-offs that are made in their design, architecture and governance and how these could impact the system.

Underlying this is the argument that the expansion and evolution of DLT has resulted in “the widespread use of language and terminology which is frequently fuzzy, imprecise, and inconsistent across different projects”.

Existing definitions range from being so specific they are inaccessible to general audiences, to being so simplistic that they don't distinguish DLT from a traditional database.

However, the lack of common terminology "has resulted in misconceptions and the widespread formation of unrealistic expectations as to what this technology can achieve", the authors said.

"Left unsolved, this disorderly use of language and conceptual terminology could hinder development within the DLT sector, and may present society and industry with legal uncertainty and financial risks which are as yet unrecognised."

The report proposed a formal definition for a DLT system, setting out five properties that any such system should be able to ensure. It says a DLT:

  1. enables a network of independent participants to establish a consensus around...
  2. the authoritative ordering of cryptographically validated ("signed") transactions. These records are made...
  3. persistent by replicating the data across multiple nodes, and...
  4. tamper-evident by linking them by cryptographic hashes.
  5. The shared result of the reconciliation/consensus process – the "ledger" – serves as the authoritative version for these records.

“A DLT system is a distributed recordkeeping system that operates in an adversarial environment and is collectively maintained and updated by multiple entities. Every participant needs to be able to independently verify the validity and integrity of transactions and ultimately the system state. Finally, any attempt to tamper with transaction history needs to be trivial to detect and difficult to perform.”
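The tamper-evidence property in that definition – linking records by cryptographic hashes so that "any attempt to tamper with transaction history needs to be trivial to detect" – can be illustrated with a minimal sketch. This is not code from the report, just a toy hash chain using Python's standard library; the record fields and transaction strings are invented for illustration:

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Deterministically hash a record's contents (canonical JSON)."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(ledger: list, tx: str) -> None:
    """Append a transaction, linking it to the previous record's hash."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    record = {"tx": tx, "prev": prev}
    record["hash"] = record_hash(record)
    ledger.append(record)

def verify(ledger: list) -> bool:
    """Re-derive every hash and link; any edit breaks the chain."""
    prev = "0" * 64
    for rec in ledger:
        body = {"tx": rec["tx"], "prev": rec["prev"]}
        if rec["prev"] != prev or rec["hash"] != record_hash(body):
            return False
        prev = rec["hash"]
    return True

ledger = []
for tx in ["alice->bob:5", "bob->carol:2"]:
    append(ledger, tx)

assert verify(ledger)              # untouched chain verifies
ledger[0]["tx"] = "alice->bob:50"  # rewrite history...
assert not verify(ledger)          # ...and tampering is immediately evident
```

Because each record's hash covers the previous record's hash, changing any historical entry invalidates every later link – which is why the report treats hash-linking as what makes the ledger tamper-evident, while multi-party replication and consensus are what make tampering difficult to actually pull off.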

When this definition is used, the authors said that some self-proclaimed DLT systems don’t tick all the boxes. These “can thus only be considered ‘potential DLT systems’ that have the basic architectural features to allow eventual evolution into ‘pure’ DLT systems”.

An example of such a system is Ripple, with the report saying that “Ripple Labs’ influence over validator nodes makes both multi-party consensus and tamper resistance properties contentious”.

Three layers and a framework

Beyond the formal definition, the report aims to drill down into DLT systems by dividing them into three interdependent core layers: protocol, network and data.

The protocol layer defines, manages and updates the global ruleset that governs the system; the network layer implements this ruleset and performs the steps required to reach system-wide consensus; and the data layer specifies the nature and meaning of the data over which agreement is reached.

Each of these layers is broken down into components, and the components into processes. The report also sets out the actors in the system – developers, administrators, gateways and participants – and where they are most active.

In addition, the authors have looked at how decisions impact the overall system, and where trade-offs have to be made, concluding that the most common trade-off is between decentralisation and performance.

For instance, the paper noted that early DLT systems focused on keeping all aspects of the system decentralised to improve censorship resistance. But this “came at significant cost: inefficient redundancy, inherent scaling limitations, low throughput, slow confirmation speed, high energy costs, and poor user experience”.

Pulling the report together, the authors use the layers, components and processes to create a “conceptual framework” that can be used to identify, analyse and compare DLT systems.

They pitch it as a way for regulators to see exactly where authority is held in a DLT system and in that way assess who can be held accountable for the resulting technology and outcomes.

Meanwhile, investors could use it to better understand the credibility of the proposals they receive – venture capital firms have complained about being inundated with white papers that have little real potential.

The authors also suggested that the framework could be used by businesses and engineers developing their own systems, and by academics and researchers as a foundation for their work.

However, the team acknowledged that, although they tried to keep the analysis objective, it is hard to objectively quantify the more abstract aspects of DLT systems, like decentralisation.

As such, some subjectivity will have crept in, based on their conceptions of the technology stack and the roles of actors. Proposed future work could look at the more technical aspects of the processes. ®




Biting the hand that feeds IT © 1998–2018