US White Space XML blueprints emitted
Database synchronisation, the hipster web way
Eight of the companies planning to run US White Space databases have published the XML schema and polling protocol they intend to use to keep their data synchronised.
The two documents, the Database Interoperability Specification (65-page PDF/1.2MB, very XMLy) and the Channel Calculation Guideline (19-page PDF/965KB, very mathematical), have been submitted to the FCC, along with a commitment from the original eight companies awarded the right to run White Space databases that they'll follow the specifications.
That's important because all ten companies (two joined later) running the databases must provide the same data, or the whole thing breaks down. The firms, which form the White Space Database Administrator Group, have created specifications for web-based polling and for pushing copies of XML files detailing changes, or a complete copy of the database where the last update is more than 72 hours old.
Polling between databases can be done every second if necessary, or every 1,000 seconds for the more-laid-back data source. As the databases are only supposed to contain information about television broadcasts, this might seem excessive, but the mechanisms involved have applications way beyond unused TV spectrum.
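The two rules above, pick a payload based on how stale the peer's data is, and keep the poll interval inside the one-to-1,000-second window, can be sketched roughly as follows. This is an illustration only: the function and constant names are our own, not anything from the specification.

```python
from datetime import datetime, timedelta

# Databases that haven't synced for more than 72 hours must pull a
# complete copy rather than an incremental XML change file.
FULL_COPY_THRESHOLD = timedelta(hours=72)

def choose_transfer(last_update: datetime, now: datetime) -> str:
    """Decide which payload to request from a peer database."""
    if now - last_update > FULL_COPY_THRESHOLD:
        return "full-copy"      # too stale: fetch the whole database
    return "incremental"        # fresh enough: just the XML of changes

def clamp_poll_interval(requested_seconds: float) -> float:
    """Keep the polling interval within the 1..1,000-second window."""
    return min(max(requested_seconds, 1.0), 1000.0)
```

So a database that last heard from a peer 80 hours ago would request a full copy, while one ten hours behind would ask only for the change file.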
White Space devices, be they access points or clients, will operate in TV-broadcast spectrum that isn't being used locally. When a White Space access point is switched on it connects, over the internet, to a chosen database and supplies its location. The database responds with a list of radio frequencies that are available locally. The access point selects one (or more) to use, and passes that information on to its clients.
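The lookup described above, an access point sends its location and gets back the locally free channels, might look something like this in outline. The request fields, response shape, and class names here are assumptions for illustration, not drawn from the FCC-filed specifications.

```python
def available_channels(database, latitude: float, longitude: float) -> list:
    """Ask a White Space database which TV channels are free at a location."""
    response = database.query({"lat": latitude, "lon": longitude})
    return sorted(response["free_channels"])

class FakeDatabase:
    """Stand-in for a real White Space database, for demonstration only."""
    def query(self, location: dict) -> dict:
        # A real database would look up licensed broadcasters near
        # the supplied coordinates; we return a canned answer.
        return {"free_channels": [35, 21, 27]}

channels = available_channels(FakeDatabase(), 38.9, -77.0)
ap_channel = channels[0]  # the access point picks one and tells its clients
```

The access point then advertises its chosen channel to clients over whatever air protocol the manufacturer prefers, which, as the article notes, is outside the database administrators' remit.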
How it communicates with its clients, and what protocol those clients use to communicate with the access point, is beyond the remit of the database administrators, and there'll likely be lots of alternatives, but the databases do need to communicate with each other to ensure the data is consistent.
Why so many databases?
One might imagine that a single database, perhaps run by a public authority, would be enough, but that would be denying the advantage of the free market in driving innovation. So there'll be ten databases supplying the same data and competing for customers.
The customers will likely be the access point manufacturers, rather than the end users. The cost will be tied up in the purchase price of the kit, though alternative business models are possible.
But if all you want to know is which TV channels are being used locally then synchronising ten databases every second might seem excessive: even in America television doesn't change that quickly.
But if one were to start registering wireless microphones then it would be more time-sensitive. The FCC has already conceded some bands for wireless mic use, but would like the ability to adjust that dynamically. It's also possible to imagine White Space devices themselves asking for spectrum to be reserved rather than just checking if it's free, and some in the industry are already talking about using White Space access points as sensors, to create a dynamic map of spectrum availability updated on a second-by-second basis.
That's a long way off, but if it worked then there's nothing to stop the same model being applied to other radio bands. There are huge swathes of spectrum owned by the Department of Defense, for example, which it would happily share as long as it could be guaranteed priority when needed, a job the White Space Database Administrator Group would happily see its protocol doing.
There's nothing very controversial in the proposals, which are largely unintelligible to those who don't speak XML, but they hammer down a lot of details which could be really important if the White Space model is going to become the way we manage radio spectrum in the future. ®