
Ofcom follows America into the White Space

What he said, only with a British accent

Ofcom is consulting on white space in order to decide exactly how closely we'll be following the database model recently approved by the FCC. It also appears to be dropping any notion of the UK going its own way.

That means no frequency sensing; a client-hub architecture; and private companies hosting databases of available spectrum. But it also means no licences or auctions either, and we're going to stop saying "cognitive radio" and "interleaved spectrum" too - 'cos the Americans never say that kind of thing.

The official term is now "white space", but it still refers to frequencies that carry TV signals elsewhere in the country and can be used locally at low power without risk of interference.

Ofcom is reserving the right to allocate some of the white space to local TV transmissions, but hasn't decided how many there will be (if any). The regulator has decided (subject to the consultation) that hub devices will have to check with an online database every two hours and that client devices can take their nod from the hub - so we're not completely copying the Americans: they call those Mode II and Mode I devices respectively, terms we'll no doubt be adopting soon.

We're also adopting the US model of having multiple databases, with Ofcom hosting a list of those databases. That approach has led to more than a year of debate in the USA, as parties argue about the architecture and synchronisation of those databases as well as who's going to run them. Ofcom isn't being drawn on that stuff, and is presumably waiting to see what the FCC decides.

Ofcom does describe a process by which a hub device connects, every two hours, to a server hosted by Ofcom to download a list of approved databases. The regulator's server then checks that list for a preferred database based on the device's location, height, and model number. The database then passes back a list of frequencies which might be clear (no promises, there might be another white space user beside you). The hub then picks one, and passes its decision on to the client devices.
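For the curious, that two-hourly dance can be sketched in a few lines of Python. Everything here is an illustrative assumption - Ofcom's consultation specifies no wire format, database names, or channel-selection policy, so the hostnames, stub responses, and "take the first channel offered" logic are all made up for the example:

```python
from dataclasses import dataclass
from typing import List

CHECK_INTERVAL_HOURS = 2  # Ofcom's proposed cadence for hub re-checks

@dataclass
class HubDevice:
    # The parameters the database query is said to depend on
    lat: float
    lon: float
    height_m: float
    model: str

def list_approved_databases() -> List[str]:
    # Stand-in for downloading Ofcom's list of approved databases;
    # these hostnames are hypothetical
    return ["wsdb.example-one.test", "wsdb.example-two.test"]

def query_database(db_host: str, hub: HubDevice) -> List[int]:
    # Stand-in for querying one white-space database: it returns UHF
    # channels that *might* be clear here (no exclusivity promised -
    # another white space user may be right beside you)
    return [39, 40, 48]

def choose_channel(hub: HubDevice) -> int:
    databases = list_approved_databases()
    channels = query_database(databases[0], hub)  # pick a database
    chosen = channels[0]  # simplest possible policy: first channel offered
    # the hub would now pass `chosen` on to its client devices
    return chosen

hub = HubDevice(lat=55.95, lon=-3.19, height_m=10.0, model="WSD-EXAMPLE")
print(choose_channel(hub))
```

With the stubbed data above the hub lands on channel 39; a real deployment would repeat the whole lookup every two hours and cope with the database disagreeing with itself in between.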

How all of this is paid for is also up for debate. In the US Google offered to run the database for free, or, more accurately, in exchange for the location and model of every white space device in America. Other companies might want paying, and Ofcom is open to suggestions.

The problem is that these models ignore the most likely application of white space technology - point-to-point connections that provide internet access to communities that don't have it. Such connections don't need to be checked every two hours, and could be deployed now without databases and two-hourly checks - as demonstrated by the University of Strathclyde connecting up Scottish islands.

Light licensing (along the lines of the 5.8GHz, 50-quid-a-year, licence) would cover the administrative costs, and the spectrum could usefully (and quickly) address the not-spot problem. That model works even if it turns out that UK white space is just a mirage, as some predict.

But that's not what the Americans are doing, and not what Ofcom is seeking comment on - comment which needs to be with them by 7 December. Most of it is not nearly as much fun as predicting the next Wi-Fi, even if it will be (by Ofcom's own guess) 2014 before any of it happens. ®
