FCC rubberstamps rules for 'WiFi on steroids'

How to fill the white spaces


The FCC has approved the final rules for unlicensed access to the TV "white space" spectrum, paving the way for what is commonly called "WiFi on steroids."

The commission has dropped the sensing requirement for white space devices, though we still don't know who gets to run national databases telling people what frequencies they can use and where.

That means a fixed white space device, such as an internet access point, will only need to check in daily to confirm that no new TV transmitters have been switched on, while clients can take their cue from the access point, with neither device having to beacon its presence or monitor for other radio users. That should ensure that white space devices are cheap and plentiful, and that white space networking quickly spreads around the world.
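The FCC hasn't specified the check-in protocol yet, but the logic a fixed device will need is simple enough to sketch. This is a purely illustrative model, with an assumed response format and channel numbers, not the real database API:

```python
# Hypothetical sketch of the daily check-in a fixed white space device
# performs against a geolocation database. The response format and
# channel numbers are assumptions for illustration only.

def pick_channel(db_response, current_channel):
    """Given the database's list of channels free at our location,
    stay on the current channel if it's still allowed, otherwise
    vacate it for the broadcaster."""
    free = set(db_response["free_channels"])
    if current_channel in free:
        return current_channel   # no new TV transmitter: carry on
    if free:
        return min(free)         # channel now in use for TV: move
    return None                  # nothing free: shut the radio down

# Example: the daily check reveals channel 36 now carries TV.
response = {"free_channels": [21, 24, 39]}
print(pick_channel(response, current_channel=36))  # 21
```

Clients never run this logic themselves; they simply follow whichever channel the access point settles on.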

At today's vote the FCC decided that sensing would be optional, and tasked its Office of Engineering and Technology with working out the mechanisms by which the necessary databases will operate, as they end up as the only thing preventing white space users from destroying broadcast TV.

White spaces are frequencies allocated to TV broadcasting that aren't being used (locally) to broadcast TV. Television is broadcast from huge transmitters with enormous coverage areas, which means that even in densely populated areas there are gaps where a frequency can't be used for TV but could be used for short-range wireless, while in rural areas huge swathes of spectrum go unused.

Drawing showing how white spaces exist: two broadcasters can't use the same frequency, even though each covers large areas where the other's frequency lies empty. Those gaps are the white spaces.

A user at point A can happily transmit at 400MHz, despite the fact that the frequency can't be used for broadcast TV. Where point A lies, and how big it really is, are still being debated. Broadcasters get first dibs on the frequency, so if our chap starts interfering then the database will have to tell him to use a different frequency.

It had been suggested that equipment at point A would have to constantly sense to see if anyone else was using the channel, but that would have made the devices more expensive and never worked terribly well anyway, so the requirement has been dropped. Devices will still have to check in to ensure that a frequency isn't being used for TV transmissions, or by licensed wireless microphones.

In the UK we have a single body that keeps track of wireless microphones, which have been utilising white spaces in this way for decades. Google had proposed having a single database in the USA for TV and microphone users, run by Google obviously, but the FCC rejected that approach for a competitive model. With multiple databases being used in the US a wireless microphone user will only have to register with one, which will then synchronise with the others, so the next time a nearby white space device checks it can be told to clear the channel.
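That register-once-sync-everywhere arrangement can be modelled in a few lines. The class, method names, and venue below are assumptions made purely to illustrate the idea, not anything the FCC or the database operators have specified:

```python
# Toy model of the competitive-database arrangement: a wireless
# microphone user registers with any one database, which propagates
# the reservation to its peers, so a white space device querying any
# database is told to clear the channel. Illustrative only.

class WhiteSpaceDB:
    def __init__(self):
        self.reservations = set()   # (channel, location) pairs
        self.peers = []             # the rival databases

    def register_mic(self, channel, location):
        entry = (channel, location)
        self.reservations.add(entry)
        for peer in self.peers:
            peer.reservations.add(entry)   # synchronise with rivals

    def free_channels(self, location, all_channels):
        taken = {ch for ch, loc in self.reservations if loc == location}
        return sorted(set(all_channels) - taken)

# A theatre registers its mics with database 'a'; a device asking
# rival database 'b' still sees the channel as occupied.
a, b = WhiteSpaceDB(), WhiteSpaceDB()
a.peers, b.peers = [b], [a]
a.register_mic(36, "Broadway")
print(b.free_channels("Broadway", [21, 36, 39]))  # [21, 39]
```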

In most cases that shouldn't be a problem, as users generally know months in advance when they're going to need the spectrum - for a theatrical performance or sporting event. Reporters and more ad-hoc users don't tend to use white space anyway, but just to be sure the FCC has allocated two bands, comprising 12MHz of spectrum, for exclusive wireless microphone use.

That might sound sensible, but it also means that 12MHz of radio spectrum in the middle of rural Texas is unusable, just in case anyone fancies switching on a wireless mic. Rural Texans might have preferred that spectrum to be providing better internet access.

The most obvious use of white space is point-to-point connections: bridging between buildings miles apart without the need for line of sight, and that's where white space will initially be used, and quickly too. Spectrum Bridge is already running four test networks under a special licence, and Microsoft now connects up its campus buildings and vehicles using a white space network. Both companies hope to transition those test networks to the new rules before their temporary licences expire, providing continuity of service.

But for normal people it will be next year before we see any consumer kit. Manufacturers know the technology well, but have been waiting for the FCC decision before running up the production lines. Even now the FCC has to specify the database capabilities, and agree which companies are going to host databases. Then the kit can be tested, and the FCC will need to spend a month or two certifying it. So realistically it will be 2012 before we find out if white space really is wi-fi on steroids.

The first deployments will come from Wireless Internet Service Providers (WISPs), who are desperate to get access to white spaces in order to provide internet access to communities without having to worry about line of sight. The WISP community is also petitioning to have the maximum antenna height lifted, but the FCC is still considering that. Equally exuberant is xG Technology, who've been punting the concept of mobile phones in the (unlicensed) 900MHz band. xG reckons white space increases the capacity of an unlicensed network, making it more able to compete with the existing operators.

If all this works then we can expect the idea to jump the pond pretty quickly. US kit won't work over here: thanks to PAL our TV channels are 8MHz wide, while the US uses 6MHz-wide channels, and given that every white space access point will have to have GPS it won't be easy to import them. But if the exploitation is half as effective as the proponents claim then expect white space legislation to be rolled out very quickly.
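The channel-width mismatch is easy to see with a little arithmetic. UHF TV starts at 470MHz on both sides of the Atlantic, but the grids diverge immediately; the functions below assume the standard US plan (6MHz channels from channel 14) and UK plan (8MHz channels from channel 21):

```python
# Why US white space kit won't map onto UK spectrum: both UHF plans
# start at 470MHz, but the channel widths differ, so the grids never
# line up. Lower edges follow the standard US and UK channel plans.

def us_channel_edges(n):
    """US UHF plan: 6MHz channels, channel 14 starts at 470MHz."""
    lower = 470 + (n - 14) * 6
    return lower, lower + 6

def uk_channel_edges(n):
    """UK (PAL) UHF plan: 8MHz channels, channel 21 starts at 470MHz."""
    lower = 470 + (n - 21) * 8
    return lower, lower + 8

# The same channel number lands on entirely different frequencies.
print(us_channel_edges(21))   # (512, 518)
print(uk_channel_edges(21))   # (470, 478)
```

A radio and database built around one grid can't simply be dropped into the other, which is why European white space kit will need its own rules and its own silicon.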

And if it goes wrong, then the first devices will be sold, the TV companies will start seeing interference, and they'll complain. The databases will then shrink back the areas and frequencies in which white space can be used, eventually to the point where the technology is only useful for point-to-point connections in the most rural of areas near coastlines.

But worse... much worse... we will all have to accept that Dolly Parton was right.
