Wi-Fi was MEANT to be this way: Antennas and standards, 802.11 style

Plus: Why your phone's (sometimes) crap at wireless

Forged nearly 20 years ago, the 802.11 wireless networking standard was responsible for cutting the cord and letting us roam. During that time, 802.11 has evolved as devices using it have both proliferated and got smaller – while the data they swallow has grown in quantity and in size.

In March the IEEE OK’d the latest chapter in the 802.11 story – 802.11ay. This packs more bandwidth into the 60GHz spectrum and promises speeds of up to 20Gbit/sec – ten thousand times faster than 1997’s standard. It targets DisplayPort, HDMI, and USB, suggesting it will be used to serve short-range, high-bandwidth connectivity needs such as TV and monitor displays.
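For the back-of-the-envelope arithmetic behind that headline figure, here’s a quick sketch in Python (the variable names are ours, and the numbers are simply the theoretical peaks quoted above):

    # Headline claim check: 802.11ay's promised peak versus the
    # original 1997 standard's theoretical ceiling.
    ay_bps = 20e9      # 802.11ay: up to 20Gbit/sec
    legacy_bps = 2e6   # 802.11-1997: up to 2Mbit/sec
    print(ay_bps / legacy_bps)  # 10000.0 -- ten thousand times faster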

Consider 802.11ay a beefed-up version of 802.11ad, with the same broad applications.

And you may ask yourself, well... how did I get here?

The original 802.11 standard was ratified in 1997, specifying transmission at up to a theoretical 2Mbit/sec. Since then, the 802.11 standards process has become an alphabet soup, with different extensions to the standard being developed under different combinations of letters.

It was always meant to be this way. At any point, the IEEE is working on several of these extensions, each usually addressing changes at a different level of the networking stack that underpins the standard. For example, 802.11i, ratified in 2004, was an enhancement to security at the MAC layer, but it eventually melted into a maintenance revision of the main standard in 2007.

802.11a and b were the first extensions to the standard, ratified in 1999. These brought 802.11 into the realm of Ethernet transmission speeds, offering theoretical data transfer of 54Mbit/sec and 11Mbit/sec respectively.

It was this pair of standards – the first 802.11 to earn the Wi-Fi moniker – that really began to draw industry interest as a means to finally cut the cord in the office.

802.11a allowed operation in the 5GHz spectrum, which at the time was far less congested than 802.11b’s 2.4GHz – a band heavily polluted by everything from microwaves to baby monitors.

Because 802.11a and b worked on different frequencies, they didn’t speak to each other. To solve the problem, the IEEE brought out 802.11g, which provided the same raw throughput as the 5GHz standard but operated over the 2.4GHz frequency. That enabled it to interoperate with 802.11b kit, but at far higher speeds.
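As a toy illustration of why the bands mattered – radios have to share a frequency band before they can talk at all – here’s a sketch in Python using the theoretical peaks quoted above (the data structure and function are ours, not anything defined by the standards):

    # Band and theoretical peak rate for the early 802.11 extensions.
    standards = {
        "802.11a": {"band_ghz": 5.0, "peak_mbps": 54},
        "802.11b": {"band_ghz": 2.4, "peak_mbps": 11},
        "802.11g": {"band_ghz": 2.4, "peak_mbps": 54},
    }

    def can_interoperate(x, y):
        # Two radios can only talk if they share a frequency band.
        return standards[x]["band_ghz"] == standards[y]["band_ghz"]

    print(can_interoperate("802.11a", "802.11b"))  # False -- different bands
    print(can_interoperate("802.11g", "802.11b"))  # True  -- both on 2.4GHz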

What’s a "draft-n" access point, anyway?

Sometimes the way the industry handles standards that are still being ratified confuses consumers. 802.11n caused particular problems: when its first draft was rejected at the initial vote, a swathe of kit labelled “draft n” and “pre-n” appeared for bandwidth-hungry punters to buy while the IEEE worked on a second draft. The Wi-Fi Alliance has a lot to do with this, because it has the final say on which Wi-Fi equipment gets to carry its logo – and it has certified pre-standard equipment to let vendors get it out of the door and onto the shelves.

The industry was eager to push out 802.11n because of its significant performance enhancements. This was partly down to the physical configuration of the equipment: 802.11n was the first of the extensions to really innovate in antenna science.

The standard used multiple input, multiple output (MIMO) as a means of enhancing throughput. MIMO uses multiple antennas to send and receive several signals at once – known as spatial streams – which boosts throughput and makes signals easier to pick up from further away. MIMO terminology refers to the number of antennas in the access point and in the device (say, 3×2 if your access point has three but your laptop only has two). The more antennas, the better.
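Here’s a minimal sketch of how that antenna arithmetic plays out, assuming peak throughput simply scales with the number of usable spatial streams (real 802.11n rates also depend on channel width, modulation and guard interval – the 72.2Mbit/sec per-stream figure used here is one common 20MHz case, and the function is purely illustrative):

    def theoretical_rate_mbps(ap_antennas, client_antennas, per_stream_mbps=72.2):
        # The usable stream count is capped by whichever end has fewer
        # antennas: a three-antenna access point talking to a two-antenna
        # laptop still only runs two spatial streams.
        streams = min(ap_antennas, client_antennas)
        return streams * per_stream_mbps

    print(round(theoretical_rate_mbps(3, 2), 1))  # 144.4 -- the laptop is the bottleneck
    print(round(theoretical_rate_mbps(3, 3), 1))  # 216.6 -- all three streams in play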

How do vendors deal with details in pre-standard equipment that may change in the final standard? The risk is partly offset by the IEEE’s process for creating an 802.11 extension: working out what must be nailed down in hardware, and what can be put into software. This allows at least some functionality to be altered later by a firmware update.
