Wi-Fi Has Rocked Our World

By Jamie Easley, Airheads Community Manager

Not having Wi-Fi has become a luxury good. People pay to vacation off the grid. Doing a “digital detox” has become a thing.

But in reality, I don’t want to live without the huge conveniences of mobility. At home, my front door automatically unlocks as I approach. I use Wi-Fi to turn the lights on and off and change the temperature. No one in my house has any tolerance for games that lag at precisely the wrong moment or a pause in Game of Thrones.

I use Wi-Fi to track calories burned as I jam up monster hills on my bike. My heart rate monitor really gets going when I’m on a motorcycle adventure. And, of course, Wi-Fi helps me make sure I get enough beauty sleep.

A workplace without Wi-Fi might as well have chalkboards and oil lanterns. I wouldn’t be managing the Airheads community. I wouldn’t have met so many people who are so smart and passionate about networking. Being tethered to my office would drive me nuts, although I do admit that I wish I had fewer conference calls and emails!

It wasn’t really that long ago that wireless LANs and smartphones were mind-blowingly new. BlackBerry was the original addictive device. Today, the Palm Pilot, Apple Newton and the much-maligned Windows Phone are museum pieces.

Let’s take a quick look at the origins of Wi-Fi.

A Look Back at the History of Wi-Fi
The original IEEE 802.11 standard was ratified in 1997. At the time, communicating at 2Mbps without wires seemed like a miracle. A 10Mbps wired Ethernet connection was much faster, but wireless LANs were a huge convenience. 802.11b was the first widely accepted wireless standard, which was followed by 11a, 11g, 11n, 11ac and now, 11ax.

Back in 1999, when 802.11b was introduced, mobile was really catching on. Apple introduced the iBook. It was amazing to walk into a conference room with a laptop, sit down and get to work without fumbling around for an Ethernet jack. At 11Mbps, the speed paled in comparison to Fast Ethernet, but the convenience was unbeatable. Unfortunately, microwave ovens, baby monitors, cordless telephones, Bluetooth headsets and many other things also used the 2.4GHz band, which often resulted in RF interference and poor performance. (Can you hold off on microwaving your burrito while I download this file?)

802.11a, which was introduced at the same time as 802.11b, operates in the 5GHz band with a maximum data rate of 54Mbps. 802.11a was widely adopted, especially in companies. Using the 5GHz band overcame the crowding issue in the 2.4GHz band, but the higher frequency also meant that the overall range was shorter. An 802.11a access point could cover less than a fourth of the area of an 802.11b/g radio. Talk about hitting a brick wall.

The next iteration, 802.11g, was introduced in 2003. 802.11g used the 2.4GHz band, but had a new modulation scheme. It could reach a theoretical data rate of 54Mbps, but about 22Mbps was more realistic. 802.11g was adopted rapidly, and most wireless access points then supported 802.11a as well as 802.11b/g. That was pretty cool, although if a legacy 802.11b client joined, the performance of the overall network would often suffer.

Mobility continued to capture people’s imaginations. Apple introduced the iPhone in 2007, and everything changed. Until then, most wireless networks were designed to maximize RF coverage, but as the number of mobile devices began to grow, the directive changed to designing for capacity.

Ratified in 2009, 802.11n was a workhorse. It could operate on either the 2.4GHz or the 5GHz band. One of the key innovations was Multiple Input Multiple Output (MIMO), which enabled the use of multiple antennas and resulted in significant increases in the data rate without the need for higher bandwidth or transmit power.

802.11ac introduced the concept of gigabit Wi-Fi. Finally, wireless speeds were approaching wired speeds. With the 2.4GHz band so crowded, the attention shifted to using the 5GHz band. Wave 1 of the standard was ratified in 2013, and promised a 1.3Gbps data rate. Wider channels, more spatial streams and higher-order modulation contributed to the better performance.

Wave 2 products hit the market a year later, offering speeds up to 6.9Gbps. Wave 2 also introduced Multi-user MIMO, which enabled access points to send multiple streams to multiple clients at a time, and is also used in 802.11ax. Today, most organizations use 802.11ac Wi-Fi.
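If you’re curious where those headline numbers come from, here’s a rough back-of-the-envelope sketch of how channel width, spatial streams and modulation multiply together into an 802.11ac PHY rate. The subcarrier counts and symbol timing below are the standard VHT OFDM parameters (short guard interval assumed); it’s a simplification, not a full MCS table.

```python
# Simplified 802.11ac (VHT) PHY rate arithmetic: data rate scales with
# channel width (more data subcarriers), spatial streams, modulation
# (bits per subcarrier per symbol) and the error-coding rate.

DATA_SUBCARRIERS = {20: 52, 40: 108, 80: 234, 160: 468}  # per channel width in MHz
SYMBOL_DURATION_S = 3.6e-6  # 3.2 us OFDM symbol + 0.4 us short guard interval

def phy_rate_mbps(width_mhz, spatial_streams, bits_per_subcarrier, coding_rate):
    """Theoretical PHY data rate in Mbps for one parameter combination."""
    bits_per_symbol = (DATA_SUBCARRIERS[width_mhz] * bits_per_subcarrier
                       * coding_rate * spatial_streams)
    return bits_per_symbol / SYMBOL_DURATION_S / 1e6

# Wave 1 headline: 80MHz channel, 3 streams, 256-QAM (8 bits), rate-5/6 coding
print(round(phy_rate_mbps(80, 3, 8, 5/6)))   # ~1300 Mbps, i.e. 1.3Gbps
# Wave 2 headline: 160MHz channel, 8 streams, 256-QAM, rate-5/6 coding
print(round(phy_rate_mbps(160, 8, 8, 5/6)))  # ~6933 Mbps, i.e. 6.9Gbps
```

Reassuringly, the two combinations above land right on the 1.3Gbps and 6.9Gbps figures quoted for Wave 1 and Wave 2; real-world throughput is of course well below these theoretical PHY rates.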

And that brings us to today. At Aruba, we’re pretty excited about 802.11ax. Well, the Wi-Fi Alliance renamed everything, so now it’s known as Wi-Fi 6.

802.11ax is designed to maximize capacity and efficiency—by as much as 4x compared to 802.11ac. It’s designed to enhance real-world performance, especially in crowded environments like city centers, apartment buildings, and university and corporate campuses.

It’s also designed to deliver higher throughput for voice, email and other applications with short packets. And it enables the explosion of IoT, with support for long battery life and huge numbers of low-rate clients.

We have been writing a series of blogs about 802.11ax, which you should check out. A good place to start is to read Goals and Key Features of 802.11ax by Peter Thornycroft, who is in Aruba’s CTO group and actively participates in the Wi-Fi Alliance and the 802.11 standards body.

Read all of our coverage on 802.11ax.