Wi-Fi Alliance® has introduced three new enhancements to Wi-Fi security: WPA3, Enhanced Open, and Easy Connect.
How often do you say “Wow, this WiFi is great!”? WiFi is like a utility: you take it for granted until the lights don’t turn on or the water doesn’t come out of the tap. Just like the electrical grid or the water infrastructure, WiFi takes planning to implement correctly and maintenance to keep running smoothly.
The great news is that WiFi keeps getting smarter and Mojo is leading the way with Cognitive WiFi™. An example of our dedication to excellent user experience is how the C-130 uses its third radio and Dynamic Channel Selection (DCS) to quickly, reliably, and automatically detect disruptive interference.
We recently performed a benchmark test to see how well access points avoided channels with high WiFi and non-WiFi interference, both at boot up and during operation. We evaluated how well each AP avoided interference and how user experience was impacted.
The Mojo C-130 was the only access point to avoid interference 100% of the time, both at boot up and when interference was introduced on its operating channel. Every other solution either failed to avoid a channel with a constant interference source that made the channel unusable, or failed to change channels when channel utilization got so high that it severely degraded the user experience.
User experience was evaluated using the following quality score rating system:
Why don’t most enterprise WiFi access point vendors tell you what’s inside their APs? They don’t publish which WiFi chipset an AP uses, or its CPU specification; at best they state the amount of RAM. When you evaluate APs for your deployment, you should consider the hardware components: the hardware and the software running on it both impact the AP’s performance and user experience. The test results below demonstrate this.
While doing research on the Ruckus website for the R710, I noticed the statement “Up to 2 times extended range and coverage with Ruckus BeamFlex technology.” Challenge accepted! To evaluate this claim we used a distributed client test, which measures an AP’s downstream performance when its clients are spread near and far, from excellent to marginal signal strength and points in between. This test simulates the performance of the AP in a typical enterprise, carpeted environment.
When is the last time you said, “Wow, this WiFi is great!”? You don’t really notice WiFi when it works; you are more likely to say “This WiFi is crap” when it doesn’t meet your expectations. WiFi is no longer a convenience; it’s an essential utility, like electricity. You would like it to work every time and without hesitation, like turning on a light.
Like the power grid, one of the biggest challenges in designing a wireless network is capacity planning. The goal of capacity planning is to determine how many access points are needed to provide a good user experience. Deploying too many APs is a waste of money and can make performance worse, but deploying too few will cause user experience problems (the equivalent of brownouts) when an AP becomes oversubscribed.
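The capacity-planning trade-off described above can be sketched as a first-order estimate: size the deployment by both aggregate throughput demand and per-AP association limits, and take whichever constraint binds first. The function name and every number below are illustrative assumptions for the sketch, not figures from the post.

```python
import math

def aps_needed(num_clients: int,
               per_client_mbps: float,
               ap_capacity_mbps: float,
               max_clients_per_ap: int) -> int:
    """Estimate how many APs a space needs.

    Considers two constraints and returns the larger requirement:
      * throughput: total demand divided by effective per-AP capacity
      * association: how many clients one AP can reasonably serve
    """
    by_throughput = math.ceil(num_clients * per_client_mbps / ap_capacity_mbps)
    by_association = math.ceil(num_clients / max_clients_per_ap)
    return max(by_throughput, by_association)

# Illustrative: 200 clients at 2 Mbps each, 300 Mbps effective per AP,
# and a practical cap of 50 associated clients per AP.
print(aps_needed(200, 2.0, 300.0, 50))  # -> 4 (association-limited)
```

Here the association limit, not raw throughput, dictates the AP count; undersizing either constraint produces the "brownout" oversubscription the post warns about, while padding both wastes money and adds co-channel interference.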
Surprisingly, we have been receiving a lot of social media chatter from Meraki folks about our latest performance testing comparing the Mojo C-120 to Meraki’s MR53, two leading 802.11ac Wave 2 access points.
In a recent blog post we compared the performance of the Mojo C-120 to the Meraki MR42, highlighting results of a test we ran last spring. When we test, we test against the best of the competition, with the latest software and published best practices at that point in time. When that test was run, the MR42 was the best Meraki had to offer. Once Meraki made the MR53 available, we tested it; here are the results.
When we do competitive performance testing, we expect the premier APs from our competitors to be in the same ballpark. We were quite surprised by the poor showing of the Aruba IAP-325 in the 50-client, mixed-application test. The Aruba IAP-325 performed on par with the Mojo C-120 for the video and voice clients, but at the expense of the data clients, where only 40% met the 1 Mbps minimum data throughput standard.
Wireless testing is hard. Changing any variable can change the results. This post is for those of you who are interested in WLANs and in comparing the myriad AP performance tests published by AP vendors, third-party test labs, universities, and anyone who has an AP and a client.