Putting Mojo in Your Benchmark Testing

by Robert Ferruolo (Dr. RF) on Jun 29, 2016

So Many Variables, So Difficult to Control

Wireless testing is hard: a change in any one variable can change the results. This post is for those of you who are interested in WLANs and in comparing the myriad AP performance tests published by AP vendors, third-party test labs, universities, and anyone else with an AP and a client.

The Mojo Methodology

I'm not casting aspersions on others' test results. For the most part, they try to be fair. The point of this post is to outline the methodology Mojo uses to produce fair and reproducible results. Our philosophy is to test all APs in accordance with the manufacturer's published, recommended best practices.

Fair and reproducible testing is a good start, but not sufficient. As fair as we are, we know there will be doubt, so we are being open and sharing our test methodology, just as we will share our test results.

Here are the rules we follow, and recommend anyone follow, when doing AP performance testing.

Use Cases

Each test is modeled on a real-life use case (sketched as a test matrix after this list), such as:

  • Downstream throughput - a person downloading a large file.
  • Multiuser video streaming - 30 students in a classroom watching video.
  • Multiuser mixed use - employees doing daily business, such as concurrent data (email, file downloads), voice (phone calls), and video (webinars, video conferencing).
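
To make these concrete, here is a minimal sketch of how such a test matrix might be encoded. The field names, traffic labels, and the client count for the mixed-use case are illustrative assumptions, not Mojo's actual test definitions.

```python
# Illustrative test matrix for the use cases above. Field names,
# traffic labels, and the mixed-use client count are assumptions
# for this sketch, not Mojo's actual test definitions.
USE_CASES = [
    {
        "name": "downstream_throughput",
        "clients": 1,
        "traffic": ["tcp_download_large_file"],
    },
    {
        "name": "multiuser_video_streaming",
        "clients": 30,  # 30 students watching video
        "traffic": ["video_stream"],
    },
    {
        "name": "multiuser_mixed_use",
        "clients": 20,  # assumption: the post does not give a count
        "traffic": ["data_email", "data_file_download",
                    "voice_call", "video_conference"],
    },
]
```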

Environment & Infrastructure

  • All tests are run in open air.
  • Before testing, monitor the channels to find one that is not in use (a sketch of this scan follows the list).
  • During tests, monitor the channels with both a packet capture and a spectrum analyzer.
  • Use an Ethernet switch with enough capacity that it is never the bottleneck.
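
As a sketch of the channel check above: the following counts beacon frames per channel in a monitor-mode capture (for example, one saved from a sniffer as scan.pcap) using the scapy library. A channel with few or no beacons is a reasonable first candidate; confirm it with the spectrum analyzer.

```python
# Count beacon frames per channel in a monitor-mode capture to find
# a quiet channel. Assumes a capture file named scan.pcap and the
# scapy library.
from collections import Counter

from scapy.all import rdpcap
from scapy.layers.dot11 import Dot11Beacon, Dot11Elt

beacons_per_channel = Counter()

for pkt in rdpcap("scan.pcap"):
    if not pkt.haslayer(Dot11Beacon):
        continue
    elt = pkt.getlayer(Dot11Elt)
    while elt is not None:
        if elt.ID == 3:  # DS Parameter Set carries the channel number
            beacons_per_channel[elt.info[0]] += 1
            break
        elt = elt.payload.getlayer(Dot11Elt)

for channel, count in sorted(beacons_per_channel.items()):
    print(f"channel {channel}: {count} beacons")
```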

AP Configuration

  • All APs in a test use the same channel.
  • APs are placed in the same location, with the same orientation.
  • APs are powered via PoE+ (802.3at).
  • Transmit power is set the same (e.g. auto or full, depending on the test).
  • We follow the vendor’s published best practices for the type of test (e.g. performance).
  • The running-config for each AP is collected once the configuration is set (a sketch of this step follows the list).
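
One possible way to archive the running-config, sketched with the netmiko library. The device type, credentials, address, and show command are assumptions; they vary by vendor, so substitute whatever CLI your AP exposes.

```python
# Archive the AP's running-config over SSH with netmiko.
# device_type, host, credentials, and the show command are
# assumptions; adjust them for your AP vendor's CLI.
from netmiko import ConnectHandler

ap = {
    "device_type": "cisco_ios",  # assumption: match your AP's CLI
    "host": "192.0.2.10",        # assumption: management address
    "username": "admin",
    "password": "secret",
}

with ConnectHandler(**ap) as conn:
    running_config = conn.send_command("show running-config")

with open("aput_running_config.txt", "w") as f:
    f.write(running_config)
```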

Client Configuration

  • The clients used in a test remain in the same position for the duration of the test.
  • Each client is oriented as it would commonly be used.
  • The client configuration is recorded: make and model, OS version, WLAN driver version, and WLAN settings (a sketch of such a record follows the list).
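
A minimal sketch of a script that records these client details to a file. Hardware model and WLAN driver version are platform-specific, so they are left as fields to fill in by hand.

```python
# Record the client details listed above to a JSON file. Hardware
# model and WLAN driver version are platform-specific, so they are
# left as manual fields.
import json
import platform

client_record = {
    "hostname": platform.node(),
    "os": f"{platform.system()} {platform.release()}",
    "arch": platform.machine(),
    "make_model": "FILL IN",           # e.g. laptop make and model
    "wlan_driver_version": "FILL IN",  # from the OS device manager
    "wlan_settings": "FILL IN",        # e.g. band preference
}

with open("client_record.json", "w") as f:
    json.dump(client_record, f, indent=2)
```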

Tools Used

We use industry-standard tools and settings:

  • Performance: Ixia Chariot
  • Video streaming: VLC
  • Wireless sniffer: Wireshark on a MacBook Pro
  • Spectrum analyzer: MetaGeek Chanalyzer Pro with Wi-Spy DBx

Methodology

  • Every effort is made to limit the variables and to minimize change during a test. The only thing that changes is the access point under test (APUT), which is swapped out between runs.
  • All variable settings are recorded: channel, channel width, test script used, distances, signal strength, etc.
  • Each test is run three times, and the average of the three runs is reported (a sketch of this step follows the list).
  • Once testing starts, any change in the environment, setup, configuration, or procedures is noted.
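
A sketch of the three-run averaging step. run_chariot_test() is a hypothetical wrapper around the actual traffic tool (Ixia Chariot in our case) and is not part of any real API.

```python
# Run a test three times and report the mean, per the methodology
# above. run_chariot_test() is a hypothetical stub to be wired up
# to the traffic generator.
from statistics import mean

def run_chariot_test(script: str) -> float:
    """Hypothetical: run one Chariot script, return throughput in Mbps."""
    raise NotImplementedError("wrap your traffic generator here")

def benchmark(script: str, runs: int = 3) -> float:
    results = [run_chariot_test(script) for _ in range(runs)]
    avg = mean(results)
    print(f"{script}: runs={results} mean={avg:.1f} Mbps")
    return avg
```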

Unexpected Results

If a test produces an unexpected result, we review the configuration, the setup, and the environment by doing the following:

  • Review the spectrum analyzer trace for channel interference.
  • Review the configuration to verify it is in line with best practices.
  • Validate the environment via a spot check with a different AP from the same vendor.
  • Rerun the test on the APUT and compare the results.
  • If the rerun results are in line with expectations, continue testing and use the new results (this decision is sketched below).
  • If the rerun results are consistent with the unexpected result, we report them as they are.
  • We attempt to determine the root cause of the unexpected behavior by reviewing the sniffer trace.
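
The rerun decision expressed as code, for clarity. The 10% tolerance is an assumption; the post does not define what counts as "in line with expectations."

```python
# The rerun decision above as code. The 10% tolerance is an
# assumption, not a threshold from the methodology.
def classify_rerun(expected_mbps: float, rerun_mbps: float,
                   tolerance: float = 0.10) -> str:
    """Decide what to do with a rerun after an unexpected result."""
    if abs(rerun_mbps - expected_mbps) <= tolerance * expected_mbps:
        return "expected: continue retesting and use the new results"
    return "still unexpected: report as-is and review the sniffer trace"
```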

Stay tuned for more on benchmark test results.

And perhaps you’d like a no-obligation demo of our access points and cloud service? 

Request Demo

Topics: WiFi, WLAN networks, testing