
Friday, August 12, 2011

Adopting Wireless Client Testing & Verification Procedures

As many professionals in the industry know, Wi-Fi clients cause the majority of issues on wireless networks. This stems from the ambiguity of implementation details for many features within the IEEE 802.11 specification. With the current 802.11-2007 standard clocking in at over 1,200 pages and the 802.11n amendment adding another 500+, the standard is already tremendously complex, yet it still does not define how some functions should be implemented in shipping products. This leads to varying interpretations of the standard, inconsistent behavior between different devices (and oftentimes between devices of the same model / part number), and, at worst, product incompatibilities. The Wi-Fi Alliance does an adequate job of ensuring basic compatibility. However, inconsistencies between infrastructure and client devices still exist and can cause major headaches for corporate IT departments burdened with supporting the mobile device influx alongside standard corporate-owned devices.

Organizations should adopt internal testing and verification procedures that certify all changes to infrastructure and client devices in a lab prior to rollout in production environments. These procedures ensure that most wireless connectivity and performance issues are identified and corrected in the lab before rollout, greatly improving network stability and reducing downtime.

In addition, since Wi-Fi utilizes a shared medium, understanding the capabilities and limitations of your client device base will enable network engineers and architects to design a solid solution that meets ALL client requirements in their environment. Unfortunately, this also means that the wireless network typically must be designed to support the least common denominator as far as client devices are concerned. Knowing device limitations will enable an organization to identify, prioritize, and appropriately budget for device replacement as well as integrate new requirements into device sourcing events.

Consider including the following types of tests in your procedures:
  1. Client Basic Association Test - assess the ability of the client device to establish a connection to the network using the applicable authentication and encryption methods approved for use.
  2. Radio Coverage Association Test - assess the ability of the device to re-establish a network connection after moving outside of the AP coverage area then back into the coverage area.
  3. Radio Strength Test - assess the RF signal strength and signal quality of the client device.
  4. Radio Interference Test - verify the ability of a device to remain associated to an AP during periods of interference at varying levels.
  5. Radio Scan Test - assess the ability of the device to scan only a subset of channels, improving active scanning performance and reducing latency during association and roaming.
  6. Radio Poor Signal Test - assess the ability of the client device to establish a network connection in an area with poor coverage, to determine effective receive sensitivity and the coverage boundary.
  7. Radio Roaming Analysis - assess the performance of the device when roaming between multiple APs using the applicable authentication and encryption methods approved for use. Hint - break down the roam into several phases to determine where delays may be occurring (probing, association, EAP, EAPoL key exchange). Also look for anomalies such as deauthentications or disassociations.
  8. DHCP Roaming Analysis - assess the client DHCP behavior and performance during network roaming (including DHCP renewal behavior).
  9. Application Behavior Testing - verify the ability of the application to function correctly under the given scenario, sometimes also referred to as User Acceptance Testing (UAT). This can also identify performance problems caused by incorrect application design or development (typically strict application timers designed for low-latency wired local area networks, not wireless networks).
  10. Network WAN Latency Test - assess the client and application performance given varying WAN latency (if the RADIUS or application server is remote across a WAN).
  11. Network WAN Load Test - assess the client and application performance given varying WAN load (if the RADIUS or application server is remote across a WAN).
  12. Network WLAN Load Test - assess the client and application performance with varying levels of WLAN clients on the same access point or channel.
  13. Network WLAN QoS Test - assess the client and application support for QoS traffic classification, marking, and prioritization over the air using 802.11e and IP DSCP.
  14. Battery Life Test - assess the battery life of mobile devices under various load conditions, usage frequency, and battery ages.
  15. Radio Power Save Operation - assess the interoperability and performance of the device with various power save modes of operation (PS-Poll, U-APSD, PSMP).
  16. Device Sleep / Hibernation / Screen Lock Behavior - assess the behavior of the device after being placed into a sleep, hibernate, or screen lock state and brought back awake.
Document each test scenario with a detailed description of the test case, the devices to be tested, the test setup, test execution steps, the results to be measured, measurement methods, and expected outcomes (success criteria), along with a logical diagram of the test environment and execution.
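As a sketch of how that documentation could also be kept in machine-readable form (the field names, device names, and success criteria below are my own illustrations, not prescribed by any standard), a simple Python record per test case might look like this:

```python
from dataclasses import dataclass


@dataclass
class WlanTestCase:
    """One documented test scenario; all field names are illustrative."""
    name: str                  # e.g. "Client Basic Association Test"
    description: str           # detailed description of the test case
    devices: list              # devices to be tested
    setup: str                 # test setup / topology notes
    steps: list                # test execution steps
    measurements: list         # results to be measured and how
    success_criteria: str      # expected outcome


# Hypothetical instance of test #1 from the list above
basic_assoc = WlanTestCase(
    name="Client Basic Association Test",
    description="Verify the client joins the WLAN with the approved "
                "authentication and encryption methods.",
    devices=["Example handheld scanner, firmware 2.1"],
    setup="Single lab AP, WPA2-Enterprise (PEAP) SSID",
    steps=["Power on device", "Join SSID", "Ping default gateway"],
    measurements=["Association success/failure", "Time to obtain IP address"],
    success_criteria="Association and DHCP complete within 10 seconds",
)
print(basic_assoc.name)
```

Keeping test cases as structured records rather than free-form prose makes it easier to version them alongside configuration "releases" and to diff them between test cycles.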

Also, document the procedures in a clear, concise, and detailed fashion to ensure process repeatability. This will allow the organization to establish consistent testing processes, establish network and device performance baselines, and compare new results to historical results. Repeatable and well-documented procedures also allow process handover to new staff members as roles and responsibilities change, and aid in network troubleshooting should the procedure need to be executed by an untrained employee at a remote site in an emergency.
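To make the comparison against historical baselines concrete, here is a minimal sketch (the metric names and 10% tolerance are assumptions for illustration, not recommendations from this post) that flags any metric deviating from its stored baseline value:

```python
def compare_to_baseline(current, baseline, tolerance=0.10):
    """Return names of metrics deviating from baseline by more than the
    given fractional tolerance. Metric names are illustrative only."""
    regressions = []
    for metric, base_value in baseline.items():
        if metric not in current:
            continue  # metric not measured this cycle
        if base_value == 0:
            continue  # avoid division by zero; handle separately in practice
        if abs(current[metric] - base_value) / base_value > tolerance:
            regressions.append(metric)
    return regressions


# Hypothetical roaming results vs. a baseline from an earlier release
baseline = {"roam_time_ms": 120, "retry_rate_pct": 5.0}
current = {"roam_time_ms": 210, "retry_rate_pct": 5.2}
print(compare_to_baseline(current, baseline))  # only roam_time_ms regressed
```

Even a crude check like this catches the common case where an infrastructure code upgrade quietly doubles roam times for one client type.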

Upon execution of each test case, the following data points should be recorded, analyzed, and included in test reports (these may vary between tests):
  • Infrastructure hardware models, software versions, and configurations deployed (perhaps grouped into configuration "releases" similar to common software development practices)
  • Client device operating system version, supplicant used, software versions, and driver versions
  • Functional test scenario observation of results
  • Multi-Channel packet capture and analysis
  • Automated protocol analysis (includes high level wireless statistics, such as retransmissions, channel utilization, etc.)
  • Spectrum analysis
Execution and analysis of these tests will require investment in advanced professional-grade wireless LAN tools, specifically for packet capture, protocol analysis, and spectrum analysis.
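The automated protocol analysis step can be sketched in a few lines once frame records have been exported from a capture tool. Assuming each frame is a simple dictionary with `type` and `retry` fields (these names are my own, not from any particular analyzer), a retransmission-rate summary might look like:

```python
def retry_rate(frames):
    """Percentage of data frames with the 802.11 Retry bit set.
    Frame records are assumed to be dicts exported from a capture tool;
    the 'type' and 'retry' field names are illustrative."""
    data = [f for f in frames if f.get("type") == "data"]
    if not data:
        return 0.0
    retries = sum(1 for f in data if f.get("retry"))
    return 100.0 * retries / len(data)


# Tiny hypothetical capture: 4 data frames, 1 of them a retransmission
capture = [
    {"type": "data", "retry": False},
    {"type": "data", "retry": True},
    {"type": "data", "retry": False},
    {"type": "data", "retry": False},
    {"type": "mgmt", "retry": False},  # management frames excluded
]
print(retry_rate(capture))  # 25.0
```

Tracking a statistic like this across test cycles turns the packet captures into a comparable baseline metric rather than a one-off troubleshooting artifact.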

Organizations should invoke these testing procedures in the following circumstances:
  • Wi-Fi infrastructure code upgrades
  • Wi-Fi infrastructure configuration changes
  • Device software upgrades (OS, driver, or supplicant)
  • New device deployments
  • New application deployments on wireless clients
A dedicated lab environment should be created to mimic the production environment as closely as possible. This will also allow testing to be performed in an isolated environment which cannot impact the production network.

Revolution or Evolution? - Andrew's Take
Defining, implementing, and regularly executing wireless network and device verification procedures can be time consuming, but it can drastically improve network uptime and performance. The initial cost of time, labor, and investment in professional tools will quickly be recouped through the elimination of support incidents and the resource drain and lost business that accompany them. As wireless networks grow more complex and client diversity explodes with the "i-Everything" and BYOD trends, investing time up-front in client testing and verification will reap dividends in improved network stability and should reduce support expenses in the long run. It will also make your network users (and management) happier!

Cheers,
Andrew



12 comments:

  1. Andrew,
    I notice that you suggest testing power save behavior but not QoS behavior for clients. I wonder if you don't regard this kind of testing, specifically for those types of devices that suffer from problems with WLAN latency or jitter such as voice devices, as being important or if you regard it as part of application testing? I would think this would warrant a different set of tests under the heading WLAN Latency Testing which is distinct from the others you have listed.
    Great article though; it's really good to see a well-thought-out testing methodology discussed, as I have seen far too many people approach WLANs as 'throw it up and see what happens' rather than performing the needed verification and testing of the WLAN environment.

  2. Chris,
    You're absolutely right, client QoS behavior should be tested. How could I have missed that?! I will update the post to include it.

    Thanks,
    Andrew

  3. Hi Andrew,

    What is the best way to install the Cisco 1130AG access point? The Cisco guide doesn't say whether the antenna patterns are shown for a horizontal ceiling mount or a vertical wall mount, and people are arguing about which orientation the patterns represent.

  4. So should I understand that azimuth is the wall-mounted and elevation is the ceiling-mounted signal pattern?

  5. And what would you recommend for an office with a 3 meter ceiling height?

  6. To understand azimuth (horizontal plane) and elevation (vertical plane) diagrams see the following links:

    Cisco Antenna Patterns and Their Meaning

    CWNA Wireless Book

    Cheers,
    Andrew

  7. Very useful, thanks

  8. The 1130 antenna pattern documentation doesn't mention which direction is X, Y, or Z. As I understand it, Y is forward, and Y is mostly shown at 0 degrees in the guide (http://www.cisco.com/en/US/prod/collateral/wireless/ps7183/ps469/prod_white_paper0900aecd806a1a3e.html). Looking at the 1130 pattern, it beams slightly between 0 and 60 degrees (as I understand it, tilting down about 45 degrees), and most of the pattern is directed toward the back side (almost 180 degrees). Is that right?

  9. Azimuth and Elevation diagrams describe antenna patterns, and typically assume that the antenna is mounted (or measured) in the orientation in which it will be used.

    For the 1130 series AP, that means ceiling mounted. Hence, the azimuth diagram is a "top-down" view showing an omni-directional coverage pattern. The elevation diagram is a "side-view" and shows the highest antenna gain between 120-240 degrees; since the intended orientation is on the ceiling, this equates to coming out the front plate of the AP (where the logo is) and radiating down from the ceiling into the room (toward the floor).

    Hope this helps!
    Andrew

  10. What would be handy would be a process for automating as many of the 16 tests as possible. Some of them are implementation specific or subjective, but many of them could, with the right hooks into the device or the wireless infrastructure, be automated.

  11. Thank you for the excellent article. It's nice to see a clearly documented blog post containing all the test cases.

    My question is - if you conducted this test for a particular chipset model and driver version, would you still repeat the test for another device? Personally, I would repeat it, just to see if there is any device-specific behaviour.

    Regards,
    TacACK
