Different handheld mobile devices show a wide range of behaviors on Wi-Fi networks. Differences can be traced to assorted radio chipsets, the way drivers are written, the frequencies supported and the number of antennas used, to name just a few variables.
VeriWave, a wireless test vendor, has been collecting Wi-Fi feedback from enterprise customers using its WaveDeploy application, introduced last year. WaveDeploy measures a range of metrics intended to show the quality of the end user's wireless experience, not just raw signal strength.
One WaveDeploy customer, a hospital, ran a series of tests on one floor of a new wing with an 802.11n WLAN deployed. A wheeled cart was loaded with two laptops, from Dell and HP; three tablets (Apple iPad, iPad 2 and Motorola Xoom); and two iPhone 4s, one each from AT&T and Verizon Wireless. The hospital wanted 10Mbps throughput for the laptops and 7Mbps for the handhelds.
The results, which are specific to this particular WLAN, were dramatically different, as the linked color-coded heat maps show. Green indicates the device met the target throughput; the deeper the shade of red, the further it fell from that target. The blue numbers simply mark where each test was run.
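The bucketing behind such a heat map is straightforward. As a rough illustration (not VeriWave's actual algorithm; the function name and thresholds here are assumptions), each location's measured throughput can be compared against the target and mapped to a shade:

```python
# Hypothetical sketch of heat-map bucketing: compare measured throughput
# at a test location against the target and return a coarse color.
# Thresholds are illustrative, not WaveDeploy's actual values.

def heatmap_color(measured_mbps: float, target_mbps: float) -> str:
    """Map one throughput reading to a heat-map shade."""
    if measured_mbps >= target_mbps * 0.9:  # within 10% of target
        return "green"
    if measured_mbps / target_mbps >= 0.5:  # well short of target
        return "light red"
    return "deep red"                       # far below target

# The handheld target in the hospital test was 7Mbps:
readings = {"iPad": 1.65, "iPad 2": 7.1, "AT&T iPhone 4": 2.8}
shades = {dev: heatmap_color(mbps, 7.0) for dev, mbps in readings.items()}
```

On these numbers the iPad 2 lands in green while the original iPad and the AT&T iPhone 4 fall deep into the red, matching the pattern the hospital saw.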
The HP laptop performed notably better than the Dell, showing green (9.0Mbps to 10.0Mbps) in nearly all of the test locations. By comparison, the mobile handhelds consistently had much lower throughput.
The iPad was consistently under 2Mbps, often around 1.65Mbps. The Motorola tablet was somewhat better in some locations.
The iPad 2 was much more successful, more often achieving the target 7Mbps.
The iPhone 4 models showed startlingly different behaviors. In nearly one-third of test locations, the AT&T iPhone lost its Wi-Fi connection and didn't reconnect. In only a couple of places did it come close to 4Mbps; for the most part, it was under 3Mbps.
The Verizon iPhone 4 maintained its connection in all locations save one, but achieved only slightly better throughput.
"Most wireless LAN designs are for [optimal] RF signal coverage," says Eran Karoly, VeriWave's vice president of marketing. "But RF power doesn't necessarily correlate with areas of poor coverage. You need to look at the actual throughput seen by the user with that particular device."
These kinds of test results can be used to set IT and end-user expectations, and service-level agreements, for Wi-Fi performance, and to identify areas where throughput can be improved by moving or adding access points. Device types or models can also be steered to a particular frequency band, or to specific SSIDs with different quality-of-service settings.
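That segregation idea amounts to a policy table: each device class gets a band, an SSID, and a QoS profile. A minimal sketch, assuming entirely hypothetical SSID names and profiles (real steering would be configured on the WLAN controller, not in application code):

```python
# Illustrative policy table for steering device classes to bands/SSIDs
# with different QoS profiles. All names here are hypothetical examples,
# not drawn from any specific vendor's configuration.

POLICY = {
    "laptop": {"band": "5GHz",   "ssid": "corp-5g",  "qos": "best_effort"},
    "tablet": {"band": "5GHz",   "ssid": "byod-5g",  "qos": "video"},
    "phone":  {"band": "2.4GHz", "ssid": "byod-24g", "qos": "voice"},
}

def assign(device_class: str) -> dict:
    """Look up the band/SSID/QoS assignment for a device class,
    falling back to the phone profile for unknown classes."""
    return POLICY.get(device_class, POLICY["phone"])
```

The point of the table is the one Karoly makes below: performance policy has to be expressed per client class, not just per access point.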
"It's not all about the network [infrastructure]," Karoly says. "It's about the network plus the client. Clients affect performance and you have to measure performance on a client-by-client basis." [see "Tips for navigating the evolving wireless LAN landscape"]