The necessity of testing 802.11-based networks

May 1, 2009
The question is: Does 802.11n change the game? The answer: Absolutely.

Cabling installers who have grimaced at the complex process of fully testing Category 6A systems can take heart in this: the alien-crosstalk testing scenario has nothing on the tests and measurements involved in commissioning a WLAN.

Professionals who have installed WLAN systems can vouch for the complexity and time required to assure, to the extent possible, that those systems will function as desired. But like the transport media themselves, testing wired and wireless infrastructure environments is a study in contrasts.

The YellowJacket Tablet from Berkeley Varitronics Systems is a spectrum analyzer that is dubbed a “BANG” device because it is compatible with the following four 802.11 protocols: a, b, g, and n.

“A major difference between the wired and wireless worlds is that there are no standards for wireless testing,” says Carolyn Carter, wireless marketing manager for Fluke Networks (www.flukenetworks.com). “Integrators who have installed wireless systems for a long time have established rules of thumb, such as if the signal strength is at a certain level, you're OK. But there are several big questions: What do you test? When do you test it? Everybody does it slightly differently and everyone seems to have their own rules of thumb; some agree and some do not.”

Applying rules of thumb

Carter adds that while there is no formal consensus concerning the tests that should be performed or the results that would be considered acceptable, the issue of interference is an overriding concern: “You can design and deploy the best wireless network in the world, but if you have interference issues, the network may not run properly. An interference issue can wipe out a whole network and it could be something as simple as a microwave oven.”

Scott Schoeber, president and chief executive officer of Berkeley Varitronics Systems (BVS; www.bvsystems.com), concurs: “The variability or uncertainty of a wireless network's performance has to do with the nature of where the installation is taking place. In some environments, if only a few access points are being installed, interference is not a big problem. But issues do come up often when a larger number of access points—say 6 to 10 or more—are being installed.”

Schoeber adds, “There's a lot of potential for radio-frequency interference. Security cameras have interference potential; Bluetooth is a problem in some areas. Baby monitors can cause interference. The problem has grown in certain environments” that are populated with RF interferers.

Wade Williamson, director of product management for AirMagnet (www.airmagnet.com), makes it unanimous that interference is a significant concern. “In a high-rise, for example, if several wireless networks exist, you have to be mindful of your neighbors,” says Williamson. “It forces the user to look at things more than purely topologically in terms of network deployment. You have to take out the map of your building and ask how it looks in three-dimensional space.”

In addition to interference, the dynamic nature of a WLAN makes it “probably the most confounding issue to people coming in from a wired background,” Williamson adds. “We spend quite a bit of time working on it. The wireless LAN will not stay the same. WiFi tends to vary quite a bit; it worked fine yesterday, but today it isn't, for some reason. It's due to the fact that the people you're serving are moving around all the time. Your neighbors are as well.”

Williamson continues, “Interference sources can do the same. Even infrastructure vendors are building in tools that allow channel changing or balancing in cases of overloading. Being able to see what has changed and what has happened in terms of an overall ‘Doppler radar' type of view helps to determine if it's something that needs manual correction.”

Pre- and post-installation

In the wired world, system testing follows the infrastructure's installation. With wireless, it is folly to wait until the access points are in place. “You have to conduct a pre-audit of the airspace using a spectrum analyzer,” says BVS' Schoeber. The company's YellowJacket product is one of several analyzer products it offers for this purpose.

“Look at potential interferers and analyze the waveform to get a feel for how well or how poorly your system may perform,” says Schoeber. “Once you have done the audit, put some access points in place and compare them to your neighbors'. Some users find they pick up their neighbors' access points better than they pick up their own.”
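
The pre-audit Schoeber describes amounts to cataloging what is already on the air before your own access points go in. Below is a minimal sketch of that step, assuming scan results have been exported to a CSV file; the file name, column names, and SSID list are placeholders for illustration, not details of any BVS product.

```python
# Minimal sketch: tally detected access points per channel from an exported
# scan file. The CSV layout (ssid, bssid, channel, rssi_dbm), file name, and
# OUR_SSIDS set are assumptions made for this example.
import csv
from collections import defaultdict

OUR_SSIDS = {"corp-wlan"}           # SSIDs belonging to your own deployment
SCAN_FILE = "site_preaudit.csv"     # export from a scanner or spectrum analyzer

def summarize(scan_file=SCAN_FILE):
    per_channel = defaultdict(list)
    with open(scan_file, newline="") as f:
        for row in csv.DictReader(f):
            per_channel[int(row["channel"])].append(row)

    for channel in sorted(per_channel):
        rows = per_channel[channel]
        ours = [r for r in rows if r["ssid"] in OUR_SSIDS]
        theirs = [r for r in rows if r["ssid"] not in OUR_SSIDS]
        strongest = max((int(r["rssi_dbm"]) for r in theirs), default=None)
        note = f", strongest neighbor {strongest} dBm" if strongest is not None else ""
        print(f"channel {channel:>3}: {len(ours)} ours, {len(theirs)} neighbors{note}")

if __name__ == "__main__":
    summarize()
```

Comparing your own per-channel counts and levels against the neighbors' makes Schoeber's observation about picking up the neighbors' access points better than your own visible at a glance.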

Fluke Networks' Carter recommends users go through a checklist of questions before mapping out their WLAN:

  • What type of deployment are you looking for?
  • How many users?
  • What type of applications?
  • Is it a network of convenience or a mission-critical network?

“When it's not mission-critical, users probably won't conduct a full site survey or a very thorough deployment design,” Carter explains. “They'll use the more generally accepted customs. In most areas, you can get away with that if you're surfing the Web and reading e-mail, not running mission-critical applications.”

And if they are planning to run mission-critical traffic over their WLANs? “Typically,” says Carter, “they'll use the infrastructure vendor's software or full-fledged site-survey software.” Such packages allow users to import a map of the building, including characteristics such as walls, staircases, elevators, and other structural spaces. Users can then indicate the type of access points they plan to use, including antenna type and other information. The software places the access points and comes up with a predictive-model design.
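
The vendors do not spell out how a predictive model arrives at its design, but the basic idea is to estimate path loss over distance and subtract an attenuation allowance for each wall the signal crosses. Here is a minimal sketch of that idea using the standard free-space path-loss formula; the wall-loss values and access-point parameters are rough assumptions for illustration, not figures from InterpretAir or any other planning package.

```python
# Minimal sketch of a predictive coverage estimate: free-space path loss plus a
# per-wall attenuation allowance. Wall losses and AP parameters are assumptions
# for illustration, not values from any vendor's planning tool.
import math

WALL_LOSS_DB = {"drywall": 3.0, "glass": 2.0, "brick": 10.0}  # assumed figures

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB, distance in metres, frequency in MHz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

def predicted_rssi_dbm(tx_power_dbm, antenna_gain_dbi, distance_m,
                       freq_mhz=2437.0, walls=()):
    """Estimate received signal strength at a point behind the listed walls."""
    loss = fspl_db(distance_m, freq_mhz) + sum(WALL_LOSS_DB[w] for w in walls)
    return tx_power_dbm + antenna_gain_dbi - loss

# Example: a 17-dBm AP with a 2-dBi antenna, client 20 m away on channel 6
# (2437 MHz), with one drywall partition and one brick wall in the path.
print(round(predicted_rssi_dbm(17, 2, 20, walls=("drywall", "brick")), 1), "dBm")
```

Planning packages layer far more sophistication on top of this, including floor plans, antenna patterns, and measured calibration, but the trade-off among distance, frequency, and building materials is the same.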

“Some users were initially nervous about using predictive-model software, but it has proven itself out,” Carter says.

“Do a survey”

Fluke Networks, AirMagnet, and other vendors offer predictive-model software packages. AirMagnet's Williamson explains, “We have two ways of approaching the planning-and-design side of wireless LAN. One is predictive, based on building materials like drywall, brick, glass, etc. We'll give an 85- to 90-percent performance look. But we recommend to anyone getting into WiFi: do a survey. If you have an area where signal strength looks fine but packets are getting dropped, you're going to have cranky users. We want to lay that kind of thing out and be able to tell you that you went from 50 Megabits to 5—show it in a spatial sense in addition to standard network metrics.”

Williamson is referring to his company's AirMagnet Survey product, which is meant to be used pre- and post-deployment. “It's helpful for optimizing the network post-deployment,” he explains. “Some users conduct quarterly surveys. The map feature shows changes between what the network's original performance was and what it is currently.”
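
Below is a minimal sketch of the kind of comparison Williamson describes, diffing a baseline survey snapshot against a current one location by location; the readings and the 6-dB degradation threshold are invented for illustration and are not AirMagnet Survey output.

```python
# Minimal sketch: compare a baseline survey snapshot against a current one and
# flag locations where signal strength degraded. Data points and the 6 dB
# threshold are illustrative assumptions.
baseline = {"lobby": -48, "conference-2F": -55, "warehouse-dock": -62}   # dBm
current  = {"lobby": -50, "conference-2F": -67, "warehouse-dock": -61}   # dBm

DEGRADATION_THRESHOLD_DB = 6

for location, then in baseline.items():
    now = current.get(location)
    if now is None:
        print(f"{location}: no current reading")
    elif then - now >= DEGRADATION_THRESHOLD_DB:
        print(f"{location}: degraded {then - now} dB ({then} -> {now} dBm)")
    else:
        print(f"{location}: within tolerance ({then} -> {now} dBm)")
```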

Fluke Networks offers the InterpretAir software package for similar functions. And Carter agrees with the value of pre- and post-deployment surveying, adding that testing in the midst of the installation is another piece of the puzzle: “Testing should occur after the equipment is installed but before it's turned on. There are multiple testing methods. Many use the signal-strength measurement and know that a reading of -55 dBm will support most applications. Many can rattle off that a certain signal-strength measurement will mean the network can handle X number of users performing certain applications.”
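
The rule of thumb Carter cites is simple enough to express directly: compare the reading at each test point against the -55 dBm target. The test points and readings below are assumed for illustration.

```python
# Minimal sketch of the signal-strength rule of thumb quoted above: flag any
# test point that falls below the -55 dBm target. Readings are assumed values.
SIGNAL_TARGET_DBM = -55

readings_dbm = {"reception": -47, "break room": -58, "loading bay": -71}

for point, rssi in readings_dbm.items():
    verdict = "OK" if rssi >= SIGNAL_TARGET_DBM else "below target"
    print(f"{point}: {rssi} dBm ({verdict})")
```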

Carter adds, “Some users are beginning to think about throughput measurements, but today, there is not a way to conduct such tests very well. That's where standards could help us.”
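
In the absence of a standard throughput test, an integrator might improvise something as crude as timing the transfer of a known file from a server on the wired side to a wireless client. The sketch below shows that kind of ad-hoc measurement; the URL is a placeholder on a documentation address range, not a real server.

```python
# Minimal sketch of an improvised throughput check: time the download of a
# known payload from a wired-side server to a wireless client. The URL is a
# placeholder (TEST-NET address) for a file hosted on your own LAN.
import time
import urllib.request

TEST_URL = "http://192.0.2.10/testfile-100MB.bin"  # placeholder, assumed host

def measure_throughput_mbps(url=TEST_URL, chunk_size=64 * 1024):
    start = time.monotonic()
    total_bytes = 0
    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return (total_bytes * 8) / (elapsed * 1_000_000)

if __name__ == "__main__":
    print(f"{measure_throughput_mbps():.1f} Mbits/sec")
```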

11n: a game-changer

To recap so far, it is clear that interference sources often present a major obstacle to WLAN transmission. Pre-, mid-, and post-installation test-and-measurement techniques are less clear-cut, given that no standards currently address them. But tools such as site-survey software packages and spectrum analyzers are generally accepted and widely used for this purpose.

One point on which the vendors interviewed for this article agree unanimously is that the advent of 802.11n wireless networking is changing the game for surveying and testing WLAN deployment spaces.

802.11n “has made this type of surveying and deployment almost mandatory,” says AirMagnet's Williamson. “802.11a, b, and g were fairly predictable in that unless you had some multipath or phasing issues, you could make estimations based on signal level and signal-to-noise ratio. In the 11n world, that has changed quite a bit. The performance jump from 50 to the theoretical 300 Mbits/sec did not come from a stronger signal, but rather from multiple-stream modulation.”

Williamson adds, “You don't know just by walking around passively or looking at a building map what your performance is going to look like. A lot of that plays out between the capability of the client and the 11n access point. Users can have a significant variance in terms of what the performance might be; it could be anywhere from sub-30 [Mbits/sec] to the high hundreds. That has required people to take this physical approach to a survey.”
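
Williamson's figures trace directly to the 802.11n PHY rate arithmetic: the data rate is the product of spatial streams, data subcarriers (set by channel width), bits per subcarrier, and coding rate, divided by the symbol time (set by the guard interval). The sketch below reproduces the two familiar endpoints: 65 Mbits/sec for a single stream in a 20-MHz channel, and the 300-Mbits/sec headline figure for two streams in a 40-MHz channel with the short guard interval.

```python
# Minimal sketch of where 802.11n's headline rates come from: data rate scales
# with spatial streams, channel width, and guard interval rather than signal
# strength. Follows the 802.11n PHY rate formula
#   rate = N_ss * N_sd * bits_per_subcarrier * coding_rate / symbol_time.
DATA_SUBCARRIERS = {20: 52, 40: 108}          # per 802.11n channel width (MHz)
SYMBOL_TIME_US = {"long": 4.0, "short": 3.6}  # 800 ns vs 400 ns guard interval

def dot11n_rate_mbps(spatial_streams, width_mhz=20, guard="long",
                     bits_per_subcarrier=6, coding_rate=5/6):
    """PHY rate in Mbits/sec; defaults model 64-QAM, rate-5/6 coding per stream."""
    bits_per_symbol = (spatial_streams * DATA_SUBCARRIERS[width_mhz]
                       * bits_per_subcarrier * coding_rate)
    return bits_per_symbol / SYMBOL_TIME_US[guard]

print(dot11n_rate_mbps(1))                               # 65.0, legacy-like rate
print(dot11n_rate_mbps(2, width_mhz=40, guard="short"))  # 300.0, headline figure
```

Whether a given client actually negotiates those top rates depends on its own stream count and channel-width support, which is exactly the variance Williamson describes.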

According to Fluke Networks' Carter, 11n is emboldening some enterprises: “Users are considering running mission-critical applications over 11n. Some are said to be planning wireless-only facilities. When you go mission-critical, your wireless network has to be reliable all the time. You cannot simply do a one-for-one replacement of a/b/g devices with ‘n' devices. You really have to redesign.”

Williamson adds that an analysis tool is something of a flagship in 11n deployments: “It's the tool to use when you are auditing the network. It looks at wireless traffic in real time and is essentially a very intelligent wireless sniffer. The tool will state the client's capabilities, what's happening in real time, and how to change settings on access points.”

More complex measurements

As for how widespread 11n is now and may become, BVS' Schoeber believes, “Not everyone's there yet. It really depends on the application and need. If the user needs security and speed, ‘n' makes sense. It also makes sense for applications like video and music. It comes at a cost, of course, so the user has to make a decision. And the price point is starting to become more attractive. 11n requires more measurements to be made, and those measurements are more complex.”

Schoeber adds, “It looks like 2010 will be the year when the 802.11n standard will be officially ratified. But many manufacturers are using existing chipsets and selling equipment. It has become the de facto standard.”

He concludes, “Once the specification is ratified, it will be a motivation for users to deploy mission-critical applications over it. Certain environments will have only ‘n' devices, and that will require pre-installation study as well as monitoring. There is and will be a great deal of emphasis on 802.11n for enterprises.”

PATRICK McLAUGHLIN is chief editor of Cabling Installation & Maintenance.
