Here at Ars, we’ve spent a lot of time covering how Wi-Fi works, which kits perform the best, and how upcoming standards will affect you. Today, we’re going to go a little more basic: we’re going to teach you how to figure out how many Wi-Fi access points (APs) you need, and where to put them.
These rules apply whether we’re talking about a single Wi-Fi router, a mesh kit like Eero, Plume, or Orbi, or a set of wire-backhauled access points like Ubiquiti’s UAP-AC line or TP-Link’s EAPs. Unfortunately, these “rules” are necessarily closer to “guidelines” as there are a lot of variables it’s impossible to fully account for from an armchair a few thousand miles away. But if you become familiar with these rules, you should at least walk away with a better practical understanding of what to expect—and not expect—from your Wi-Fi gear and how to get the most out of it.
Before we get started
Let’s go over one bit of radio-frequency (RF) theory before we get started on our ten rules—some of them will make much better sense if you understand how RF signal strength is measured and how it attenuates over distance and through obstacles.
The above graph gives us some simple free-space loss curves for Wi-Fi frequencies. The most important thing to understand here is what the units actually mean: dBm measures power relative to one milliwatt, on a logarithmic base-ten scale. For each 10dBm drop, the actual signal strength in milliwatts drops by a factor of ten. -10dBm is 0.1mW, -20dBm is 0.01mW, and so forth.
The logarithmic scale makes it possible to measure signal loss additively, rather than multiplicatively. Each doubling of distance drops the signal by 6dB, as we can clearly see when we look at the bold red 2.4GHz curve: at 1m distance, the signal is -40dBm; at 2m, it’s -46dBm; and at 4m, it’s down to -52dBm.
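Those curves are easy to reproduce yourself. Here’s a minimal sketch using the standard free-space path loss formula; the 0dBm transmit reference is an assumption chosen to match the chart (real radios report whatever their hardware actually does):

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB, for distance in meters and frequency in MHz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

def dbm_to_mw(dbm):
    """Convert dBm (log scale, relative to 1mW) back to raw milliwatts."""
    return 10 ** (dbm / 10)

tx_dbm = 0  # hypothetical reference level matching the chart's curves
for d in (1, 2, 4, 8):
    rx = tx_dbm - fspl_db(d, 2400)
    print(f"{d:2d}m: {rx:6.1f} dBm  ({dbm_to_mw(rx):.9f} mW)")
# each doubling of distance costs almost exactly 6dB
```

Note how the milliwatt column shrinks by a factor of ten for every 10dBm—which is exactly why the log scale is the only sane way to talk about these numbers.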
Walls and other obstructions—including but not limited to human bodies, cabinets and furniture, and appliances—will attenuate the signal further. A good rule of thumb is -3dB for each additional wall or other significant obstruction, which we’ll talk more about later. You can see additional curves plotted above in finer lines for the same distances including one or two additional walls (or other obstacles).
While you should ideally have signal levels no lower than -67dBm, you shouldn’t fret about trying to get them much higher than that—typically, there’s no real performance difference between a blazing-hot -40dBm and a considerably cooler -65dBm, as far apart as they may seem on a chart. There’s a lot more going on with Wi-Fi than just raw signal strength; as long as you exceed that minimum, it doesn’t really matter by how much.
In fact, too hot of a signal can be as much of a problem as too cold—many a forum user has complained for pages about low speed test results, until finally some wise head asks “did you put your device right next to the access point? Move it a meter or two away, and try again.” Sure enough, the “problem” resolves itself.
Rule 1: No more than two rooms and two walls
Our first rule for access point placement is no more than two rooms and two interior walls between access points and devices, if possible. This is a pretty fudge-y rule, because different rooms are shaped and sized differently, and different houses have different wall structures—but it’s a good starting point, and it will serve you well in typically sized houses and apartments with standard, reasonably modern sheetrock interior wall construction.
“Typically sized,” at least in most of the USA, means bedrooms about three or four meters per side and larger living areas up to five or six meters per side. If we take nine meters as the average linear distance covering “two rooms” in a straight line, and add in two interior walls at -3dB apiece, our RF loss curve shows us that 2.4GHz signals are doing fantastic at -65dBm. 5GHz, not so much—if we need a full nine meters and two full walls, we’re down to -72dBm at 5GHz. This is certainly enough to get a connection, but it’s not great. In real life, a device at -72dBm on 5GHz will likely see around the same raw throughput as one at -65dBm on 2.4GHz—but the technically slower 2.4GHz connection will tend to be more reliable and exhibit consistently lower latency.
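You can sanity-check those two numbers with the same free-space model plus the -3dB-per-wall rule of thumb. The 0dBm transmit reference and the 5500MHz stand-in for "5GHz" are assumptions chosen to line up with the chart:

```python
import math

def signal_dbm(distance_m, freq_mhz, walls, tx_dbm=0):
    """Estimated received level: free-space path loss plus -3dB per wall.
    tx_dbm=0 is a hypothetical reference matching the article's chart."""
    fspl = 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55
    return tx_dbm - fspl - 3 * walls

# nine meters and two interior walls, per the "two rooms, two walls" rule
print(f"2.4GHz: {signal_dbm(9, 2400, 2):.0f} dBm")  # about -65 dBm
print(f"5GHz:   {signal_dbm(9, 5500, 2):.0f} dBm")  # about -72 dBm
```

Both estimates land right on the figures quoted above—one comfortably over the -67dBm floor, the other under it.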
Of course, this all assumes that distance and attenuation are the only problems we face. Rural users—and suburban users with large yards—will likely have already noticed this difference and internalized the rule-of-thumb “2.4GHz is great, but man, 5GHz sucks.” Urban users—or suburban folks in housing developments with postage-stamp yards—tend to have a different experience entirely, which we’ll cover in Rule 2.
Rule 2: Too much transmit power is a bug
The great thing about 2.4GHz Wi-Fi is the long range and effective penetration. The bad thing about 2.4GHz Wi-Fi is… the long range and effective penetration.
If two Wi-Fi devices within “earshot” of one another transmit on the same frequency at the same time, they accomplish nothing: the devices they were transmitting to have no way of unscrambling the signal and figuring out which bits were meant for them. Contrary to popular belief, this has nothing to do with whether a device is on your network or not—Wi-Fi network name and even password have no bearing here.
In order to (mostly) avoid this problem, any Wi-Fi device has to listen before transmitting—and if any other device is currently transmitting on the same frequency range, yours has to shut up and wait for it to finish. This still doesn’t entirely alleviate the problem; if two devices both decide to transmit simultaneously, they’ll “collide”—and each has to pick a random amount of time to back off and wait before trying to transmit again. The device that picks the lower random number gets to go first—unless they both picked the same random number, or some other device notices the clean air and decides to transmit before either of them.
This is called “congestion,” and for most modern Wi-Fi users, it’s at least as big a problem as attenuation. The more devices you have, the more congested your network is. And if they’re using the same Wi-Fi channel, the more devices your neighbors have, the more congested both of your networks are—devices on separate networks still contend with one another for airtime, and still have to respect the same listen-before-transmit rules.
If your own router or access points support it, turning your transmission strength down can actually improve performance and roaming significantly—especially if you’ve got a mesh kit or other multiple-AP setup. 5GHz typically doesn’t need to be detuned this way, since that spectrum already attenuates pretty rapidly—but it can work wonders for 2.4GHz.
A final note for those tempted to try “long-range” access points: a long-range AP can certainly pump its own signal hotter than a typical AP, and blast that signal a greater distance. But what it can’t do is make your phone or laptop boost its signal to match. With this kind of imbalanced connection scenario, individual pieces of a website might load rapidly—but the whole experience feels “glitchy,” because your phone or laptop struggles to upload the tens or hundreds of individual HTTP/S requests necessary to load each single webpage in the first place.
Rule 3: Use spectrum wisely
In Rule 2, we covered the fact that any device on the same channel competes with your devices for airtime, whether on your network or not. Most people won’t have good enough relationships with their neighbors to convince them to turn their transmission strength down—if their router even supports that feature—but you can, hopefully, figure out what channels neighboring networks use and avoid them.
This is usually not going to be an issue with 5GHz, but for 2.4GHz it can be a pretty big deal. For that reason, we recommend that most people avoid 2.4GHz as much as possible. Where you can’t avoid it, though, use an app like inSSIDer to take a look at your RF environment every now and then, and try to avoid re-using the busiest spectrum as seen in your house.
This is, unfortunately, trickier than it looks—it doesn’t necessarily matter how many SSIDs you can see on a given channel; what matters is how much actual airtime is in use, and you can’t get that from either SSID count or raw signal strength in the visible SSIDs. InSSIDer lets you go a step further, and look at the actual airtime utilization on each channel.
In the above inSSIDer chart, the whole 2.4GHz spectrum is pretty much useless. Don’t get excited by those “empty” channels 2-5 and 7-10, by the way: 2.4GHz Wi-Fi gear defaults to 20MHz bandwidth, which means a network actually occupies a five-channel span centered on its nominal channel, not one channel. Networks on “Channel 1” actually extend from a hypothetical “Channel negative one” through Channel 3. Networks on Channel 6 really extend from Channel 4 through Channel 8, and networks set to Channel 11 actually occupy Channel 9 through Channel 13.
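The channel arithmetic is easy to check yourself: 2.4GHz channel centers sit 5MHz apart starting at 2412MHz (channel 1), and a 20MHz network occupies ±10MHz around its center. A quick sketch (the helper names are our own):

```python
def span_24ghz(channel, width_mhz=20):
    """Occupied frequency range in MHz for a 2.4GHz channel.
    Channel centers sit at 2407 + 5*channel MHz."""
    center = 2407 + 5 * channel
    return (center - width_mhz // 2, center + width_mhz // 2)

def overlaps(a, b):
    """True if 20MHz networks on channels a and b share any spectrum."""
    (lo1, hi1), (lo2, hi2) = span_24ghz(a), span_24ghz(b)
    return lo1 < hi2 and lo2 < hi1

print(span_24ghz(1))    # (2402, 2422): reaches down past "channel -1"
print(overlaps(1, 6))   # False: the classic non-overlapping trio is 1/6/11
print(overlaps(1, 4))   # True: a network on channel 4 steps on both neighbors
```

This is why a network parked on channel 3 or 8 is worse than useless—it congests with networks on *two* of the standard channels at once.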
Congestion is a much smaller issue with 5GHz networks, because the much lower range and penetration means fewer devices to congest with. You’ll frequently hear claims that there are also more 5GHz channels to work with, but in practice that isn’t really true unless you’re engineering Wi-Fi for an enterprise campus with no competing networks. Residential 5GHz Wi-Fi routers and access points are generally configured for either 40MHz or 80MHz bandwidth, which means there are effectively only two non-overlapping channels: the low band, consisting of channels 36-64, and the high band, consisting of channels 149-165.
We fully expect to see a bunch of contention over this in the comments: technically, you can fit four 40MHz wide networks or two 80MHz wide networks on the lower 5GHz band. Practically, consumer gear tends to be extremely sloppy about using overlapping channels (eg, an 80MHz channel centered on 48 or 52), making it difficult or impossible to actually pull off that degree of efficient spectrum use in realistic residential settings.
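The “technically four 40MHz or two 80MHz networks” claim for the low band is straightforward to see if you group the aligned 20MHz channel numbers into wider blocks. A sketch under that assumption (helper name is our own; this models the clean, non-overlapping case that consumer gear so often fails to honor):

```python
# 20MHz channel numbers in the two standard US consumer 5GHz bands
LOW_BAND = [36, 40, 44, 48, 52, 56, 60, 64]
HIGH_BAND = [149, 153, 157, 161, 165]

def blocks(channels, width_mhz):
    """Group aligned 20MHz channels into wider non-overlapping blocks."""
    per_block = width_mhz // 20
    return [channels[i:i + per_block]
            for i in range(0, len(channels) - per_block + 1, per_block)]

print(blocks(LOW_BAND, 40))   # four clean 40MHz channels
print(blocks(LOW_BAND, 80))   # [[36, 40, 44, 48], [52, 56, 60, 64]]
print(blocks(HIGH_BAND, 80))  # [[149, 153, 157, 161]]: 165 is left over
```

An 80MHz network “centered on 48 or 52,” as in the sloppy-gear example above, straddles the boundary between those two clean blocks—and thereby congests with both.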
There are also DFS (Dynamic Frequency Selection) channels in between the two standard US consumer bands, but those must be shared with devices such as commercial and military radar systems. Many consumer devices refuse to even attempt to use DFS channels. Even if you have a router or access point willing to use DFS spectrum, it must adhere to stringent requirements to avoid interfering with any detected radar systems. Users “in the middle of nowhere” may be able to use DFS frequencies to great effect—but those users are less likely to have congestion problems in the first place.
If you live near an airport, military base, or coastal docking facility, DFS spectrum is likely not going to be a good fit for you—and if you live outside the US, your exact spectrum availability (both DFS and non-DFS) will be somewhat different than what’s pictured here, depending on your local government’s regulations.
Rule 4: Central placement is best
Moving back to the “attenuation” side of things, the ideal place to put any Wi-Fi access point is in the center of the space it needs to cover. If you’ve got a living space that’s 30 meters end-to-end, a router in the middle only needs to cover 15m on each side, whereas one on the far end (where ISP installers like to drop the coax or DSL line) would need to cover the full 30m.
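Plugging that 30-meter space into the same free-space model from earlier shows why the center spot wins; as before, the 0dBm transmit reference is a hypothetical chosen to match the chart, and real walls will only make the end-of-house placement look worse:

```python
import math

def rx_dbm(distance_m, freq_mhz=2400, tx_dbm=0):
    """Received level under free-space loss (0dBm reference is hypothetical)."""
    fspl = 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55
    return tx_dbm - fspl

# 30m living space: router in the center vs. router at one end
print(f"center placement, worst case 15m: {rx_dbm(15):.1f} dBm")  # above -67
print(f"end placement, worst case 30m:    {rx_dbm(30):.1f} dBm")  # below -67
```

Even with zero walls in the way, the end-of-house router drops below the -67dBm floor at the far side of the space, while the centered one never does.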
This also applies in smaller spaces with more access points. Remember, Wi-Fi signals attenuate fast. Six meters—the full distance across a single, reasonably large living room—can be enough to attenuate a 5GHz signal below the optimal level, if you include a couple of obstacles such as furniture or human bodies along the way. Which leads us into our next rule…
Source: Ars Technica | By Jim Salter | February 23, 2020 | https://arstechnica.com/gadgets/2020/02/the-ars-technica-semi-scientific-guide-to-wi-fi-access-point-placement/