When it comes to broadband, speed is king. With broadband access speeds rapidly racing past 100 Mbps toward 1 Gbps, Wi-Fi challenges are surfacing like never before. This will be compounded even further by the evolution of PON and the emerging 10-Gigabit-capable XGS-PON and NG-PON2 technologies.
We talk a lot about speeds and feeds in the broadband access space: "Gigabit," for example, and more recently even "10 Gigabit" with the imminent evolution of PON. The fact is, however, that none of it matters if the Wi-Fi is terrible. Wi-Fi has become the preferred method of access to the network, and to consumers, Wi-Fi and the internet are one and the same, which is why they increasingly judge the quality of their internet connection by the quality of their Wi-Fi experience. I see this in my own home: when there is an internet outage, the children quickly announce, "Dad, the Wi-Fi is down."
A large part of the challenge around Wi-Fi lies in the fact that it leverages unlicensed spectrum. Anyone can transmit within regulatory limits, and there is limited spectrum available. This can cause reception problems. For example, in the 2.4 GHz Wi-Fi spectrum in the European Telecommunications Standards Institute (ETSI) regulatory domain, we have 13 channels. However, of those 13 channels, only three are non-overlapping, meaning they don't interfere with each other (channels 1, 6 and 11). Given that 2.4 GHz signals travel much further than 5 GHz signals, the 2.4 GHz spectrum is far more prone to interference and as such is often considered best effort.
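The three-non-overlapping-channels figure follows directly from the channel plan: 2.4 GHz channels sit on 5 MHz centers, while a transmission occupies roughly 22 MHz of spectrum. A minimal sketch of that arithmetic (assuming a 22 MHz channel width and the ETSI channel set 1-13):

```python
def center_mhz(channel):
    """Center frequency (MHz) of a 2.4 GHz Wi-Fi channel, 1-13 in the ETSI domain."""
    return 2407 + 5 * channel

def overlaps(ch_a, ch_b, width_mhz=22):
    """Two channels interfere when their centers are closer than the channel width."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

def non_overlapping_set(channels=range(1, 14)):
    """Greedily collect channels that don't overlap any already-chosen channel."""
    chosen = []
    for ch in channels:
        if all(not overlaps(ch, prev) for prev in chosen):
            chosen.append(ch)
    return chosen
```

Running `non_overlapping_set()` yields `[1, 6, 11]`, the familiar trio; any channels fewer than five apart partially overlap.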
The 5 GHz spectrum, considered primary thanks to its shorter range and lower interference, fares a bit better, with four non-overlapping, non-interfering channels to work with. Additional 5 GHz channels are available with support for Dynamic Frequency Selection (DFS); however, a DFS channel must be checked for radar before use and must be abandoned if radar is detected at any point while in use, which impacts services.
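The DFS rules described above amount to a small state machine: the channel must pass an availability check before the AP may transmit, and a radar event at any time forces the AP off the channel. A hypothetical sketch of that logic (state names and the class shape are illustrative, not from any standard API):

```python
from enum import Enum

class DfsState(Enum):
    CAC = "checking"          # Channel Availability Check before transmitting
    OPERATING = "operating"   # check passed; AP may transmit on the channel
    VACATED = "vacated"       # radar detected; channel must be abandoned

class DfsChannel:
    def __init__(self, channel):
        self.channel = channel
        self.state = DfsState.CAC  # every DFS channel starts in the check phase

    def cac_complete(self, radar_seen):
        # Radar during the availability check makes the channel unusable.
        self.state = DfsState.VACATED if radar_seen else DfsState.OPERATING

    def radar_event(self):
        # Radar while operating forces the AP off the channel mid-service,
        # which is the service impact noted above.
        self.state = DfsState.VACATED
```

The forced move off a vacated channel is what subscribers experience as a brief outage or a drop to a slower channel.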
Historically, service providers deployed Wi-Fi on a best-effort basis, often feeling that this aspect of the customer experience was something they could neither control nor be held responsible for. Alternatively, consumers deployed the Wi-Fi themselves with gear from their local electronics store. Operators have since begun to realize that taking an ostrich approach (burying their heads in the sand and hoping the problem goes away) results in a rapid increase in customer complaints and churn.
Regardless of whether the Wi-Fi equipment is provided by the service provider or not, the service provider receives the blame for the poor quality of experience and suffers the very real pain of subscriber churn. The only way forward is for service providers to pro-actively manage this challenge.
So what’s the answer? I often like to say you can’t fix what you can’t measure or monitor. That’s the first step. Service providers can leverage cloud-based management and monitoring tools to extend visibility into the home network and understand exactly what is happening. Measuring and monitoring, however, are not enough. The radio frequency environment is dynamic and ever-changing. The ability to measure signal strength, transmit rates, channel utilization, interference levels and the like is useful, but if the customer has to call in to report a problem, or human intervention is required to improve the user experience, operators can incur massive increases in support costs.
Luckily, advances in self-optimization, machine learning, data analytics and artificial intelligence are combining to deliver an automated approach to safeguarding the quality of the subscriber Wi-Fi experience. Service providers need to take advantage of these self-optimizing algorithms to resolve issues proactively, before the customer ever calls in and without manual intervention.
Radio Resource Management (RRM) or Self-Optimizing Networking (SON) technology can automatically change radio settings such as channel and transmit power to prevent interference from negatively impacting Wi-Fi quality. Dynamic steering technology can steer wireless clients to the best radio in the best access point to provide the best possible customer experience: for example, steering the client toward the cleaner, faster 5 GHz spectrum, or preventing it from sticking to a distant access point (AP) when a closer AP would provide better performance. Airtime fairness can also prevent older, slower clients from impacting newer, faster clients.
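As a rough illustration, a steering decision of this kind boils down to scoring each (AP, band) option a client could associate with, with a bias toward 5 GHz whenever its signal is adequate. Everything below (the thresholds, the scoring weights, the data shapes) is an assumption for illustration, not any vendor's actual algorithm:

```python
def best_target(options, min_5ghz_rssi=-70, band_bonus=20):
    """Pick the (AP, band) a client should be steered to.

    options: list of dicts like {"ap": "hallway", "band": "5GHz", "rssi": -62}
    RSSI is in dBm; values closer to 0 are stronger.
    """
    def score(opt):
        s = opt["rssi"]
        # Prefer the cleaner, faster 5 GHz band, but only when its signal
        # is strong enough to actually perform better than 2.4 GHz.
        if opt["band"] == "5GHz" and opt["rssi"] >= min_5ghz_rssi:
            s += band_bonus
        return s
    return max(options, key=score)
```

A client in the hallway seeing `-55 dBm` on 2.4 GHz and `-62 dBm` on 5 GHz would be steered to 5 GHz, while a weak `-80 dBm` 5 GHz option on a distant bedroom AP would lose out.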
Given the prominent role that Wi-Fi assumes in the end user’s overall experience, it is also frequently and incorrectly blamed for performance issues that emanate from other causes. Active monitoring of non-Wi-Fi parameters such as CPU, memory and flash utilization, protocols like DHCP and DNS and their associated response times, and speed tests that distinguish between Wi-Fi and broadband throughput can all help to quickly determine whether the problem is Wi-Fi-related or lies elsewhere.
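A triage step like this can be sketched as a simple rule chain over the measured values. The metric names and thresholds below are invented for illustration; a real deployment would calibrate them against fleet-wide data:

```python
def diagnose(m):
    """Classify a complaint as Wi-Fi-related or not from gateway metrics.

    m: dict of measurements, e.g.
       {"cpu_pct": 35, "dns_ms": 40, "dhcp_ms": 30,
        "wan_mbps": 480, "wifi_mbps": 45, "plan_mbps": 500}
    """
    if m["cpu_pct"] > 90:
        return "gateway overloaded (not Wi-Fi)"
    if m["dns_ms"] > 500 or m["dhcp_ms"] > 1000:
        return "slow DNS/DHCP (not Wi-Fi)"
    # A healthy WAN-side speed test with Wi-Fi throughput far below it
    # points at the in-home wireless link.
    if m["wan_mbps"] >= 0.8 * m["plan_mbps"] and m["wifi_mbps"] < 0.5 * m["wan_mbps"]:
        return "Wi-Fi problem"
    if m["wan_mbps"] < 0.8 * m["plan_mbps"]:
        return "broadband problem (not Wi-Fi)"
    return "no obvious fault"
```

The key design point is ordering: ruling out the non-Wi-Fi causes first prevents the wireless link from absorbing blame it doesn't deserve.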
The above is only the tip of the iceberg when it comes to the potential of these insights. Knowing which customers are exhausting their current broadband capacity opens the door for targeted up-selling. Understanding which customers are receiving more capacity than their contract specifies reduces revenue leakage, and identifying which customers cannot achieve what they paid for lets operators proactively intervene to minimize churn.
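As a simple illustration of the analytics angle, the three customer groups just mentioned can be segmented straight from per-subscriber usage records. The field names and thresholds here are hypothetical:

```python
def segment(sub, near_limit=0.9):
    """Bucket a subscriber from usage statistics.

    sub: {"plan_mbps": 100, "peak_mbps": 95, "achieved_mbps": 92}
      peak_mbps     - busiest observed demand
      achieved_mbps - throughput the line actually delivers on a speed test
    """
    if sub["achieved_mbps"] > sub["plan_mbps"]:
        return "revenue leakage"    # receiving more than contracted
    if sub["achieved_mbps"] < 0.8 * sub["plan_mbps"]:
        return "churn risk"         # cannot achieve what they paid for
    if sub["peak_mbps"] >= near_limit * sub["plan_mbps"]:
        return "up-sell candidate"  # exhausting current capacity
    return "healthy"
```

The churn-risk bucket is checked before the up-sell bucket: a customer who can't reach their current speed tier should be fixed, not sold a faster one.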
The challenges I describe will be faced by every operator in the world as their service offerings approach and exceed 100 Mbps. The time to start planning for these scenarios is now, not after you have already invested heavily in marketing your new ultrafast service offerings, only to discover that disgruntled customers are undermining your efforts by complaining about poor experiences on social media. Therefore, it’s imperative to plan for the Wi-Fi experience you and your customers want, today.