Outdoor wireless backbone networks, typically seen in cellular backhaul architectures, are well established. But they aren't enough: private 5G networks, the Industrial Internet of Things, and AI- and ML-enabled assembly lines and other machinery are demanding more and more indoor bandwidth. Indeed, there's a growing need for "indoor" backbone networks with multi-Gigabit capacity.
On the consumer/residential side, more evidence of the need for increased indoor bandwidth comes from the latest "Mobility Report" from the OEM Ericsson. More and more people are going completely wireless, and the concept of a "landline" is becoming as uncommon as a rotary-dial phone.
So, when Ericsson notes that the global number of high-bandwidth 5G subscribers is about to reach 1.6 billion, and that, as has been the case for years now, 80% of cellular connections originate or terminate indoors, the need for increased indoor connectivity becomes quite apparent.
At the core of the 5G standard are reduced latencies and much higher data rates. It's these higher data rates that result in poor indoor penetration from macro cells deployed on towers. To achieve higher data rates within a given channel structure (i.e., the same as 4G), designers had the option of using higher-order modulation or more complex MIMO arrays.
Both require higher-quality signals, which are more sensitive to the degradation that comes from penetrating exterior walls. In short, the higher capacities supported by 5G come with poorer propagation and hence poorer indoor penetration, and propagation versus data rate is now a core trade-off for all wireless systems.
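As a rough back-of-the-envelope illustration of that trade-off (a Python sketch; the wall-loss and SNR figures are assumed typical values, not numbers from the article), the Shannon bound shows how quickly the SNR requirement climbs with spectral efficiency, and how little margin one exterior wall leaves for a macro-cell signal:

import math

def required_snr_db(spectral_eff):
    """Minimum SNR (dB) for a given spectral efficiency (b/s/Hz), per the Shannon bound."""
    return 10 * math.log10(2 ** spectral_eff - 1)

# Illustrative per-stream spectral efficiencies, roughly 4G-like to 5G-like.
for eff in (4, 6, 8):
    print(f"{eff} b/s/Hz needs at least {required_snr_db(eff):5.1f} dB SNR")

# Assumed outdoor-to-indoor loss for a modern exterior wall (values in the
# 10-20 dB range are commonly cited; an assumption, not a measurement).
wall_loss_db = 15
facade_snr_db = 25  # assumed SNR just outside the building from a macro cell
indoor_snr_db = facade_snr_db - wall_loss_db
max_eff = math.log2(1 + 10 ** (indoor_snr_db / 10))
print(f"Indoor SNR after the wall: {indoor_snr_db} dB, "
      f"which caps spectral efficiency at about {max_eff:.1f} b/s/Hz")

The point is simply that the SNR headroom needed for 5G-class spectral efficiencies largely disappears after a single exterior wall, which is why carriers look to bring capacity indoors rather than push it through from outside.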
To address the need for better indoor cellular connectivity, carriers are looking at 5G mid-band TDD spectrum and indoor small cells as solutions for significantly improving network performance in these locations.
Deployments of small cells will make Gigabit-speed indoor backbone networks even more of a necessity, and wireless would be the most efficient way to link them. This is particularly true in conference centers, hotels and stadiums, where most, if not all, professional sports teams now offer fans a host of 5G-enabled amenities.
Recall that indoor networks were initially built on Ethernet or coax cable plants, deployed for both backbone (e.g., riser cable systems in tall buildings) and access purposes. Ethernet eventually won that "battle," and RJ-45 jacks became ubiquitous on wall outlets and computers. Then, in 1999, 802.11b Wi-Fi entered the market as an indoor access solution: essentially a way for users to connect to a location's backbone without wires. Indoor connectivity later evolved to include cellular, but again, for access purposes only.
Over the last 20-plus years, Wi-Fi technology and standards have improved to the point that relatively high performance and easy connectivity are a given. As an indication of that progress, today it's almost impossible to find a laptop with a built-in Ethernet port, another sign of the shift to all-wireless connectivity. Numerous devices now have nomadic and mobile capabilities, and ever-increasing access speeds have made wireless the preferred approach for all indoor connectivity.
But looking at indoor connectivity from a big-picture perspective, wireless has never been used for indoor backbones, for a few reasons:
Performance: A backbone must have much higher capacity than the sub-6 GHz access networks feeding it.
Spectrum: The best propagation and the ability to go through walls would require lower-frequency systems. But, as noted above, those lower bands are already being used for access.
Aesthetics: Because the lower frequencies are used for access, any wireless backbone approach would have to use the upper bands, even millimeter-wave bands. Until recently, that meant installing and aiming line-of-sight (LOS) indoor antennas and deploying unsightly radio equipment, which is a non-starter. Furthermore, strict LOS operation is too restrictive in a home or business environment; simply walking between a client and an 802.11ad AP broke the connection.
However, there is high-frequency bandwidth available that provides multiple Gigabits of capacity for modern indoor needs and now supports non-line-of-sight (NLOS) operation, allowing the equipment to be deployed discreetly and unobtrusively: the 60 GHz (V band) spectrum. With a whopping 14 GHz of spectrum, more than all the lower bands combined (i.e., sub-6 GHz up through 42 GHz), the 60 GHz band offers plenty of bandwidth to achieve multi-Gigabit rates.
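A quick sanity check makes the multi-Gigabit claim concrete (a sketch in Python; the channel width follows the 802.11ad/ay channelization, while the spectral-efficiency values are assumptions chosen only for illustration):

# One 60 GHz channel in the 802.11ad/ay channelization is 2.16 GHz wide.
channel_bw_hz = 2.16e9

# Even modest spectral efficiencies yield multi-Gigabit rates (assumed values).
for spectral_eff in (1.0, 2.0, 3.0):  # b/s/Hz
    rate_gbps = channel_bw_hz * spectral_eff / 1e9
    print(f"{spectral_eff} b/s/Hz over one 2.16 GHz channel ~ {rate_gbps:.1f} Gb/s")

# The 14 GHz of V-band spectrum holds six such channels.
print("2.16 GHz channels in 14 GHz:", int(14e9 // channel_bw_hz))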
That is enough bandwidth to match the performance of CAT-6 or even CAT-8 Ethernet, and advances in beamforming and system gain enable NLOS operation. Beamforming involves precise phase shifting of the elements of an antenna array to generate a very narrow beam focused in a specific direction. The narrow beam greatly increases the gain seen by the intended receiver while reducing the interference seen by other devices in close proximity. Furthermore, the inherently short range of 60 GHz systems also contributes to the feasibility of NLOS connections: most indoor links are shorter than 100 m, typically 25 to 50 m.
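For a minimal sketch of how that works (the element count, spacing and link distances below are assumptions for illustration, not figures from the article), the per-element phase shifts that steer a uniform linear array, the resulting array-factor gain, and the free-space path loss at 60 GHz can be computed as follows:

import math

C = 3.0e8        # speed of light, m/s
FREQ = 60e9      # carrier frequency, Hz
LAM = C / FREQ   # wavelength, about 5 mm

def steering_phases_deg(n_elements, steer_angle_deg, spacing=LAM / 2):
    """Per-element phase shifts (degrees) that point a uniform linear
    array steer_angle_deg off boresight."""
    theta = math.radians(steer_angle_deg)
    return [(-360.0 * n * spacing * math.sin(theta) / LAM) % 360.0
            for n in range(n_elements)]

def array_gain_db(n_elements):
    """Approximate array-factor directivity gain over a single element."""
    return 10 * math.log10(n_elements)

def fspl_db(distance_m, freq_hz=FREQ):
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Example: a 32-element array (assumed size) steered 20 degrees off boresight.
print([round(p, 1) for p in steering_phases_deg(32, 20)[:4]])   # first few phases
print(f"Array gain: {array_gain_db(32):.1f} dB at each end of the link")
for d in (25, 50, 100):
    print(f"Free-space path loss at {d} m: {fspl_db(d):.0f} dB")

At 25 to 50 m the free-space loss is only on the order of 96 to 102 dB, so roughly 15 dB of array gain at each end of the link leaves margin to work with a reflected or partially obstructed path, which is what makes indoor NLOS operation at 60 GHz plausible.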
As a result, a whole new market has opened for services in the 60 GHz band, and wireless indoor backbones are now being deployed in locations such as automobile manufacturing plants, apartment complexes, and concert and conference halls. Based on the speed, convenience and lower cost of deploying and operating such systems, and on the performance results to date, it's realistic to posit that someday they could be as ubiquitous as indoor Wi-Fi and eventually make laying CAT-X cables a distant memory.
Source material: Vivek Ragavan, CEO of Airvine, EE Times, 02.16.2024:
https://www.eetimes.com/using-60-ghz-for-new-indoor-wireless-backbone-networks/