2/16/2019: 5G Wireless Standard

This is completely asinine.

Recently, John Donovan, AT&T CEO extraordinaire, struggled to explain his company’s technology claims, insisting that his competitors are just jealous because they aren’t pioneering 5G standards the way AT&T is. What’s true is that AT&T is trying to improve the things people care about in their mobile phone service – speed, reliability, reduced latency – those things. But what Verizon and T-Mobile are bitching about is that AT&T is branding those advancements as true 5G when, in reality, they aren’t standards-based at all. What’s confusing is that AT&T is developing what could be called a 5th-generation cellular service – it’s actually being termed 5G Evolution, in fact – but it does not implement the standards put forth by the 3G-double-P. At the end of the day, it’s really just an enhancement of 4G. (By the way, the 3G-double-P is the 3rd Generation Partnership Project, a global consortium of telecommunications standards organizations.)

AT&T is heavily marketing their 5G Evolution service to their mobile phone subscribers. Now, on a side note, Verizon is developing their own 5th-generation wireless services geared toward home networking and things like IoT devices, which we’ll talk about more shortly. These advancements aren’t completely standards-based either, so one could call Verizon a bunch of hypocrites for ripping on AT&T. But in their defense, they’ve been smart enough to provide a migration path to the official 5G wireless standards once they’re ready for commercial adoption. (By the way, I think it’s important that people understand the value of standards in the realm of technology. When you don’t adhere to standards, risks increase and operations fail.) Standards, people. Standards.

Although I haven’t seen it personally, a person can supposedly buy a 5G device and get onto AT&T’s 5G network in 12 cities across the US, with an additional 7 cities coming soon. I know there have been sporadic outages to 4G cellular service recently in cities like Chicago, where AT&T screwed up a tower configuration and everyone with an AT&T phone lost cellular access for several hours. Outages like that will matter even more going forward, because the next generation of 5G services – whether we’re talking about AT&T’s fake 5G or the official 5G standard – will be anchored in 4G frequencies until technology catches up with the vision.

Ultimately, this discussion starts with the 3G-double-P. 5G was developed by the 3rd Generation Partnership Project, which is essentially a big-time group of standards organizations operating across the globe. The organization was officially established in December 1998 and has been responsible for the standards behind the wireless technologies we’ve used over the last two decades. These guys are seriously badass. They should make a movie about them.

Before we move any further, perhaps we need to get back to basics and explain how wireless cellular technologies have evolved over the years. It began with 1G, a voice-only service capable of 2.4 Kbps. Those were the days of car phones, twelve-pound mobile handsets with large telescoping antennas, and Don Johnson sporting pink flamingo shirts near the beach in Miami. 2G came next in the 90s, using GSM and CDMA standards, and brought SMS text functionality. That’s when the days of 3-way calling and eight-hour phone conversations began to die. Good times. Then, who else but AT&T created 2G evolution, otherwise known as 2.5G, 2.75G, or EDGE, which of course didn’t adhere to any industry standards, because AT&T is too good for that shit. This is when your tiny flip phone was finally able to load sports scores, if you were patient enough to wait ten minutes for 33 barely readable ASCII characters to render on your display. But hey, phones were finally receiving data. It was totally rad. Then came 3G and the age of modernity circa 1998, with speeds up to 2 Mbps for stationary devices and mobile phones beginning to resemble what we know them to be today. (By the way, to put things into perspective, most of us didn’t get our first 3G-capable phone until several years later. These technologies always hit the market a few years after a new standard is established.) Then came a few more interim releases, and 4G hit around 2008 – and it still hasn’t been fully realized in rural areas across the United States. Anyway, it seems like a general rule that you’ll see a new standard every ten years, and 5G is now supposedly right upon us. If the trend continues, maybe we can expect 6G in 2028, which I assume is the standard that brings us teleportation capabilities. In Star Trek terms, that’s about right, considering James Tiberius Kirk was born on March 22, 2233.

5G NR, or New Radio, is a new air interface specification developed by the 3GPP for 5G mobile networks. The study on 5G NR began in 2015, with the first official specification released at the end of 2017. The study ultimately identified 70 different use-cases for 5G wireless cellular technologies, organized into four main groups that pretty much summarize where this is going. The one consumers are probably anticipating most is the enhancement of mobile broadband services: higher data rates, greater coverage, and more user mobility than ever before. Another focuses on reducing latency and improving reliability and availability for industrial control applications and tactile internet services. Then there is IoT – the massive Internet of Things… the networked security toasters, if you will. And rounding it out were use-cases for enhancements to legacy wireless standards, because essentially, 5G and 4G are meant to form a symbiotic relationship in the short term.

Until recently, all 3GPP specifications have pertained to licensed spectrum, which has inhibited commercial acceptance of the standard, but that is all about to change. I mentioned 5G and 4G enjoying a symbiotic relationship, so to speak. Here’s why. The 3GPP Release 16 specification, expected at the end of this year, will add support for unlicensed frequency bands in a way that remains compatible with 4G infrastructure. This is called non-standalone licensed-assisted access NR-U, and you can consider it phase one of the 5G rollout. The goal here is to boost existing deployments and offer a better user experience with higher speeds by aggregating unlicensed spectrum with licensed spectrum. An anchor, if you will. It’s symbiotic from a 4G perspective because the 3GPP has mandated improvements to existing infrastructure to make this possible. This phase does not include millimeter-wave technologies, which actually show up later in what’s termed stand-alone NR-U – the expansion of cellular technologies into unlicensed spectrum without an anchor, to be exploited by use-cases like local private 5G networks servicing industrial IoT or enterprise mobile broadband applications. You’ll inevitably see the emergence of neutral-host service providers in public venues like sports stadiums and malls once this hits the mainstream, but as we’ll find, a lot of obstacles and concerns stand in the way of adopting a full stand-alone NR-U specification.
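A crude mental model of that anchor idea: in non-standalone NR-U, a licensed carrier handles the control signaling, and unlicensed carriers get aggregated on top of it for extra throughput. Lose the anchor, lose everything. This little sketch is purely illustrative; the function name and throughput numbers are made up, not from any 3GPP spec.

```python
# Toy model of non-standalone licensed-assisted access (NR-U): a
# licensed "anchor" carrier carries the control signaling, and
# unlicensed carriers are aggregated on top of it for extra capacity.
# All names and numbers here are illustrative assumptions.

def aggregated_throughput(anchor_mbps, unlicensed_mbps):
    """Total throughput with licensed-assisted aggregation (toy model)."""
    if anchor_mbps <= 0:
        # Non-standalone: without the licensed anchor there is no
        # session at all, so the unlicensed carriers contribute nothing.
        return 0.0
    return anchor_mbps + sum(unlicensed_mbps)

print(aggregated_throughput(50.0, [150.0, 150.0]))  # anchor + unlicensed
print(aggregated_throughput(0.0, [150.0, 150.0]))   # anchor lost -> nothing
```

Stand-alone NR-U is the second print statement fixed: service in unlicensed spectrum with no licensed anchor required at all.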

I’m sure every new wireless specification had its challenges when it was first developed, but 5G is on a whole new level. Everyone has to realize that this is supposed to be more than just a mobile cellular technology; it’s being marketed as the medium that enables the Internet of Things. While speed increases and improved reliability are inevitable for mobile phone users, an IoT world where everything shares data with everything else presents some big obstacles. A 2017 Gartner report projected about 20.4 billion IoT devices connected over machine-to-machine networks by next calendar year. That number is massive, and the projected need dwarfs the scale of planned 5G architectures. Managing state information for all of those devices is a problem too, and the seamless interoperability and heterogeneity between IoT networks needed to make it possible presents significant privacy risk. Functionally, proper device-to-device communication requires better spectral interference management; there are simply too many sources of radio interference and not enough strategies to mitigate them. Finally, software-defined networking is crucial for the interoperability of 5G IoT devices, but SDN still has technical limitations, such as how to properly separate the data and control planes in these circumstances.

mmWave is a strange thing when you try to conceptualize what it means for the Internet of Things. Daniel Gleeson, a consumer technology analyst at the London-based digital consultancy Ovum, puts it into perspective: “It is going to be very expensive. Every part of it, from licensing the spectrum on day one to building out small networks in urban areas and eventually bridging that digital divide.” He’s right. Millimeter wave is a very short-range, high-speed frequency band. You can encode an awful lot of data into those radio waves, but they can’t travel far, and they can’t penetrate walls or other obstructions very well. You gotta have that penetration!

You’re likely going to need a new 5G-compatible phone to access the 5G network. Then again, AT&T and their bastardized 5G Evolution might provide more convenient options. Some vendors are experimenting with modules that can make your 4G phone compatible with the new standard, but who knows how that’s going to turn out. It’s an experiment that could end up as badly as the herpes monkeys in Florida. Whatever the case may be, 5G networks will only be available on a limited basis in bigger metropolitan areas for the near future, so for most of us, it won’t be an immediate concern.

There is a lot that the 3G-double-P has to think about in terms of security. The future 5G wireless implementation has to consider identity, for both users and IoT devices; authentication mechanisms to validate them; assurance and integrity of data; key management and cryptographic algorithms for privacy; mobility and signaling technologies, which have traditionally shown fatal security flaws; protection of stored data, whether local or in the cloud; and backward compatibility with legacy infrastructure that wasn’t designed with 5G requirements in mind.

Experts seem confident that 5G technology is very secure. (That comment always makes me wonder if they’re really experts at all, but I digress.) The biggest concerns for security and data privacy live at the application level: what your apps are doing, who you’re sending your data to, and how those entities are managing it. It’s good that over the last few years there’s been a lot of scrutiny of companies like Facebook and a lot of discussion about how to properly manage privacy. Europe did something remarkable when it instituted GDPR, and it sucks that the US still hasn’t got a clue. We have to have these conversations and build consumer competency for staying vigilant about the operations of the apps we use.

There are really three main security concerns from my perspective. First, a 5G IoT world requires support for billions of IoT devices. Again, we’re talking about projections of over 20 billion wireless clients within the coming year. This presents a very broad attack surface for threat actors, and the impact of cyberattacks on this infrastructure will likely be proportional to society’s reliance on an IoT-connected world. Second, IoT devices are inherently insecure and consumers are not security savvy. For instance, your smart lightbulb probably comes pre-configured with a default password that most people aren’t knowledgeable enough to change, and they don’t observe proper personal security practices either. Maybe you heard about the recent story of a family who woke up to their young baby crying in a 90-plus-degree bedroom after an intruder gained access to their Nest thermostat and cranked up the heat in an attempt to hurt the child. Nest claims they investigated and found no evidence of a breach, and I actually believe them. The thing is, we all know that people re-use passwords across multiple sites, including Nest’s web portal, which is plain stupid. Most people don’t enable multi-factor authentication. And now – sidenote – just the other day, news came out that with the next generation of GPUs, offline attacks against complex 8-character password hashes will be able to exhaust every possible character combination in 2.5 hours, cracking the average password in 1.25 hours. People need to up their personal security game. Get your granny on board with this. The third and final thing to mention here is that a large number of applications still transmit personal location or status information without consumers’ consent. In an IoT-connected world, the public availability of personal identity and location information will become a bigger concern, and we haven’t fully identified ways to deal with it yet.
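If you want a feel for where those cracking numbers come from, the arithmetic is simple. Assuming the full 95-character printable-ASCII set (an assumption on my part; the benchmark may have used a different character set or hash algorithm), you can work out the hash rate a rig would need to exhaust every 8-character password in 2.5 hours:

```python
# Sanity-check the password-cracking claim: how fast must a rig hash
# to exhaust every 8-character password in 2.5 hours? Assumes the
# 95-character printable-ASCII set, which is my assumption, not a
# detail from the benchmark mentioned in the article.
CHARSET = 95
LENGTH = 8

keyspace = CHARSET ** LENGTH      # all possible candidate passwords
worst_case_s = 2.5 * 3600         # exhaust everything in 2.5 hours
required_rate = keyspace / worst_case_s

print(f"keyspace:      {keyspace:.3e} candidates")
print(f"required rate: {required_rate:.3e} hashes/sec")
# On average the password falls halfway through the search, which is
# where the "1.25 hours on average" figure comes from.
```

That works out to a keyspace of about 6.6 quadrillion candidates and a required rate in the hundreds of billions of hashes per second, which is why a longer passphrase plus multi-factor authentication beats any clever 8-character password.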

Dr. Joel M. Moskowitz, Director of the Center for Family and Community Health at the University of California, Berkeley, published a very eye-opening report in November 2018 about the concerns associated with 5G radio frequencies and mmWave technologies. The report reads very objectively, and there is a lot to be concerned about, especially with the mmWaves.

mmWaves are mostly absorbed within 1 to 2 millimeters of human skin and in the surface layers of the cornea, making the skin and near-surface tissues the primary targets of the radiation. Because the skin contains capillaries and nerve endings, the bio-effects can be transmitted to other parts of the body through the skin or nervous system. Heating effects occur when the power density of mmWaves rises above 5–10 mW/cm². Effects begin with a heating sensation and progress to pain and physical damage at higher exposures. Temperature elevation associated with mmWaves can impact cell growth, morphology, and metabolism, and can damage DNA. Current FCC guidelines will keep exposure to 5G radiation low-intensity, so the concern shifts to the non-thermal effects produced by prolonged exposure to mmWaves and other sustained exposure to low- and mid-band radiofrequency radiation. “Low-intensity” is a misleading term; it isn’t good. A US Army and Air Force study published in 1998 found that 30–80% of healthy examinees perceived or exhibited effects associated with low-intensity mmWaves.

Current FCC regulation, established in 1996, sets the maximum permissible exposure at 1.0 mW/cm² averaged over 30 minutes for frequencies ranging from 1.5 GHz to 100 GHz. These guidelines were not designed to protect us from the non-thermal risks that may occur with prolonged or long-term exposure to radiofrequency radiation. Though the impact is characterized as “low-intensity,” the effects of prolonged exposure to non-thermal mmWave radiation are unknown. Furthermore, no research has been conducted on the long-term impact of non-thermal mmWave effects combined with the radiation attributed to low- and mid-band radio frequencies.
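To keep the two thresholds in this discussion straight, here’s a toy classifier that buckets a hypothetical 30-minute-averaged power density reading against them. The two threshold values come from the numbers quoted above; the function, its name, and the sample readings are purely illustrative assumptions on my part.

```python
# Toy comparison of a hypothetical power density reading against the
# two thresholds quoted above: the 1996 FCC limit (1.0 mW/cm^2,
# averaged over 30 minutes) and the ~5-10 mW/cm^2 range where heating
# effects reportedly begin. Only the threshold values come from the
# text; everything else is illustrative.
FCC_LIMIT_MW_CM2 = 1.0
HEATING_ONSET_MW_CM2 = 5.0

def classify(power_density_mw_cm2):
    """Bucket a 30-minute-averaged power density reading (mW/cm^2)."""
    if power_density_mw_cm2 >= HEATING_ONSET_MW_CM2:
        return "above heating-effect onset"
    if power_density_mw_cm2 > FCC_LIMIT_MW_CM2:
        return "over FCC limit (non-thermal range)"
    return "within FCC limit"

for reading in (0.2, 1.5, 6.0):
    print(f"{reading:4.1f} mW/cm^2 -> {classify(reading)}")
```

Notice the gap the report is worried about: everything in that first bucket is legal and “low-intensity,” yet the biological effects of sitting in it around the clock are exactly what hasn’t been studied.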

The effects of mmWaves have actually been studied for decades, but most results have been inconsistent due to factors such as frequency, modulation, power density, and duration of exposure (and none have dealt with the continuous exposure that 5G infrastructure will demand). Results vary, but here are the general findings. First, mmWaves have been shown to induce or inhibit cell death and to enhance or suppress cell proliferation. Second, a large number of cellular studies have indicated that mmWaves may alter the structural and functional properties of membranes; exposure may affect the plasma membrane, and water molecules appear to play a role. Third, skin nerve endings are a likely target of mmWaves and serve as a starting point for numerous biological effects. Finally, mmWaves may activate the immune system through stimulation of the peripheral nervous system. Of course, activating the immune system isn’t necessarily a good thing when it starts attacking healthy tissues and organs.

Yeah, all these health concerns are a buzzkill, but here’s the one that freaks me out the most. You might want to sit down and grab a beer for this. A 2016 review of research on the effects of mmWaves on bacteria found that the radiation can alter the communication of bacteria and microbes, which are thought to communicate via electromagnetic fields in the sub-extremely-high-frequency range. The effects were non-thermal and depended on various factors. Essentially, mmWave interaction with bacteria changes their sensitivity to biologically active chemicals, including antibiotics, and is possibly a cause of the emergence of drug-resistant strains of bacteria. We’ve been talking about the rise of drug-resistant bacteria for years now, and the situation is getting worse… it makes you wonder how much our existing wireless technologies play into this phenomenon, and if the impact truly is negative, does mmWave technology spell the end of civilization? Will bacteria finally overtake us because of our futuristic vision of smart cities and self-driving cars?