5G stands for fifth-generation cellular wireless, and the initial standards for it were set at the end of 2017. Let us take you down the 5G rabbit hole to give you a picture of what the upcoming 5G world will be like.
First of all, if you’re hearing about 5G Wi-Fi or AT&T’s “5G E” phones, they aren’t 5G cellular.
And if you’re hearing that 5G means millimeter-wave towers on every lamppost, that’s not true. That’s only one of the three main forms of 5G we’re seeing right now.
The G in this 5G means it’s a generation of wireless technology. While most generations have technically been defined by their data transmission speeds, each has also been marked by a break in encoding methods, or “air interfaces,” that make it incompatible with the previous generation.
1G was analog cellular. 2G technologies, such as CDMA, GSM, and TDMA, were the first generation of digital cellular technologies. 3G technologies, such as EVDO, HSPA, and UMTS, brought speeds from 200kbps to a few megabits per second. 4G technologies, such as WiMAX and LTE, were the next incompatible leap forward, and they are now scaling up to hundreds of megabits and even gigabit-level speeds.
5G brings three new aspects to the table: bigger channels (to speed up data), lower latency (to be more responsive), and the ability to connect a lot more devices at once (for sensors and smart devices).
The actual 5G radio system, known as 5G NR, isn't compatible with 4G. But all 5G devices in the US need 4G for now, because they lean on it to make their initial connections before trading up to 5G where it's available. That's technically known as a "non-standalone," or NSA, network. Later this year, our 5G networks will become "standalone," or SA, no longer requiring 4G coverage to work.
4G will continue to improve with time, as well. The Qualcomm X24 modem, which is built into most 2019 and 2020 Android flagship phones, supports 4G speeds up to 2Gbps. The real advantages of 5G will come in massive capacity and low latency, beyond the levels 4G technologies can achieve.
That symbiosis between 4G and 5G has caused AT&T to get a little overenthusiastic about its 4G network. The carrier has started to call its 4G network “5G Evolution,” because it sees improving 4G as a major step to 5G. It’s right, of course. But the phrasing is designed to confuse less-informed consumers into thinking 5G Evolution is 5G, when it isn’t.
5G networks use a type of encoding called OFDM, which is similar to the encoding that 4G LTE uses. The air interface is designed for much lower latency and greater flexibility than LTE, though.
On the same airwaves as 4G, the 5G radio system gets about 30 percent better speeds thanks to more efficient encoding. The crazy gigabit speeds you hear about are possible because 5G is designed to use much larger channels than 4G does. While most 4G channels are 20MHz wide, bonded together into up to 140MHz at a time, 5G channels can be up to 100MHz wide, with Verizon using as much as 800MHz at a time. That's a much broader highway, but it also requires larger, clear blocks of airwaves than were available for 4G.
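A rough back-of-the-envelope sketch shows how those two factors multiply. The channel widths and the 30 percent encoding gain are the figures above; the 100Mbps starting speed is a hypothetical example, and the linear-scaling assumption is a simplification (real throughput also depends on signal quality, antennas, and network load):

```python
def estimated_5g_speed(speed_4g_mbps, bw_4g_mhz, bw_5g_mhz, encoding_gain=1.3):
    """Scale a known 4G speed by the channel-width ratio and the
    ~30 percent encoding-efficiency gain. Simplified linear model."""
    return speed_4g_mbps * (bw_5g_mhz / bw_4g_mhz) * encoding_gain

# A hypothetical 4G carrier delivering 100Mbps over a 20MHz channel,
# moved to a 100MHz 5G channel:
print(estimated_5g_speed(100, 20, 100))  # 650.0 (Mbps)
```

Five times the road width and a modestly better engine is how a few hundred megabits becomes multiple gigabits once channels are bonded.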
That's where the higher, short-range millimeter-wave frequencies come in. While the lower frequencies are occupied by 4G, TV stations, satellite firms, and the military, a huge amount of essentially unused higher-frequency spectrum was available in the US, so carriers could easily construct wide roads there for high speeds.
5G networks need to be much smarter than previous systems, as they’re juggling many more, smaller cells that can change size and shape. But even with existing macro cells, Qualcomm says 5G will be able to boost capacity by four times over current systems by leveraging wider bandwidths and advanced antenna technologies.
The goal is to have far higher speeds available, and far higher capacity per sector, at far lower latency than 4G. The standards bodies involved are aiming at 20Gbps speeds and 1ms latency, at which point very interesting things begin to happen.
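To put those targets in everyday terms, here's the simple arithmetic. The 20Gbps figure is the standards goal above; the 5GB movie is a hypothetical example:

```python
PEAK_SPEED_GBPS = 20  # standards bodies' target peak speed

movie_gb = 5                  # hypothetical 5GB movie, in gigabytes
movie_gbits = movie_gb * 8    # 40 gigabits
download_seconds = movie_gbits / PEAK_SPEED_GBPS

print(f"{movie_gb}GB movie at peak speed: {download_seconds} seconds")  # 2.0
```

And a 1ms round trip is short enough that a device can ask the network a question and get an answer a thousand times per second, which is what makes real-time remote control applications plausible.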