Tuesday, December 2, 2025

6G’s Role in Early Tsunami Detection


When the head of Nokia Bell Labs core research talks about “lessons learned” from 5G, he’s doing something rare in telecom: admitting a flagship technology didn’t quite work out as planned.

That candor matters now, too, because Bell Labs core research president Peter Vetter says 6G’s success depends on getting infrastructure right the first time—something 5G didn’t fully do.

By 2030, he says, 5G will have exhausted its capacity. Not because some 5G killer app will appear tomorrow, suddenly making everyone’s phones demand 10 or 100 times as much data capacity as they require today. Rather, by the turn of the decade, wireless telecom won’t be centered around just cellphones anymore.

AI agents, autonomous cars, drones, IoT nodes, and sensors, sensors, sensors: Everything in a 6G world will potentially need a way onto the network. That means, more than anything else in the remaining years before 6G’s anticipated rollout, high-capacity connections behind cell towers are a key game to win. That brings industry scrutiny to what telecom folks call backhaul—the high-capacity fiber or wireless links that pass data from cell towers toward the internet backbone. It’s the difference between the “local” connection from your phone to a nearby tower and the “trunk” connection that carries millions of signals simultaneously.

But the backhaul crisis ahead isn’t just about capacity. It’s also about architecture. 5G was designed around a world where phones dominated, downloading video at higher and higher resolutions. 6G is now shaping up to be something else entirely. This inversion—from 5G’s anticipated downlink deluge to 6G’s uplink resurgence—requires rethinking everything at the core level, practically from scratch.

Vetter’s career spans the entire arc of the wireless telecom era—from optical interconnections in the 1990s at Alcatel (a research center pioneering fiber-to-home connections) to his roles at Bell Labs and later Nokia Bell Labs, culminating in 2021 in his current position at the industry’s bellwether institution.

In this conversation, held in November at the Brooklyn 6G Summit in New York, Vetter explains what 5G got wrong, what 6G must do differently, and whether these innovations can arrive before telecom’s networks start running out of room.

5G’s Expensive Miscalculation

IEEE Spectrum: Where is telecom today, halfway between 5G’s rollout and 6G’s anticipated rollout?

Peter Vetter: Today, we have enough spectrum and capacity. But going forward, there will not be enough. The 5G network by the end of the decade will run out of steam. We have traffic simulations. And it is something that has been consistent generation to generation, from 2G to 3G to 4G. Every decade, capacity goes up by about a factor of 10. So you need to prepare for that.
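The roughly tenfold-per-decade growth Vetter cites compounds to about 26 percent per year. A quick sanity check (the per-decade factor is from the interview; the annualization is mine):

```python
# Vetter: capacity has grown ~10x per decade from 2G through 5G.
# What compound annual growth rate does that imply?

def annual_growth_rate(decade_factor: float = 10.0) -> float:
    """Compound annual rate implied by a per-decade multiplier."""
    return decade_factor ** (1 / 10) - 1

rate = annual_growth_rate(10.0)
print(f"Implied annual traffic growth: {rate:.1%}")  # roughly 26% per year
```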

And the challenge for us as researchers is how do you do that in an energy-efficient way? Because the power consumption cannot go up by a factor of 10. The cost cannot go up by a factor of 10. And then, lesson learned from 5G: The idea was, “Oh, we do that in higher spectrum. There is more bandwidth. Let’s go to millimeter wave.” The lesson learned is, okay, millimeter waves have short reach. You need a small cell [tower] every 300 meters or so. And that doesn’t cut it. It was too expensive to install all these small cells.
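The short reach of millimeter wave falls out of the Friis free-space path loss formula: loss grows with the square of frequency for fixed-gain antennas. A minimal sketch comparing 5G’s C-band with the 28 GHz band Vetter mentions, at his ~300-meter small-cell spacing (the comparison is my illustration, not from the interview):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB (Friis): 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# 5G mid-band (3.5 GHz) vs. millimeter wave (28 GHz) over 300 m:
for f in (3.5e9, 28e9):
    print(f"{f / 1e9:>5.1f} GHz at 300 m: {fspl_db(300, f):.1f} dB")
```

Going from 3.5 to 28 GHz adds 20·log10(8) ≈ 18 dB of free-space loss at the same distance, one reason coverage shrinks and cell sites multiply. (Larger antenna arrays can win some of that back, which is the beamforming story later in the interview.)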

Is this related to the backhaul question?

Vetter: So backhaul is the connection between the base station and what we call the core of the network—the data centers and the servers. Ideally, you use fiber to your base station. If you have that fiber as a service provider, use it. It gives you the highest capacity. But very often new cell sites don’t have that fiber backhaul. Then there are alternatives: wireless backhaul.

Nokia Bell Labs has pioneered a glass-based chip architecture for telecom’s backhaul signals, communicating between towers and telecom infrastructure. Nokia

Radios Built on Glass Push Frequencies Higher

What are the challenges ahead for wireless backhaul?

Vetter: To get up to 100-gigabit-per-second, fiber-like speeds, you need to go to higher frequency bands.

Higher frequency bands for the signals the backhaul antennas use?

Vetter: Yes. The challenge is the design of the radio front ends and the radio-frequency integrated circuits (RFICs) at those frequencies. You cannot really integrate [present-day] antennas with RFICs at those high speeds.

And what happens as those signal frequencies get higher?

Vetter: So in a millimeter wave, say 28 gigahertz, you could still do [the electronics and waveguides] for this with a classical printed circuit board. But as the frequencies go up, the attenuation gets too high.

What happens when you get to, say, 100 GHz?

Vetter: [Conventional materials] are no good anymore. So we need to look at other still low-cost materials. We have done pioneering work at Bell Labs on radio on glass. And we use glass not for its optical transparency, but for its transparency in the sub-terahertz radio range.

Is Nokia Bell Labs making these radio-on-glass backhaul systems for 100 GHz communications?

Vetter: I used an order of magnitude. Above 100 GHz, you need to look into a different material. But [the frequency range] is actually 140 to 170 GHz, what is called the D-band.

We collaborate with our internal customers to get these kind of concepts on the long-term roadmap. As an example, that D-Band radio system, we actually integrated it in a prototype with our mobile business group. And we tested it last year at the Olympics in Paris.

But this is, as I said, a prototype. We need to mature the technology between a research prototype and qualifying it to go into production. The researcher on that is Shahriar Shahramian. He’s well-known in the field for this.

Why 6G’s Bandwidth Crisis Isn’t About Phones

What will be the applications that’ll drive the big 6G demands for bandwidth?

Vetter: We’re installing more and more cameras and other types of sensors. I mean, we’re going into a world where we want to create large world models that are synchronous copies of the physical world. So what we will see going forward in 6G is a massive-scale deployment of sensors which will feed the AI models. So a lot of uplink capacity. That’s where a lot of that increase will come from.

Any others?

Vetter: Autonomous cars could be an example. It can also be in industry—like a digital twin of a harbor, and how you manage that? It can be a digital twin of a warehouse, and you query the digital twin, “Where is my product X?” Then a robot will automatically know thanks to the updated digital twin where it is in the warehouse and which route to take. Because it knows where the obstacles are in real time, thanks to that massive-scale sensing of the physical world and then the interpretation with the AI models.

You will have your agents that act on your behalf to do your groceries, or order a driverless car. They will actively record where you are, and make sure that the proper privacy measures are also in place, so that your agent has an understanding of the state you’re in and can serve you in the most optimal way.

How 6G Networks Will Help Detect Drones, Earthquakes, and Tsunamis

You’ve described before how 6G signals can not only transmit data but also provide sensing. How will that work?

Vetter: The augmentation now is that the network can also be turned into a sensing modality. If you turn around a corner, a camera doesn’t see you anymore. But the radio can still detect people who are coming, for instance, at a traffic crossing. And you can anticipate that, and warn a car: “There’s a pedestrian coming. Slow down.” We also have fiber sensing—for instance, using fibers at the bottom of the ocean to detect the movement of waves, detect tsunamis, and do an early tsunami warning.

What are your teams’ findings?

Vetter: Present-day tsunami warning buoys sit a few hundred kilometers offshore. These tsunami waves travel at 300 meters per second or more, so you have only about 15 minutes to warn people and evacuate. If you now have a fiber sensing network across the ocean, with which you can detect the wave much farther out, you can do meaningful early tsunami warning.
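The warning-time arithmetic behind those figures can be sketched directly. The 300 m/s wave speed is from the interview; the specific distances (a buoy roughly 250 km offshore versus fiber sensing much farther out) are illustrative assumptions of mine:

```python
# Warning time between a tsunami passing a sensor and reaching shore.

def warning_time_minutes(distance_km: float, wave_speed_m_s: float = 300.0) -> float:
    """Minutes of advance warning for a sensor at the given distance."""
    return distance_km * 1000 / wave_speed_m_s / 60

print(f"{warning_time_minutes(250):.0f} min")   # buoy ~250 km offshore: ~14 min
print(f"{warning_time_minutes(2000):.0f} min")  # fiber sensing ~2,000 km out: ~111 min
```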

We recently detected a major earthquake in eastern Russia. That was last July. We had a fiber sensing system between Hawaii and California, and we were able to see that earthquake on the fiber. We also saw the development of the tsunami wave.

6G’s Thousands of Antennas and Smarter Waveforms

Bell Labs was an early pioneer in multiple-input, multiple-output (MIMO) antennas starting in the 1990s, in which multiple transmit and receive antennas carry many data streams at once. What is Bell Labs doing with MIMO now to help solve these bandwidth problems you’ve described?

Vetter: So, as I said earlier, you want to provide capacity from existing cell sites. And MIMO can do that through a technique called beamforming: If you want better coverage at a higher frequency, you need to focus your electromagnetic energy, your radio energy, even more. And in order to do that, you need a larger number of antennas.

So if you double the frequency, we go from 3.5 gigahertz, which is the C-band in 5G, now to 6G, 7 gigahertz. So it’s about double. That means the wavelength is half. So you can fit four times more antenna elements in the same form factor. So physics helps us in that sense.
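The “physics helps us” point is that element count in a fixed aperture scales with frequency squared when elements are spaced at half a wavelength. A minimal sketch, assuming a square 30-centimeter panel (the panel size is my illustrative choice; the 3.5 and 7 GHz bands are from the interview):

```python
C = 299_792_458.0  # speed of light, m/s

def elements_in_aperture(freq_hz: float, side_m: float) -> int:
    """Antenna elements in a square aperture at lambda/2 spacing per axis."""
    wavelength = C / freq_hz
    per_axis = int(side_m / (wavelength / 2))
    return per_axis * per_axis

# Same form factor at 5G's C-band vs. the ~7 GHz band discussed for 6G:
n_35 = elements_in_aperture(3.5e9, 0.30)
n_70 = elements_in_aperture(7.0e9, 0.30)
print(n_35, n_70, n_70 / n_35)  # prints 49 196 4.0
```

Doubling the frequency halves the wavelength, so each axis fits twice as many elements and the array fits four times as many overall, exactly the factor Vetter cites.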

What’s the catch?

Vetter: Where physics doesn’t help us is more antenna elements means more signal processing, and the power consumption goes up. So here is where the research then comes in. Can we creatively get to these larger antenna arrays without the power consumption going up?

The use of AI is important in this. How can we leverage AI to do channel estimation, to do such things as equalization, to do smart beamforming, to learn the waveform, for instance?

We’ve shown that with these kind of AI techniques, we can get actually up to 30 percent more capacity on the same spectrum.

And that allows many gigabits per second to go out to each phone or device?

Vetter: So gigabits per second is already possible in 5G. We’ve demonstrated that. You can imagine that this could go up, but that’s not really the need. The need is really how many more can you support from a base station?
