Lithium-ion batteries are quietly powering large parts of the world, including electric vehicles and smartphones. They have revolutionized how people store and use energy. But as these batteries become more central to daily life, they draw more attention to the challenge of managing them, and the energy they store, safely, efficiently and intelligently.
I’m a mechanical engineer who studies these nearly ubiquitous batteries. They have been around for decades, yet researchers like me are still trying to fully understand how these batteries behave – especially when they are working hard.
Batteries may seem simple, but they are as complicated as the real-world uses people devise for them.
The big picture
At their core, lithium-ion batteries rely on the movement of charged particles, called ions, of the element lithium between two electric poles, or electrodes. The lithium ions shuttle between the electrodes through a conductive substance called an electrolyte, which can be a solid or a liquid – moving toward the negative electrode when the battery charges, and back toward the positive one when it discharges to power a device.
How much energy these batteries store and how well they work depends on a tangle of factors, including the temperature, physical structure of the battery and how the materials age over time.
Around the world, researchers are trying to answer questions about each of these factors individually and in concert with each other. Some research focuses on improving lifespan and calculating how batteries degrade over time. Other projects are tackling safety under extreme conditions, such as fast-charging use in extreme climates – either hot or cold. Many are exploring entirely new materials that could make batteries cheaper, longer-lasting or safer. And a significant group – including me – is working with computer simulations to improve real-time battery monitoring.
Real‑time monitoring in your electric vehicle’s battery system functions like a health check: It tracks voltage, current and temperature to estimate how much energy remains so you won’t be stranded with a dead battery.
But it’s difficult to precisely measure how well each of the energy cells within the battery is performing as they age or as the weather changes from cold in winter to hotter in summer. So the battery management system uses a computer simulation to estimate those factors. When combined with real-time monitoring, the system can prevent overusing the battery, balance charging speed with long-term health, avoid power failures and keep performance high. But there are a lot of variables.
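One of the simplest estimation techniques a battery management system can build on is "Coulomb counting": adding up the measured current over time to track how much charge has flowed out of the pack. Here is a minimal Python sketch of the idea, using invented numbers rather than any real vehicle's parameters:

```python
def coulomb_count_soc(soc_start, current_samples_a, dt_s, capacity_ah):
    """Estimate state of charge (0..1) by integrating measured current.

    current_samples_a: current readings in amperes (positive = discharging).
    dt_s: seconds between readings.
    capacity_ah: usable pack capacity in ampere-hours.
    """
    charge_used_ah = sum(current_samples_a) * dt_s / 3600.0  # A*s -> Ah
    soc = soc_start - charge_used_ah / capacity_ah
    return max(0.0, min(1.0, soc))  # clamp to physically possible values

# Hypothetical example: a 60 Ah pack starting at 80% charge, drawing a
# steady 30 A for ten minutes (600 one-second samples).
soc = coulomb_count_soc(0.80, [30.0] * 600, 1.0, 60.0)
print(round(soc, 3))  # 30 A for 1/6 hour uses 5 Ah -> 0.8 - 5/60 ≈ 0.717
```

Counting alone drifts as sensor errors accumulate and as capacity changes with temperature and age, which is exactly why the management system pairs measurements like these with a simulation of the battery's behavior.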
The traffic analogy
One of the best ways to understand this challenge is to think about city traffic.
Let’s say you want to drive across town and need to determine whether your car has enough charge to travel the best route. If your navigation simulator accounted for every stoplight, every construction zone and every vehicle on the road, it would give you a very accurate answer. But it might take an hour to run, by which time the circumstances would have changed and the answer would likely be wrong. That’s not helpful if you’re trying to make a decision right now.
A simpler model might assume that every road is clear and every car is moving at the speed limit. That simulation delivers an answer instantly – but the answer is far off the mark when traffic is heavy or a road is closed. It doesn’t capture the reality of rush hour.
While you’re driving, the battery management system would do a similar set of calculations to see how much charge is available for the rest of the trip. It would look at the battery’s temperature, how old it is and how much energy the car is asking for, like when going up a steep hill or accelerating quickly to keep up with other cars. But like the navigation simulations, it has to strike a balance between being extremely accurate and giving you useful information before your battery runs out in the middle of your trip.
The most accurate models, which simulate every chemical reaction inside the battery, are too slow for real-time use. The faster models simplify things so much that they miss key behaviors – especially under stress, such as fast charging or sudden bursts of energy use.
How researchers are bridging the gap
This trade-off between speed and accuracy is at the heart of battery modeling research today. Scientists and engineers are exploring many ways to solve it.
Some are rewriting modeling software to make the physics calculations more efficient, reducing complexity without losing the key details. Others, like me, are turning to machine learning – training computers to recognize patterns in data and make fast, accurate predictions without having to solve every underlying equation.
In my recent work, I used a high-accuracy battery simulator – one of the ones that’s really accurate but very slow – to generate a massive amount of data about how a battery functions when charging and discharging. I used that data to train a machine learning algorithm called XGBoost, which is particularly good at finding patterns in data.
Then I used software to pair the XGBoost system with a simple, fast-running battery model that captures the basic physics but can miss finer details. The simpler model puts out an initial set of results, and the XGBoost element fine-tunes those to make corrections on the fly, especially when the battery is under strain.
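In outline, this pairing works like residual learning: the fast model makes a first guess, and a boosted-tree model, trained on the high-fidelity simulator's outputs, predicts the gap between that guess and the truth. The sketch below shows the general idea, not my actual pipeline: it uses scikit-learn's GradientBoostingRegressor as a stand-in for XGBoost, and toy stand-in functions with synthetic data in place of real battery physics.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy inputs: [current draw (A), temperature (C)] for 2,000 operating points.
X = rng.uniform([0.0, -10.0], [120.0, 45.0], size=(2000, 2))

def fast_model(X):
    # Stand-in for a simplified physics model: voltage sags linearly
    # with current and ignores everything else.
    return 4.0 - 0.004 * X[:, 0]

def high_fidelity(X):
    # Stand-in for the slow, accurate simulator: adds nonlinear effects
    # that grow under heavy current and away from room temperature.
    return fast_model(X) - 0.00002 * X[:, 0] ** 2 + 0.002 * (X[:, 1] - 25.0)

# Train the corrector to predict the fast model's error (the residual).
residual = high_fidelity(X) - fast_model(X)
corrector = GradientBoostingRegressor(n_estimators=200).fit(X, residual)

# At runtime: fast prediction plus learned correction, no slow simulation.
X_new = np.array([[100.0, 0.0]])  # hard acceleration on a cold day
hybrid = fast_model(X_new) + corrector.predict(X_new)
print(float(fast_model(X_new)[0]), float(hybrid[0]))
```

The design choice that makes this fast is that the expensive simulator runs only once, offline, to generate training data; on the road, the system evaluates just the cheap model and the trained trees.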
The result is a hybrid model that responds both quickly and accurately to changes in driving conditions. With just the simple model, a driver who floors the accelerator might not get the energy they asked for; a more detailed model would deliver the right amount, but only after finishing its calculations. My hybrid model delivers that rapid boost of energy without the delay.
Other teams are working on similar hybrid approaches, blending physics and artificial intelligence in creative ways. Some are even building digital twins – real-time virtual replicas of physical batteries – to offer sophisticated simulations that update constantly as conditions change.

What’s next
Battery research is moving quickly, and the field is already seeing signs of change. Models are becoming more reliable across a wider range of conditions. Engineers are using real-time monitoring to extend battery life, prevent overheating and improve energy efficiency. Machine learning lets researchers train battery management systems to optimize performance for specific applications, such as high power demands in electric vehicles, daily cycles of home electricity use, short power bursts for drones, or long-duration requirements for building-scale battery systems.
And there’s more to come: Researchers are working to incorporate other important factors into their battery models, such as heat generation and mechanical stress.
Some teams are taking hybrid models and compiling their software into lightweight code that runs on microcontrollers inside battery hardware. In practice, that means each battery pack carries its own brain on-board, calculating state-of-charge, estimating aging and tracking thermal or mechanical stress in near-real time. By embedding the model in the device’s electronics, the pack can autonomously adjust its charging and discharging strategy on the fly, making every battery smarter, safer and more efficient.
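To give a feel for that on-board decision-making, here is a sketch of how a pack controller might pick a safe charging current from its estimated state of charge and temperature – tapering as the pack fills and cutting back in the cold. It is written in Python for readability (a real pack would run compiled code on its microcontroller), and every threshold below is invented for the example:

```python
def charge_current_limit_a(soc, temp_c, max_current_a=150.0):
    """Choose a charging current (A) from estimated state of charge (0..1)
    and cell temperature (C). All limits here are illustrative, not real."""
    if temp_c < 0.0:
        return 0.1 * max_current_a   # below freezing: trickle charge only
    limit = max_current_a
    if temp_c < 10.0:
        limit *= 0.5                 # cold pack: halve the rate
    if soc > 0.8:
        limit *= (1.0 - soc) / 0.2   # taper linearly toward full
    return limit

print(round(charge_current_limit_a(0.5, 25.0), 1))   # warm, half full
print(round(charge_current_limit_a(0.9, 25.0), 1))   # nearly full: tapered
print(round(charge_current_limit_a(0.5, -5.0), 1))   # freezing: trickle
```

A rule table like this is simple enough to run constantly on cheap hardware; the hybrid models described above would feed it better estimates of charge and stress than fixed thresholds alone.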
As the energy landscape evolves – with more electric vehicles on the road, more renewable energy sources feeding into the grid, and more people relying on batteries in daily life – the ability to understand what a battery is doing in real time becomes more critical than ever.