By Dave DeFusco
As new sensing technologies move from research labs into homes, hospitals and security systems, they promise to make life easier and safer, but they also raise an important question: What happens if someone learns how to fool them?
A study by researchers at the Katz School of Science and Health, George Mason University, Temple University and Rutgers University shows that emerging millimeter-wave sensing systems can be deceived in unexpected ways. These systems, which can monitor breathing and heartbeat without contact, may be vulnerable to a surprising new form of spoofing.
The paper has been accepted to IEEE INFOCOM 2026, one of the premier conferences in computer networking. This year’s conference received 1,740 submissions and accepted 329 papers, reflecting an acceptance rate of 19%.
The study, “Attacking mmWave-enabled Chest Vibration Sensing via Actuator-induced Mimicry,” demonstrates how a small programmable device hidden under clothing could imitate a person’s chest movements closely enough to fool sensing systems used for health monitoring or identity verification.
“Our research shows that mmWave sensing systems, which many people assume are inherently hard to manipulate, can in fact be spoofed with relatively simple hardware,” said Yucheng Xie, a co-author of the study and an assistant professor in the Department of Graduate Computer Science and Engineering. “In the wrong hands, that could allow someone to impersonate another user or inject false physiological data into the system.”
Millimeter-wave, often shortened to mmWave, refers to very high-frequency radio signals that can detect extremely small movements. These signals are already widely used in wireless communication and are increasingly being adapted for sensing technologies.
Because the wavelengths are so small, mmWave systems can detect tiny motions, such as the slight rise and fall of a person鈥檚 chest as they breathe or the subtle vibrations caused by a heartbeat. This makes it possible to monitor vital signs from a distance, without sensors attached to the body.
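To make that concrete, here is a minimal Python sketch, not drawn from the paper, of how a radar turns phase into displacement. The 77 GHz carrier is a common mmWave frequency and an assumption here; the key relationship is that a round-trip phase shift of Δφ corresponds to a displacement of λΔφ/(4π).
```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the study's code):
# recovering chest displacement from the phase of an mmWave radar
# return. At 77 GHz the wavelength is about 3.9 mm, so sub-millimeter
# chest motion produces a clearly measurable phase swing.

C = 3e8           # speed of light, m/s
F_CARRIER = 77e9  # assumed carrier frequency, Hz
WAVELENGTH = C / F_CARRIER

def displacement_from_phase(phase_rad):
    """Convert phase samples (radians) from the chest's range bin into
    displacement in meters. The 4*pi accounts for the round trip:
    the signal travels to the chest and back."""
    return np.unwrap(phase_rad) * WAVELENGTH / (4 * np.pi)

# Example: 0.5 mm of breathing motion swings the phase by ~1.6 radians.
t = np.linspace(0, 10, 1000)                       # 10 s at 100 Hz
breathing = 0.5e-3 * np.sin(2 * np.pi * 0.25 * t)  # 15 breaths per minute
phase = 4 * np.pi * breathing / WAVELENGTH
print(displacement_from_phase(phase).max())        # ~0.0005 m
```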
The technology has opened the door to a wide range of applications. In healthcare, mmWave sensors could track patients’ breathing or heart rate while they sleep or recover at home. In security systems, the same signals can identify individuals based on their unique breathing patterns. Researchers are even exploring ways to use the technology in emotion-aware computing systems that respond to a user’s physical state.
The new study, however, shows that these systems may not be as secure as they appear. In the attack demonstrated by the researchers, an adversary first records a target person’s chest vibration patterns. These patterns can be captured using an mmWave device in a shared environment where the target is present.
The attacker then uses a small programmable actuator to reproduce those movements with carefully controlled vibrations. When hidden beneath clothing, the actuator generates vibrations that mimic the target’s breathing and heartbeat patterns.
When the mmWave sensor scans the person wearing the device, it can mistake the artificial vibrations for genuine chest motion. In effect, the system believes it is sensing the original person whose patterns were recorded.
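Why would a sensor fall for this? The hypothetical sketch below shows the weakness in any verifier that simply compares a live waveform against an enrolled template; the correlation test and the 0.8 threshold are illustrative inventions, not details from the paper.
```python
import numpy as np

# Hypothetical verifier (illustrative, not the study's system): accept a
# user if the live chest waveform correlates strongly with the enrolled
# template. A replayed recording passes just as the real person would.

def normalized_correlation(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def verify(live, template, threshold=0.8):
    return normalized_correlation(live, template) >= threshold

t = np.linspace(0, 30, 3000)
# Enrolled template: breathing (0.25 Hz) plus a faint heartbeat component.
template = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)
# Actuator playback of the recorded pattern, with a little mechanical noise.
replayed = template + 0.05 * np.random.randn(t.size)

print(verify(replayed, template))  # True: the spoofed signal is accepted
```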
Such an attack could have several consequences. In security systems that rely on chest vibration patterns as a biometric marker, an attacker might gain unauthorized access by mimicking another user’s physiological signature. In healthcare monitoring systems, the spoofed signals could make a patient appear healthy when they are not or trigger false alarms that send unnecessary emergency alerts.
Building a convincing imitation of human chest motion, however, is not simple. A major challenge is that the attacker’s own breathing and body movements can interfere with the vibrations produced by the actuator. To overcome this, the research team designed a system that measures the attacker’s own chest motion and adjusts the actuator’s vibrations in real time.
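The paper’s exact control scheme isn’t reproduced here, but the core of the compensation idea can be sketched in a few lines: the radar observes the sum of the wearer’s own chest motion and the actuator’s vibration, so the actuator is driven with the target pattern minus an estimate of the wearer’s motion. The signals and rates below are invented for illustration.
```python
import numpy as np

# Sketch of the compensation idea (an assumption about the mechanism,
# not the authors' implementation). The radar measures
#   observed = wearer_motion + actuator_output,
# so driving the actuator with (target - wearer_motion_estimate)
# makes the observed signal match the target.

def actuator_command(target, wearer_motion_estimate):
    return target - wearer_motion_estimate

t = np.linspace(0, 10, 1000)
target = 0.6e-3 * np.sin(2 * np.pi * 0.20 * t)  # victim: 12 breaths/min
wearer = 0.5e-3 * np.sin(2 * np.pi * 0.30 * t)  # attacker: 18 breaths/min

command = actuator_command(target, wearer)
observed = wearer + command                      # what the radar sees
print(np.allclose(observed, target))             # True with a perfect estimate
```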
Artificial intelligence makes the spoofing system adaptive. Machine learning models analyze motion data and continuously adjust the actuator so that the wearer’s natural chest motion and the artificial signal become a close match to the target pattern. This allows the spoofing device to maintain a convincing signal even as the wearer moves or breathes.
“The AI component allows the device to keep adapting in real time, so the generated vibrations stay aligned with the target pattern,” said Xie. “Without that adaptive control, the spoofed signal would be far less convincing.”
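In practice the wearer’s motion cannot be estimated perfectly, which is where adaptive control matters. The sketch below illustrates one plausible feedback scheme, repeatedly nudging the actuator command to shrink the observed-versus-target error over successive breathing cycles; the update rule and step size are illustrative stand-ins, not taken from the paper.
```python
import numpy as np

# Illustrative feedback loop (an assumed stand-in for the paper's
# AI-driven controller): the controller never sees the wearer's motion
# directly; it only sees the radar's observation and pushes the command
# toward whatever reduces the error against the target.

MU = 0.5  # adaptation step size (illustrative)

def adapt_step(command, observed, target):
    return command + MU * (target - observed)

rng = np.random.default_rng(0)
t = np.arange(200) / 20.0                        # one 10 s window at 20 Hz
wearer = 0.5e-3 * np.sin(2 * np.pi * 0.30 * t)   # unknown to the controller
target = 0.6e-3 * np.sin(2 * np.pi * 0.20 * t)

command = np.zeros_like(target)
for _ in range(50):  # breathing is roughly periodic, so adapt over
    # repeated passes through the same window, with measurement noise
    observed = wearer + command + 1e-5 * rng.standard_normal(t.size)
    command = adapt_step(command, observed, target)

# Residual error ends up tiny relative to the 0.5 mm motion being faked.
print(np.abs(wearer + command - target).max())
```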
To test the approach, the researchers conducted experiments with eight participants over a six-month period. The results showed that the spoofing method achieved a success rate of more than 80% across multiple testing scenarios, demonstrating that the attack is not just theoretical but achievable. The goal of the research, said Xie, is not to enable attacks but to help designers build stronger and more secure sensing systems.
“As these technologies become more common in healthcare, smart homes and security systems, it’s essential to understand their vulnerabilities,” he said. “By identifying weaknesses early, we can design better defenses and ensure that these systems remain trustworthy.”
The study also points to several ways future systems could defend against this kind of attack. For example, systems could compare chest signals with other physiological signals, such as subtle vibrations in the face, to confirm that the data is coming from a real person rather than an artificial device. Users may also employ privacy tools that mask their physiological signals from unauthorized sensing.
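A toy version of the first defense might look like the sketch below: the heartbeat should appear coherently in both the chest and the face, and a hidden actuator only fakes the chest channel. The channels, threshold and signals here are invented for illustration.
```python
import numpy as np

# Toy cross-channel consistency check (hypothetical defense, invented
# for illustration): heartbeat-driven vibration should appear in both
# the chest and the face of a real person. A chest-mounted actuator
# reproduces only one channel, so the two stop agreeing.

def channels_consistent(chest, face, threshold=0.5):
    chest = (chest - chest.mean()) / chest.std()
    face = (face - face.mean()) / face.std()
    return float(np.mean(chest * face)) >= threshold

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 1000)
victim_heart = np.sin(2 * np.pi * 1.2 * t)    # ~72 beats per minute

# Genuine person: both channels carry the same cardiac rhythm.
face_real = 0.3 * victim_heart + 0.05 * rng.standard_normal(t.size)
print(channels_consistent(victim_heart, face_real))    # True

# Spoof: chest is replayed, but the face shows the attacker's own rhythm.
attacker_heart = np.sin(2 * np.pi * 1.5 * t)  # ~90 beats per minute
face_spoof = 0.3 * attacker_heart + 0.05 * rng.standard_normal(t.size)
print(channels_consistent(victim_heart, face_spoof))   # False
```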
Ultimately, the researchers say the work highlights an important lesson for emerging technologies: innovation must be matched with careful attention to security.
“As sensing systems become more powerful, they also become more attractive targets,” said Xie. “The first step to securing them is understanding how they can be fooled.”