

YU News


Using AI Technology, Your Smartwatch May Soon Understand Your Eating Habits

In an IEEE Internet of Things Journal study, "DietWatch: Fine-Grained and Robust Dietary Monitoring via Smartwatch in Real-World Scenarios," Katz School researchers describe an artificial intelligence system called DietWatch, which analyzes motion and sound signals from a smartwatch to detect when someone is eating, how fast they eat and even what type of food they may be consuming.

By Dave DeFusco

A team of researchers led by the Katz School of Science and Health is turning the everyday smartwatch into a powerful health tool using artificial intelligence to track how people eat, not just what they eat, without requiring them to log a single meal.

In the study, "DietWatch: Fine-Grained and Robust Dietary Monitoring via Smartwatch in Real-World Scenarios," the researchers describe an artificial intelligence system called DietWatch, which analyzes motion and sound signals from a smartwatch to detect when someone is eating, how fast they eat and even what type of food they may be consuming.

Diet plays a major role in health. Poor eating habits are linked to chronic illnesses, such as high blood pressure, diabetes and heart disease, yet accurately tracking diet has long been difficult. Most people rely on food diaries or memory-based surveys, which are often incomplete or inaccurate.

"Traditional approaches depend on people remembering and reporting what they ate, which can lead to errors," said Yucheng Xie, the corresponding author of the study and an assistant professor in the Katz School's Graduate Department of Computer Science and Engineering. "Our goal was to create a system that can monitor eating behaviors automatically using a device many people already wear."

Smartwatches are now used by hundreds of millions of people around the world and already contain sensors that measure movement and sound. The researchers realized those sensors could provide clues about eating behavior. 

When someone eats, the wrist often moves in repeated patterns as food is brought to the mouth. At the same time, biting and chewing generate acoustic signals whose characteristics vary with food texture. These signals can be captured by the smartwatch and analyzed together with motion data. By combining these signals, artificial intelligence can learn to recognize eating activity.
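The idea of combining the two streams can be pictured with a toy sketch. Everything below is illustrative, not taken from the paper: the sampling rates, window length and the four simple features (motion energy, gesture jerkiness, audio loudness, spectral centroid) are assumptions chosen to show how wrist motion and chewing sound might be summarized together per time window.

```python
import numpy as np

def window_features(accel, audio, fs_accel=50, fs_audio=8000, win_s=2.0):
    """Split both streams into aligned windows and compute simple
    statistics: motion cues from the accelerometer, loudness and a
    rough texture cue (spectral centroid) from the microphone."""
    n_win = int(min(len(accel) / (fs_accel * win_s),
                    len(audio) / (fs_audio * win_s)))
    feats = []
    for i in range(n_win):
        a = accel[int(i * fs_accel * win_s):int((i + 1) * fs_accel * win_s)]
        s = audio[int(i * fs_audio * win_s):int((i + 1) * fs_audio * win_s)]
        spec = np.abs(np.fft.rfft(s))
        freqs = np.fft.rfftfreq(len(s), 1 / fs_audio)
        centroid = float(np.sum(freqs * spec) / (np.sum(spec) + 1e-9))
        feats.append([
            float(np.std(a)),                    # motion energy
            float(np.mean(np.abs(np.diff(a)))),  # gesture "jerkiness"
            float(np.sqrt(np.mean(s ** 2))),     # audio loudness (RMS)
            centroid,                            # crispy vs. soft food cue
        ])
    return np.array(feats)

# 10 s of synthetic signals: a slow "hand-to-mouth" oscillation plus
# noise for the wrist, and white-noise "chewing" for the microphone.
t = np.arange(0, 10, 1 / 50)
accel = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(len(t))
audio = 0.05 * np.random.randn(10 * 8000)
X = window_features(accel, audio)
print(X.shape)  # one 4-feature row per 2-second window -> (5, 4)
```

A classifier would then consume rows like these; the paper's actual features and model are more sophisticated.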

Building such a system, however, isn't easy. People eat in restaurants with loud background noise, snack while walking or multitask during meals. Everyday movements, such as scratching one's face or adjusting glasses, can look similar to eating gestures.

"To make the system work outside of a laboratory, we had to teach the AI to ignore many different kinds of interference," said Chengyi Liu, a student in the Katz School's M.S. in Artificial Intelligence and a co-author of the study. "The challenge is helping the model understand the difference between actual eating and normal daily activities."

DietWatch addresses this challenge using several AI techniques. One part of the system removes noise from motion and sound data collected by the watch. Another part uses a machine learning approach called contrastive learning, which helps the model distinguish eating gestures from other movements.
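Contrastive learning, in general, trains a model so that two "views" of the same event land close together in an embedding space while unrelated events land far apart. The sketch below implements a standard NT-Xent-style loss to illustrate the technique the article names; it is not the paper's model, and all sizes and names are assumptions.

```python
import numpy as np

def nt_xent(z1, z2, temperature=0.5):
    """Contrastive loss: pull together embeddings of two views of the
    same gesture window (z1[i], z2[i]); push apart mismatched pairs."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # pairwise cosine similarities
    # Cross-entropy where the matching view (the diagonal) is the target.
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - np.diag(sim)))

rng = np.random.default_rng(0)
anchor = rng.standard_normal((8, 16))
aligned = anchor + 0.01 * rng.standard_normal((8, 16))  # same gestures
shuffled = rng.standard_normal((8, 16))                 # unrelated ones
print(nt_xent(anchor, aligned) < nt_xent(anchor, shuffled))  # True
```

Minimizing such a loss pushes the embedding to encode what eating gestures have in common while discounting incidental movement, which is the intuition behind using it to separate eating from look-alike activities.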

The system also includes a method designed to work across many different users. Because people have different eating styles, body movements and chewing patterns, a model trained on one person might not work well for another. The researchers developed an approach that extracts patterns common to many users so that the system can function without requiring extensive user-specific retraining.
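A standard way to measure this kind of cross-user generalization is a leave-one-user-out split: train on every participant except one, then test on the held-out person. The sketch below shows that splitting scheme in general; the paper's exact evaluation protocol is an assumption here.

```python
def leave_one_user_out(user_ids):
    """Yield (held_out_user, train_indices, test_indices) so the model
    is always evaluated on a person it never saw during training."""
    for held_out in sorted(set(user_ids)):
        train = [i for i, u in enumerate(user_ids) if u != held_out]
        test = [i for i, u in enumerate(user_ids) if u == held_out]
        yield held_out, train, test

# Toy labels: which (hypothetical) participant produced each sample.
ids = ["ann", "ann", "bob", "cat", "bob"]
for user, train_idx, test_idx in leave_one_user_out(ids):
    print(user, train_idx, test_idx)
# ann [2, 3, 4] [0, 1]
# bob [0, 1, 3] [2, 4]
# cat [0, 1, 2, 4] [3]
```

Scoring well under splits like this is what it means for a model to work "without extensive user-specific retraining."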

To test the system, the researchers conducted a study involving 30 participants between the ages of 20 and 59. They wore smartwatches while eating in a variety of situations, including quiet rooms, busy coffee shops, social meals with friends and even while walking indoors.

Over four months, the team collected more than 5,500 minutes of data from over 500 eating sessions. Participants consumed 40 different types of food across categories, such as staple foods, fruits and vegetables, soft foods and crispy snacks.

The results showed that DietWatch could detect eating times with nearly 80% accuracy and identify food types with more than 85% accuracy. It could also estimate biting and chewing rates with relatively small errors.

According to Honggang Wang, senior author of the paper and chair of the Graduate Department of Computer Science and Engineering, the research could open the door to new ways of improving health. He said future versions of DietWatch could estimate calorie intake and nutritional balance by combining smartwatch data with food composition databases. 

The team, which included researchers from Purdue University and Indiana University, is also exploring ways to run more of the AI processing directly on the smartwatch to improve privacy and reduce reliance on cloud computing.

"Understanding how people eat, not just what they eat, is critical for addressing many health challenges," said Wang. "This technology can provide detailed, continuous insights into dietary behavior, which could help doctors, nutritionists and individuals make better decisions about diet and long-term health."
