How much information about a person’s behavior can you ascertain from a wearable device? Researchers at the University of Cambridge and the Alan Turing Institute sought to find out in a study involving over 280,000 hours of wrist accelerometer and wearable electrocardiogram (ECG) data. They report that classifiers trained on the dataset captured physiologically meaningful, personalized information, including variables associated with individuals’ health, physical fitness, and demographic characteristics, with greater than 0.70 AUC (area under the receiver operating characteristic curve: the probability that a model ranks a randomly chosen positive example above a randomly chosen negative one, where closer to 1 is better).
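As a minimal, self-contained sketch (not the authors’ evaluation code), AUC can be computed as the fraction of positive–negative pairs a classifier ranks correctly; the labels and scores below are hypothetical, not from the study:

```python
def auc_score(y_true, y_score):
    """AUC as the fraction of positive-negative pairs ranked correctly
    (ties count half) -- equivalent to the area under the ROC curve."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical binary labels (e.g. 1 = condition present) and scores.
labels = [1, 1, 1, 0, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.3, 0.2]
auc = auc_score(labels, scores)  # 0.875: 14 of 16 pairs ranked correctly
```

A random classifier scores about 0.5 on this measure, which is why values above 0.70 indicate the model has learned genuinely predictive structure.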
Wearables like the Apple Watch have allowed people to track activities unobtrusively. But extracting meaning from the data these devices record can be challenging, because wearable sensors often measure low-level signals like acceleration rather than high-level events of interest, such as arrhythmia, infection, or the onset of obesity. Machine learning has shown great promise in human activity recognition tasks using wearable sensor data, but research has mostly relied on labeled datasets that are costly to compile and annotate.
The researchers in this study developed an AI system called Step2Heart, a general-purpose self-supervised feature extractor for wearable device data that generates person-specific profiles. (“Self-supervised” means the system generates training labels from the data itself, rather than relying on human annotations as is typically the case.) Step2Heart draws on the profiles to predict health-related outcomes with classifiers through transfer learning, a machine learning technique in which a model developed for one task is reused as the starting point for a model on a second task.
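The two-stage pattern can be sketched in a few lines of numpy. Everything here is a hypothetical stand-in, not the authors’ architecture: a frozen random projection plays the role of the pretrained Step2Heart extractor, a logistic-regression head is the downstream classifier, and the data and target are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1 (stand-in for self-supervised pretraining): a *frozen* feature
# extractor. A fixed random projection here is a hypothetical placeholder
# for pretrained Step2Heart weights, not the real model.
n_raw, n_emb = 16, 8
pretrained_proj = rng.normal(size=(n_raw, n_emb))

def extract_features(raw_windows):
    # Frozen during transfer learning: only the head below is trained.
    return raw_windows @ pretrained_proj / np.sqrt(n_raw)

# Stage 2 (transfer learning): fit a small logistic-regression "head"
# on the frozen embeddings for a downstream label, by gradient descent.
def train_head(emb, labels, lr=0.5, steps=500):
    w = np.zeros(emb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-emb @ w))   # sigmoid predictions
        w -= lr * emb.T @ (p - labels) / len(labels)
    return w

# Toy data: 200 "participants" with 16-dim raw sensor summaries and a
# synthetic binary target (not real study variables).
raw = rng.normal(size=(200, n_raw))
labels = (raw[:, 0] > 0).astype(float)

emb = extract_features(raw)
head = train_head(emb, labels)
preds = 1.0 / (1.0 + np.exp(-emb @ head))
accuracy = float(((preds > 0.5) == (labels > 0.5)).mean())
```

The design point is that the expensive extractor is trained once without labels, while each new outcome only needs a cheap head fitted on the extracted profiles, which is what makes the approach attractive when labeled health data is scarce.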
The dataset was derived from 2,100 men and women aged 35-65 as part of the Fenland Study, which investigates the interaction between environmental and genetic factors in determining obesity, type 2 diabetes, and related metabolic disorders. Participants were asked to wear a combined heart rate and movement chest sensor and a wrist accelerometer for a full week. During a lab visit, participants performed a treadmill test to determine their maximum oxygen consumption rate. Their resting heart rate was also recorded using the chest ECG.
After preprocessing, the dataset Step2Heart was trained on spanned 1,506 participants with both sensor and lab visit data. In experiments, the researchers attempted to predict various variables, including maximum oxygen consumption rate (VO2max), height, weight, sex, age, BMI, resting heart rate, and physical activity energy expenditure (PAEE).
The report’s coauthors claim Step2Heart managed to classify sex with 0.93 AUC, height with 0.82, PAEE with 0.80, and weight with 0.77. They said it handled more challenging variables like BMI, oxygen consumption, and age with around 0.70 AUC. However, it’s worth noting that their work doesn’t consider the potential impact of imbalance in the dataset. Research has shown that much of the data used to train algorithms for health applications may perpetuate inequalities along racial, ethnic, geographic, and gender lines.
Despite this, the researchers assert that their study shows unlabeled wearable data can be used to learn profiles that generalize in situations where collecting ground truth isn’t feasible. “Such scenarios are of great importance in mobile health, where we may be able to achieve clinical-grade health inferences with widely adopted devices,” wrote the coauthors, who plan to present their work during the Machine Learning for Mobile Health workshop at the upcoming online NeurIPS 2020 conference. “Our work makes contributions in the area of transfer learning and personalized representations, one of utmost importance in machine learning for health.”