Manton Sean, Magerowski Greta, Patriarca Laura, Alonso-Alonso Miguel
Laboratory of Bariatric and Nutritional Neuroscience, Center for the Study of Nutrition Medicine, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA.
Front Psychol. 2016 Feb 11;7:142. doi: 10.3389/fpsyg.2016.00142. eCollection 2016.
Studying how humans eat in the context of a meal is important for understanding basic mechanisms of food intake regulation and can help develop new interventions to promote healthy eating and prevent obesity and eating disorders. While a number of methodologies are available for behavioral evaluation of a meal, there is a need for new tools that can simplify data collection through automatic and online analysis. In addition, no current methods leverage technology to add a dimension of interactivity to the meal table. In this study, we examined the feasibility of a new technology for automatic detection and classification of bites during a laboratory meal. We used a SUR40 multi-touch tabletop computer, which senses objects on its surface via an infrared camera behind the screen. Tags attached to three plates allowed their positions to be tracked, and the saturation (a measure of infrared intensity) in the region surrounding each plate was measured. A Kinect camera recorded the meals for manual verification and provided gesture detection of when bites were taken. Each bite detection triggered classification of the source plate by the SUR40, based on the saturation flux in the preceding time window. Five healthy subjects (aged 20-40 years, one female) were tested, providing a total sample of 320 bites. Sensitivity, defined as the number of correctly detected bites out of the number of actual bites, was 67.5%. Classification accuracy, defined as the number of correctly classified bites out of those detected, was 82.4%. Because of the poor sensitivity, a second experiment was designed using a single plate and a Myo armband containing a nine-axis inertial measurement unit (IMU) as an alternative method for bite detection. The same subjects were tested (sample: 195 bites). Using a simple threshold on the pitch reading from the armband's IMU, the Myo data achieved 86.1% sensitivity vs. 60.5% with the Kinect. Further, the precision (positive predictive value) was 72.1% for the Myo vs.
42.8% for the Kinect. We conclude that the SUR40 + Myo combination is feasible for automatic detection and classification of bites with adequate accuracy for a range of applications.
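The two processing steps described in the abstract (threshold-based bite detection on the armband's pitch signal, and source-plate classification from saturation flux in the preceding window) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, threshold value, refractory period, and window length are hypothetical, and the input arrays stand in for the real SUR40 and Myo data streams.

```python
import numpy as np

def detect_bites(pitch, threshold=-0.5, refractory=20):
    """Detect bites as upward crossings of a pitch threshold.

    pitch      : 1-D array of pitch samples from the armband's IMU
    threshold  : pitch value above which the hand is taken to be at the mouth
                 (illustrative value, not the paper's)
    refractory : minimum number of samples between consecutive bites,
                 to avoid double-counting one wrist movement
    """
    bites = []
    last = -refractory
    above = pitch[0] > threshold
    for i in range(1, len(pitch)):
        crossing = pitch[i] > threshold and not above
        if crossing and i - last >= refractory:
            bites.append(i)
            last = i
        above = pitch[i] > threshold
    return bites

def classify_bite(saturation, t, window=30):
    """Attribute a bite at sample t to the plate whose surrounding
    saturation changed most (largest flux) in the preceding window.

    saturation : 2-D array, shape (n_plates, n_samples), of per-plate
                 infrared saturation readings from the tabletop
    """
    start = max(0, t - window)
    seg = saturation[:, start:t + 1]
    flux = seg.max(axis=1) - seg.min(axis=1)
    return int(np.argmax(flux))
```

A rising-edge detector with a refractory period is one simple way to turn a continuous orientation signal into discrete bite events; the flux-based classifier assumes that removing food from a plate perturbs the infrared saturation near that plate more than near the others.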