Hassan Ghasemzadeh
Annu Int Conf IEEE Eng Med Biol Soc. 2016 Aug;2016:3105-3108. doi: 10.1109/EMBC.2016.7591386.
In this paper, we introduce an Asynchronous Multiview Learning (AML) approach to allow accurate transfer of activity classification models across asynchronous sensor views. Our study is motivated by the highly dynamic nature of health monitoring using wearable sensors. Such dynamics include changes in the sensing platform (e.g., a sensor upgrade) and in platform settings (e.g., sampling frequency, on-body sensor location), which cause machine learning algorithms to fail if they are not retrained for the new setting. Our approach allows machine learning algorithms to reconfigure automatically, without any labeled training data in the new setting. Our evaluation using real data collected with wearable motion sensors demonstrates that the average classification accuracy using our automatically labeled training data is 85.2%. This accuracy is only 3.4% to 4.5% less than the experimental upper bound, where ground-truth labeled training data are used to develop a new activity recognition classifier.
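The core idea of transferring a classifier to a new sensor view without ground-truth labels can be illustrated with a minimal pseudo-labeling sketch. This is not the paper's AML algorithm; it is a simplified, assumed setup in which the old (source) and new (target) views briefly observe the same activities at the same time, so predictions from the source-view model can serve as automatic labels for simultaneously recorded target-view samples. All names, the synthetic data, and the nearest-centroid classifier are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_view(labels, offset):
    # Each activity class maps to a distinct mean feature vector;
    # `offset` models a view-specific shift (e.g., a new sampling
    # rate or sensor location changing the feature distribution).
    means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]]) + offset
    return means[labels] + rng.normal(scale=0.5, size=(len(labels), 2))

def fit_centroids(X, y):
    # Nearest-centroid classifier: one prototype per class.
    return np.stack([X[y == c].mean(axis=0) for c in np.unique(y)])

def predict(centroids, X):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# Ground-truth labels exist only for the source view's training data.
y_true = rng.integers(0, 3, size=300)
X_src = simulate_view(y_true, offset=0.0)   # old sensor view
X_tgt = simulate_view(y_true, offset=1.5)   # new view, same activities,
                                            # recorded at the same times
src_model = fit_centroids(X_src, y_true)

# Transfer step: the source model labels its own stream during the
# overlap; because both views observe the same events simultaneously,
# those labels carry over to the target-view samples, which are then
# used to train a fresh target-view model.
pseudo_labels = predict(src_model, X_src)
tgt_model = fit_centroids(X_tgt, pseudo_labels)

# Evaluate the automatically trained target model on fresh target data.
y_test = rng.integers(0, 3, size=200)
X_test = simulate_view(y_test, offset=1.5)
acc = (predict(tgt_model, X_test) == y_test).mean()
print(f"target-view accuracy with automatic labels: {acc:.2f}")
```

In this toy setup, the target model trained purely on automatic labels recovers most of the accuracy of a ground-truth-trained model, mirroring the paper's observation that automatically labeled data comes within a few percentage points of the labeled-data upper bound.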