


Estimating Vehicle Movement Direction from Smartphone Accelerometers Using Deep Neural Networks.

Affiliations

Grupo de Aplicaciones de Procesado de Señales (GAPS), Universidad Politécnica de Madrid, 28040 Madrid, Spain.

Publication

Sensors (Basel). 2018 Aug 10;18(8):2624. doi: 10.3390/s18082624.

DOI: 10.3390/s18082624
PMID: 30103422
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6111255/
Abstract

Characterization of driving maneuvers or driving styles through motion sensors has become a field of great interest. Until recently, this characterization was carried out with signals coming from extra equipment installed inside the vehicle, such as On-Board Diagnostic (OBD) devices or sensors in pedals. Nowadays, with the evolution and scope of smartphones, these have become the devices for recording mobile signals in many driving characterization applications. Normally multiple available sensors are used, such as accelerometers, gyroscopes, magnetometers or the Global Positioning System (GPS). However, using sensors such as GPS increases battery consumption significantly, and, additionally, many current phones do not include gyroscopes. Therefore, we propose the characterization of driving style through only the use of smartphone accelerometers. We propose a deep neural network (DNN) architecture that combines convolutional and recurrent networks to estimate the vehicle movement direction (VMD), which is the forward movement directional vector captured in a phone's coordinates. Once VMD is obtained, multiple applications such as characterizing driving styles or detecting dangerous events can be developed. In the development of the proposed DNN architecture, two different methods are compared. The first one is based on the detection and classification of significant acceleration driving forces, while the second one relies on longitudinal and transversal signals derived from the raw accelerometers. The final success rate of VMD estimation for the best method is 90.07%.
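The abstract describes a DNN that combines convolutional and recurrent networks to map windows of 3-axis accelerometer samples to the VMD, a unit direction vector in phone coordinates. A minimal PyTorch-style sketch of such an architecture follows; the class name, layer sizes, and window length are illustrative assumptions, not the paper's reported configuration:

```python
import torch
import torch.nn as nn

class VMDNet(nn.Module):
    """Illustrative CNN+RNN regressor: a window of 3-axis accelerometer
    samples -> a unit forward-direction vector (VMD) in phone coordinates."""
    def __init__(self, hidden=64):
        super().__init__()
        # 1-D convolutions over the time axis extract local motion features
        self.conv = nn.Sequential(
            nn.Conv1d(3, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        # a recurrent layer summarizes the feature sequence
        self.rnn = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)  # 3-D direction vector

    def forward(self, x):                 # x: (batch, time, 3)
        f = self.conv(x.transpose(1, 2)).transpose(1, 2)
        _, (h, _) = self.rnn(f)
        v = self.head(h[-1])
        return v / v.norm(dim=-1, keepdim=True)  # normalize to a unit VMD

window = torch.randn(8, 128, 3)           # 8 windows of 128 samples each
print(VMDNet()(window).shape)             # torch.Size([8, 3])
```

Normalizing the output keeps the network committed to a direction only, which matches the abstract's definition of VMD as a directional vector rather than a full motion estimate.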

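The abstract's second method relies on longitudinal and transversal signals derived from the raw accelerometers. A minimal NumPy sketch of that decomposition, assuming the VMD and the gravity direction are already known; the function and vector names are illustrative, not taken from the paper:

```python
import numpy as np

def decompose(acc, vmd, gravity):
    """Project raw accelerometer samples acc (N, 3, phone coordinates)
    onto the forward (longitudinal) and lateral (transversal) axes."""
    g = gravity / np.linalg.norm(gravity)
    fwd = vmd - np.dot(vmd, g) * g       # forward axis, orthogonal to gravity
    fwd /= np.linalg.norm(fwd)
    lat = np.cross(g, fwd)               # transversal axis (turning direction)
    longitudinal = acc @ fwd             # (N,) braking/accelerating component
    transversal = acc @ lat              # (N,) lateral component
    return longitudinal, transversal

# Toy samples: phone y-axis pointing forward, z-axis up
acc = np.array([[0.0, 1.0, 0.2],
                [0.3, 0.5, 0.1]])
lon, lat = decompose(acc,
                     vmd=np.array([0.0, 1.0, 0.0]),
                     gravity=np.array([0.0, 0.0, -9.8]))
# lon -> [1.0, 0.5], lat -> [0.0, 0.3]
```

Once the VMD is estimated, this projection is what turns arbitrary phone-frame accelerations into the driving-style signals (braking, accelerating, turning) that the abstract mentions as downstream applications.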

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/86492e85aa88/sensors-18-02624-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/01d014f6551e/sensors-18-02624-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/b0eab9af53cc/sensors-18-02624-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/32d1820e382c/sensors-18-02624-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/d13a531a6c4e/sensors-18-02624-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/a7f5a12edff4/sensors-18-02624-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/dea59b8a2c34/sensors-18-02624-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/47d5dab829cd/sensors-18-02624-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/9357aed8bd99/sensors-18-02624-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/762c1ceb9257/sensors-18-02624-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/3ba6179867d9/sensors-18-02624-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/be068d83b238/sensors-18-02624-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/bf70e9817471/sensors-18-02624-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/3ef95119e34b/sensors-18-02624-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/0bb43810b112/sensors-18-02624-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/55c1249ee129/sensors-18-02624-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/ef04b8eb4439/sensors-18-02624-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/d289639566e8/sensors-18-02624-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/65cd/6111255/8234130b4bc4/sensors-18-02624-g019.jpg

Similar Articles

1
Estimating Vehicle Movement Direction from Smartphone Accelerometers Using Deep Neural Networks.
Sensors (Basel). 2018 Aug 10;18(8):2624. doi: 10.3390/s18082624.
2
Measuring Risky Driving Behavior Using an mHealth Smartphone App: Development and Evaluation of gForce.
JMIR Mhealth Uhealth. 2018 Apr 19;6(4):e69. doi: 10.2196/mhealth.9290.
3
Convolutional Neural Network-Based Classification of Driver's Emotion during Aggressive and Smooth Driving Using Multi-Modal Camera Sensors.
Sensors (Basel). 2018 Mar 23;18(4):957. doi: 10.3390/s18040957.
4
On the Application of Time Frequency Convolutional Neural Networks to Road Anomalies' Identification with Accelerometers and Gyroscopes.
Sensors (Basel). 2020 Nov 10;20(22):6425. doi: 10.3390/s20226425.
5
Evaluation of 1D and 2D Deep Convolutional Neural Networks for Driving Event Recognition.
Sensors (Basel). 2022 Jun 1;22(11):4226. doi: 10.3390/s22114226.
6
Who sits where? Infrastructure-free in-vehicle cooperative positioning via smartphones.
Sensors (Basel). 2014 Jun 30;14(7):11605-28. doi: 10.3390/s140711605.
7
StresSense: Real-Time detection of stress-displaying behaviors.
Int J Med Inform. 2024 May;185:105401. doi: 10.1016/j.ijmedinf.2024.105401. Epub 2024 Mar 7.
8
Smartphone Location Recognition: A Deep Learning-Based Approach.
Sensors (Basel). 2019 Dec 30;20(1):214. doi: 10.3390/s20010214.
9
Predicting Human Motion Signals Using Modern Deep Learning Techniques and Smartphone Sensors.
Sensors (Basel). 2021 Dec 10;21(24):8270. doi: 10.3390/s21248270.
10
Smartphone-Based Activity Recognition for Indoor Localization Using a Convolutional Neural Network.
Sensors (Basel). 2019 Feb 1;19(3):621. doi: 10.3390/s19030621.

Cited By

1
Gesture-Based Interactions: Integrating Accelerometer and Gyroscope Sensors in the Use of Mobile Apps.
Sensors (Basel). 2024 Feb 4;24(3):1004. doi: 10.3390/s24031004.
2
Reducing the Impact of Sensor Orientation Variability in Human Activity Recognition Using a Consistent Reference System.
Sensors (Basel). 2023 Jun 23;23(13):5845. doi: 10.3390/s23135845.
3
Experimental Study on Longitudinal Acceleration of Urban Buses and Coaches in Different Road Maneuvers.
Sensors (Basel). 2023 Mar 15;23(6):3125. doi: 10.3390/s23063125.

References

1
Vehicle Mode and Driving Activity Detection Based on Analyzing Sensor Data of Smartphones.
Sensors (Basel). 2018 Mar 29;18(4):1036. doi: 10.3390/s18041036.
2
Driver behavior profiling: An investigation with different smartphone sensors and machine learning.
PLoS One. 2017 Apr 10;12(4):e0174959. doi: 10.1371/journal.pone.0174959. eCollection 2017.
3
Vehicle Maneuver Detection with Accelerometer-Based Classification.
Sensors (Basel). 2016 Sep 29;16(10):1618. doi: 10.3390/s16101618.
4
Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition.
Sensors (Basel). 2016 Jan 18;16(1):115. doi: 10.3390/s16010115.