Suppr 超能文献


Estimation of Antenna Pose in the Earth Frame Using Camera and IMU Data from Mobile Phones.

Authors

Wang Zhen, Jin Bingwen, Geng Weidong

Affiliations

College of Computer Science and Technology, Zhejiang University, Zhejiang 310000, China.

Publication

Sensors (Basel). 2017 Apr 8;17(4):806. doi: 10.3390/s17040806.

DOI: 10.3390/s17040806
PMID: 28397765
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5422167/
Abstract

The poses of base station antennas play an important role in cellular network optimization. Existing methods of pose estimation are based on physical measurements performed either by tower climbers or using additional sensors attached to antennas. In this paper, we present a novel non-contact method of antenna pose measurement based on multi-view images of the antenna and inertial measurement unit (IMU) data captured by a mobile phone. Given a known 3D model of the antenna, we first estimate the antenna pose relative to the phone camera from the multi-view images and then employ the corresponding IMU data to transform the pose from the camera coordinate frame into the Earth coordinate frame. To enhance the resulting accuracy, we improve existing camera-IMU calibration models by introducing additional degrees of freedom between the IMU sensors and defining a new error metric based on both the downtilt and azimuth angles, instead of a unified rotational error metric, to refine the calibration. In comparison with existing camera-IMU calibration methods, our method achieves an improvement in azimuth accuracy of approximately 1.0 degree on average while maintaining the same level of downtilt accuracy. For the pose estimation in the camera coordinate frame, we propose an automatic method of initializing the optimization solver and generating bounding constraints on the resulting pose to achieve better accuracy. With this initialization, state-of-the-art visual pose estimation methods yield satisfactory results in more than 75% of cases when plugged into our pipeline, and our solution, which takes advantage of the constraints, achieves even lower estimation errors on the downtilt and azimuth angles, both on average (0.13 and 0.3 degrees lower, respectively) and in the worst case (0.15 and 7.3 degrees lower, respectively), according to an evaluation conducted on a dataset consisting of 65 groups of data. 
We show that both of our enhancements contribute to the performance improvement offered by the proposed estimation pipeline, which achieves downtilt and azimuth accuracies of respectively 0.47 and 5.6 degrees on average and 1.38 and 12.0 degrees in the worst case, thereby satisfying the accuracy requirements for network optimization in the telecommunication industry.
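The pipeline's final step reports the antenna's downtilt and azimuth angles in the Earth frame, obtained by composing the camera-frame pose from the vision stage with the camera-to-Earth rotation from the calibrated IMU. A minimal sketch of that frame composition and angle extraction is below; the East-North-Up convention, the boresight axis, and all function names are illustrative assumptions, not taken from the paper.

```python
import math

def mat_mul(A, B):
    """3x3 matrix product using plain nested lists (no external dependencies)."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(R, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def downtilt_azimuth(R_earth_antenna, boresight=(0.0, 0.0, 1.0)):
    """Downtilt and azimuth (degrees) of the antenna boresight in an
    East-North-Up Earth frame. Which antenna-frame axis is the boresight
    is an assumption of this sketch."""
    e, n, u = mat_vec(R_earth_antenna, list(boresight))
    azimuth = math.degrees(math.atan2(e, n)) % 360.0             # 0 deg = north
    downtilt = math.degrees(math.asin(max(-1.0, min(1.0, -u))))  # + = below horizon
    return downtilt, azimuth

# Frame chain of the pipeline: the vision stage yields R_cam_antenna, the
# calibrated IMU yields R_earth_cam, and the Earth-frame pose is their
# composition: R_earth_antenna = mat_mul(R_earth_cam, R_cam_antenna).

# Check with a synthetic pose: rotating the boresight 100 degrees about the
# north axis should point it due east with a 10-degree downtilt.
theta = math.radians(100.0)
c, s = math.cos(theta), math.sin(theta)
R = [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]
tilt, az = downtilt_azimuth(R)
print(round(tilt, 6), round(az, 6))  # 10.0 90.0
```

Reporting errors separately on downtilt and azimuth, as this function does, mirrors the paper's choice of a two-angle error metric over a single unified rotational error for the camera-IMU calibration refinement.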


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/2c4014afeb17/sensors-17-00806-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/9eace30b71d8/sensors-17-00806-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/58ffc028c0f4/sensors-17-00806-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/64e01f464c36/sensors-17-00806-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/89459acd6d5b/sensors-17-00806-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/9bfd5cef8f46/sensors-17-00806-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/eefdb5796790/sensors-17-00806-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/5a446ca5cdb8/sensors-17-00806-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/3a3d0e612977/sensors-17-00806-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/7bc105471124/sensors-17-00806-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/251ca1eba5be/sensors-17-00806-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/f84b74037aee/sensors-17-00806-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/1ec54e974f8d/sensors-17-00806-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/cc302151f014/sensors-17-00806-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/c6ad5404fcd7/sensors-17-00806-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/824644dd0942/sensors-17-00806-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/85b1/5422167/841180cde500/sensors-17-00806-g017.jpg

Similar Articles

1. Estimation of Antenna Pose in the Earth Frame Using Camera and IMU Data from Mobile Phones.
   Sensors (Basel). 2017 Apr 8;17(4):806. doi: 10.3390/s17040806.
2. Online IMU Self-Calibration for Visual-Inertial Systems.
   Sensors (Basel). 2019 Apr 4;19(7):1624. doi: 10.3390/s19071624.
3. Optimization-Based Online Initialization and Calibration of Monocular Visual-Inertial Odometry Considering Spatial-Temporal Constraints.
   Sensors (Basel). 2021 Apr 10;21(8):2673. doi: 10.3390/s21082673.
4. Hybrid Indoor Localization Using IMU Sensors and Smartphone Camera.
   Sensors (Basel). 2019 Nov 21;19(23):5084. doi: 10.3390/s19235084.
5. Artificial Marker and MEMS IMU-Based Pose Estimation Method to Meet Multirotor UAV Landing Requirements.
   Sensors (Basel). 2019 Dec 9;19(24):5428. doi: 10.3390/s19245428.
6. On Inertial Body Tracking in the Presence of Model Calibration Errors.
   Sensors (Basel). 2016 Jul 22;16(7):1132. doi: 10.3390/s16071132.
7. Resolving Position Ambiguity of IMU-Based Human Pose with a Single RGB Camera.
   Sensors (Basel). 2020 Sep 23;20(19):5453. doi: 10.3390/s20195453.
8. LiDAR-Stabilised GNSS-IMU Platform Pose Tracking.
   Sensors (Basel). 2022 Mar 14;22(6):2248. doi: 10.3390/s22062248.
9. Extended Kalman filter-based methods for pose estimation using visual, inertial and magnetic sensors: comparative analysis and performance evaluation.
   Sensors (Basel). 2013 Feb 4;13(2):1919-41. doi: 10.3390/s130201919.
10. A Novel IMU Extrinsic Calibration Method for Mass Production Land Vehicles.
   Sensors (Basel). 2020 Dec 22;21(1):7. doi: 10.3390/s21010007.

References Cited in This Article

1. Real-Time Tracking of Single and Multiple Objects from Depth-Colour Imagery Using 3D Signed Distance Functions.
   Int J Comput Vis. 2017;124(1):80-95. doi: 10.1007/s11263-016-0978-2. Epub 2017 Jan 11.
2. Real-Time 3D Tracking and Reconstruction on Mobile Phones.
   IEEE Trans Vis Comput Graph. 2015 May;21(5):557-70. doi: 10.1109/TVCG.2014.2355207.
3. Gradient response maps for real-time detection of textureless objects.
   IEEE Trans Pattern Anal Mach Intell. 2012 May;34(5):876-88. doi: 10.1109/TPAMI.2011.206.