A Robust Perception Algorithm Based on Millimeter-wave Radar and LiDAR Fusion for Intelligent Driving

DANG Xiangwei, QIN Fei, BU Xiangxi, LIANG Xingdong

DANG Xiangwei, QIN Fei, BU Xiangxi, et al. A robust perception algorithm based on a radar and LiDAR for intelligent driving[J]. Journal of Radars, 2021, 10(4): 622–631. doi: 10.12000/JR21036


doi: 10.12000/JR21036
Funds: The National Ministries Foundation
    Author biographies:

    DANG Xiangwei (1992–), male, from Zaozhuang, Shandong; Ph.D. candidate at the Aerospace Information Research Institute, Chinese Academy of Sciences. His research focuses on multi-sensor data fusion for autonomous navigation.

    QIN Fei (1994–), male, from Anyang, Henan; Ph.D. candidate at the Aerospace Information Research Institute, Chinese Academy of Sciences. His research focuses on change detection of weak targets in radar images.

    BU Xiangxi (1991–), male, from Jinan, Shandong; Ph.D., Assistant Researcher at the Aerospace Information Research Institute, Chinese Academy of Sciences. His research focuses on synthetic aperture radar imaging processing.

    LIANG Xingdong (1973–), male, from Xi'an, Shaanxi; Ph.D., Researcher and doctoral supervisor at the Aerospace Information Research Institute, Chinese Academy of Sciences. His research focuses on high-resolution synthetic aperture radar systems, imaging processing and applications, and real-time signal processing.

    Corresponding author:

    LIANG Xingdong, xdliang@mail.ie.ac.cn

  • Corresponding Editor: ZHANG Zenghui
  • CLC number: TN959.5

  • Abstract: Perception based on multi-sensor fusion is one of the key technologies for intelligent driving and has become a hot topic in the field. However, millimeter-wave radar has limited resolution and is easily disturbed by noise, clutter, and multipath, while LiDAR is sensitive to weather; existing fusion algorithms therefore struggle to fuse the two sensors' data precisely and produce robust results. To achieve accurate and robust perception for intelligent driving, this paper proposes a robust perception algorithm that fuses millimeter-wave radar and LiDAR. A new feature-based two-step registration method for spatial calibration achieves precise spatial synchronization between the 3D LiDAR point cloud and the 2D millimeter-wave radar point cloud. An improved millimeter-wave radar filtering algorithm reduces the influence of noise and multipath on the radar point cloud. The two sensors' data are then combined with the novel fusion method proposed in this paper to obtain accurate and robust perception results, overcoming the degradation of LiDAR performance by smoke. Finally, experiments in real scenarios verify the effectiveness and robustness of the algorithm: perception remains accurate and robust even in extreme environments such as smoke. The environment map built with the proposed fusion method is more precise, and the resulting localization error is at least 50% lower than with a single sensor.
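The abstract's spatial synchronization step maps the 2D millimeter-wave radar point cloud into the 3D LiDAR frame once the extrinsics have been estimated by the two-step feature-based registration. As a minimal illustrative sketch (not the paper's registration method), assuming a rotation `R` and translation `t` are already known, the radar points can be lifted to z = 0 in the radar scan plane and transformed; the function name `radar_to_lidar_frame` and the example extrinsics are hypothetical:

```python
import numpy as np

def radar_to_lidar_frame(radar_xy, R, t):
    """Lift 2D mmWave radar points (x, y in the radar scan plane) to z = 0,
    then map them into the 3D LiDAR frame with extrinsics (R, t)."""
    pts = np.column_stack([radar_xy, np.zeros(len(radar_xy))])  # (N, 3)
    return pts @ R.T + t

# Hypothetical extrinsics: radar mounted 0.5 m below the LiDAR, axes aligned.
radar_xy = np.array([[10.0, 2.0], [5.0, -1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, -0.5])
print(radar_to_lidar_frame(radar_xy, R, t))
```

With aligned axes the transform reduces to appending the mounting offset to each radar point; a real calibration would also estimate the rotation between the radar plane and the LiDAR frame.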

     

  • Figure 1. Camera and LiDAR data in a smoky environment

    Figure 2. Block diagram of multi-sensor information fusion

    Figure 3. Time synchronization diagram

    Figure 4. Spatial synchronization flow chart

    Figure 5. Multi-sensor spatial calibration results

    Figure 6. Millimeter-wave radar point cloud filtering results

    Figure 7. Three-stage filtering flow chart for the millimeter-wave radar point cloud

    Figure 8. LiDAR point cloud smoke recognition results

    Figure 9. Experimental results in a smoky environment

    Figure 10. Mapping results based on LiDAR and millimeter-wave radar

    Figure 11. Multi-sensor data fusion results in a smoky environment

    Figure 12. Mapping results after sensor data fusion

    Figure 13. Localization results using different sensor data

    Table 1. Millimeter-wave radar system parameters

    Parameter                Value
    Frequency (GHz)          92
    Bandwidth (GHz)          2
    Range resolution (m)     0.07
    Azimuth beamwidth (°)    0.6
    Scan range (°)           0–360
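The quoted range resolution is consistent with the radar's sweep bandwidth via the standard FMCW relation ΔR = c/(2B), shown here as a quick check (the table's 0.07 m figure is slightly tighter than the ideal 0.075 m, which plausibly reflects rounding or windowing):

```python
# Ideal FMCW range resolution from sweep bandwidth: dR = c / (2 * B).
c = 3.0e8          # speed of light, m/s
B = 2.0e9          # bandwidth from Table 1, Hz
dR = c / (2 * B)   # 0.075 m, close to the 0.07 m listed in Table 1
print(dR)
```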
Publication history
  • Received: 2021-03-19
  • Revised: 2021-04-28
  • Published online: 2021-05-20
  • Published: 2021-08-28
