• CN:11-2187/TH
  • ISSN:0577-6686

机械工程学报 (Journal of Mechanical Engineering), 2020, Vol. 56, Issue (2): 77-85. doi: 10.3901/JME.2020.02.077

• Vehicle Engineering •

Monocular Vision-based Estimation of Wheel Slip Ratio for Planetary Rovers in Soft Terrain

LÜ Fengtian1, GAO Haibo1, LI Nan1, DING Liang1, DENG Zongquan1, LIU Guangjun2

  1. State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin 150001, China;
    2. Department of Aerospace Engineering, Ryerson University, Toronto M2J4A6, Canada
  • Received: 2019-01-27  Revised: 2019-08-16  Online: 2020-01-20  Published: 2020-03-11
  • Corresponding author: LI Nan, male, born in 1986, PhD candidate. His research interests include planetary rover wheel-terrain interaction and machine vision. E-mail: lnlinanln@126.com
  • About the authors: LÜ Fengtian, male, born in 1990, PhD candidate. His research interests include planetary rover wheel-terrain interaction and machine vision. E-mail: hitlft@163.com; GAO Haibo, male, born in 1970, PhD, professor and doctoral supervisor. His research interests include mobile robots and deep space exploration. E-mail: gaohaibo@hit.edu.cn
  • Funding: Supported by the National Natural Science Foundation of China (51822502), the Science Fund for Creative Research Groups of the National Natural Science Foundation of China (51521003), the "111" Project (B07018), and the Self-planned Task of the State Key Laboratory of Robotics and System (Harbin Institute of Technology) (SKLRS201501B).

Abstract: Estimation of the wheel slip ratio is of great significance for the mobility control of planetary rovers: it helps a rover localize, navigate, and avoid wheel sinkage in soft soil. A model for estimating the wheel slip ratio is built by analyzing wheel-terrain images acquired at two adjacent moments, and a method that estimates the slip ratio using only visual means is proposed. The high-brightness weak boundaries of the wheel ruts in a wheel-terrain image are extracted, and the rut boundaries in two adjacent images are used to estimate the forward displacement and linear velocity of the wheel. A method for extracting mark points of the wheel in the wheel-terrain image is also proposed, and the mark points in two adjacent images are used to estimate the rotation angle and angular velocity of the wheel. In addition, an alternative method that estimates the slip ratio from the encoder reading and the wheel-terrain images is presented. Both methods are tested experimentally; the results show that both are effective, with slip ratio estimation errors below 9%. The proposed methods allow a rover to estimate the wheel slip ratio from the same wheel-terrain images used to detect wheel sinkage, thereby increasing the amount of wheel-state information that can be extracted from these images.
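For context, the quantities estimated from the image pairs feed the slip ratio definition commonly used in rover terramechanics. The exact formulation adopted in the paper is not stated in this abstract, so the following is only the conventional definition, with r the wheel radius, v the wheel linear (forward) velocity, and \omega the wheel angular velocity:

    s =
      \begin{cases}
        1 - \dfrac{v}{r\omega}, & r\omega \ge v \quad \text{(slipping)} \\[4pt]
        \dfrac{r\omega}{v} - 1, & r\omega < v   \quad \text{(skidding)}
      \end{cases}

Under this reading, v \approx \Delta d / \Delta t, where \Delta d is the rut-boundary displacement between two adjacent wheel-terrain images taken \Delta t apart, and \omega \approx \Delta\theta / \Delta t, where \Delta\theta is the rotation of the wheel mark points between the same two images (or, in the second method, the rotation reported by the encoder).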

Key words: slip ratio detection, visual detection, planetary rover, extraction of weak edges in images

CLC number: