
Journal of Mechanical Engineering, 2022, Vol. 58, Issue (24): 163-177. doi: 10.3901/JME.2022.24.163


Deep Reinforcement Learning-based Integrated Control of Hybrid Electric Vehicles Driven by High Definition Map in Cloud Control System

TANG Xiaolin1, CHEN Jiaxin1, GAO Bolin2, YANG Kai1, HU Xiaosong1, LI Keqiang2   

  1. College of Mechanical and Vehicle Engineering, Chongqing University, Chongqing 400044;
    2. State Key Laboratory of Automotive Safety and Energy, Tsinghua University, Beijing 100084
  • Received: 2022-03-09  Revised: 2022-07-15  Online: 2022-12-20  Published: 2023-04-03

Abstract: In the context of the development of intelligence, connectivity, and new energy, the automotive industry is integrating computer, information and communication, and artificial intelligence (AI) technologies to achieve coordinated development. Based on the cloud control system (CCS) of intelligent and connected vehicles (ICVs), a new-generation information and communication technology, cloud-level automated driving of new energy vehicles is realized, driven by connected data, which provides innovative planning and control ideas for the vehicle driving and power systems. Firstly, based on the resource platform of the CCS, the latitude, longitude, altitude, and weather of the target road are obtained, and a high definition (HD) path model including slope, curvature, and steering angle is established. Secondly, a deep reinforcement learning (DRL)-based integrated control method for a hybrid electric vehicle (HEV), driven by the HD path model, is proposed. By adopting two DRL algorithms, the speed and steering of the vehicle and the engine and transmission of the powertrain are controlled, and synchronous learning of the four control strategies is realized. Finally, processor-in-the-loop (PIL) tests are performed using the high-performance edge computing device NVIDIA Jetson AGX Xavier. The results show that, under a variable space comprising 14 states and 4 actions, the DRL-based integrated control strategy achieves precise control of vehicle speed and steering over a 172 km high-speed driving cycle and attains a fuel consumption of 5.53 L/100 km. Meanwhile, it consumes only 104.14 s in the PIL test, which verifies the optimality and real-time performance of the learning-based multi-objective integrated control strategy.
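To make the abstract's control setup more concrete, the following is a minimal sketch, not the authors' implementation, of a generic DRL control loop with the 14-dimensional state and 4-dimensional action space mentioned above. The specific DRL algorithms are not named in the abstract, and the environment dynamics, reward terms, and variable definitions here are illustrative assumptions only.

```python
# Minimal sketch (assumptions, not the paper's code): a generic DRL control loop
# over a 14-state, 4-action space, as described in the abstract.
import numpy as np

STATE_DIM = 14   # assumed examples: speed, heading, slope, curvature, SOC, gear, ...
ACTION_DIM = 4   # assumed examples: acceleration, steering, engine power, gear shift

class HEVPathEnv:
    """Hypothetical environment built from an HD path model (slope, curvature,
    steering angle) and a simplified HEV powertrain."""
    def reset(self):
        return np.zeros(STATE_DIM, dtype=np.float32)

    def step(self, action):
        # Placeholder dynamics: the real model would propagate vehicle and
        # powertrain states along the 172 km driving cycle.
        next_state = np.random.randn(STATE_DIM).astype(np.float32)
        reward = -float(np.abs(action).sum())  # stand-in for tracking + fuel cost
        done = False
        return next_state, reward, done

def policy(state, weights):
    """Linear stand-in for a DRL actor network (tanh-bounded continuous actions)."""
    return np.tanh(weights @ state)

env = HEVPathEnv()
weights = np.zeros((ACTION_DIM, STATE_DIM), dtype=np.float32)
state = env.reset()
for _ in range(10):  # a few illustrative steps of the integrated control loop
    action = policy(state, weights)
    state, reward, done = env.step(action)
```

In the paper's setting, the policy would be trained with the two DRL algorithms so that the four control strategies (vehicle speed, steering, engine, and transmission) are learned synchronously; this sketch only shows the interface shape of such a loop.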

Key words: cloud control system, high definition map, deep reinforcement learning, hybrid electric vehicle, integrated control
