mmWave Radar: Enhancing Resolution, Target Recognition, and Fusion with Other Sensors
Author
Zhang, Renyuan
Issue Date
2019
Keywords
Kalman filter
micro-Doppler signatures
mmWave radar
sensor fusion
signal processing
synthetic aperture radar
Advisor
Cao, Siyang
Publisher
The University of Arizona.
Rights
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract
Over the last decade, research on advanced driver assistance systems (ADAS) and autonomous driving has grown rapidly. The entire automotive industry is looking forward to autonomous vehicles and ADAS technologies. Fully autonomous driving at automation level 4 or 5 by model year 2021/2022 requires a system of multiple heterogeneous sensors. Automotive sensors such as the camera, millimeter-wave (mmWave) radar, and lidar have evolved rapidly in signal processing for the perception of surroundings. Sensor fusion and deep learning for understanding the environment, as implemented in automobiles, are drastically changing current sensor research. The automotive radar has served as an essential sensor in the race to develop ADAS and autonomous vehicles. Its affordable price and reliable detection are drawing attention from both industry and academia. In 2018, unit shipments of passenger-vehicle automotive radars grew 54% compared to 2017. Another trend is that fusing camera and radar provides more reliable ADAS capabilities.

In this dissertation, a series of signal processing techniques is studied for improving the resolution and target recognition of mmWave radar. First, a sensor fusion technique for better tracking and detecting targets using mmWave radar and a camera is presented. The fusion system takes into consideration the error bounds (EBs) of the two different coordinate systems of the heterogeneous sensors, and a new fusion extended Kalman filter (fusion-EKF) is designed to adapt to the two sensors. Details such as synchronization between sensors, multi-target tracking, and association are also considered and illustrated. Experiments show that the proposed fusion system achieves a range accuracy of 0.29 m and an angular accuracy of 0.013 rad in real time. Therefore, the proposed fusion system is effective, reliable, and computationally efficient for real-time kinematic fusion applications. A clustering method, REDBSCAN, for radar point-cloud data is also presented. Second, to enhance target recognition, a neural network is developed for mmWave radar to classify human behavior in real time. Third, to improve the angular resolution of mmWave radar, a circular synthetic aperture radar (MMWCSAR) with a high-resolution technique, i.e., compressed sensing, is presented.
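To illustrate the kind of radar-camera fusion the abstract describes, below is a minimal sketch of an extended-Kalman-filter update that fuses a polar radar measurement (range, azimuth) with a Cartesian camera-derived position. It is not the dissertation's fusion-EKF: the constant-velocity state model, the noise values, and the measurement models are illustrative assumptions, and the error-bound handling, sensor synchronization, and multi-target association discussed in the abstract are omitted.

import numpy as np

class FusionEKFSketch:
    """Illustrative radar-camera EKF; state is [x, y, vx, vy] in 2D."""

    def __init__(self):
        self.x = np.zeros(4)                        # state estimate
        self.P = np.eye(4) * 1e3                    # large initial uncertainty
        self.Q = np.eye(4) * 0.1                    # process noise (hypothetical)
        self.R_cam = np.eye(2) * 0.5                # camera noise, Cartesian (hypothetical)
        self.R_radar = np.diag([0.1, 0.01])         # radar noise, (range, azimuth) (hypothetical)

    def predict(self, dt):
        # Constant-velocity motion model.
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update_camera(self, z):
        # Camera measurement is already Cartesian position, so H is linear.
        H = np.array([[1., 0., 0., 0.],
                      [0., 1., 0., 0.]])
        self._update(z, H @ self.x, H, self.R_cam)

    def update_radar(self, z):
        # Radar measures (range, azimuth); linearize about the current state.
        px, py = self.x[0], self.x[1]
        rng = np.hypot(px, py)
        h = np.array([rng, np.arctan2(py, px)])
        H = np.array([[ px / rng,      py / rng,     0., 0.],
                      [-py / rng**2,   px / rng**2,  0., 0.]])
        self._update(z, h, H, self.R_radar)

    def _update(self, z, z_pred, H, R):
        y = z - z_pred                              # innovation
        S = H @ self.P @ H.T + R                    # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P

A practical implementation would also wrap the azimuth innovation into [-pi, pi] and gate measurements before association; the point of the sketch is only that radar and camera updates can share one state estimate while each keeps its own measurement model and noise covariance.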
Type
text
Electronic Dissertation
Degree Name
Ph.D.
Degree Level
doctoral
Degree Program
Graduate College
Electrical & Computer Engineering