Anytime Computing Techniques for Lidar-Based Perception in Cyber-Physical Systems
Michael Branicky
Prasad Kulkarni
Hongyang Sun
Shawn Keshmiri
The pursuit of autonomy in cyber-physical systems (CPS) requires real-time interaction with the physical world, a challenge that has prompted extensive research in this domain. Recent advancements in artificial intelligence (AI), particularly deep neural networks (DNNs), have significantly enhanced CPS autonomy, most notably its perception capabilities.
CPS perception aims to detect, classify, and track objects of interest in the operational environment, a task that is considerably challenging for computers in three-dimensional (3D) space. For object detection, processing lidar sensor readings with DNNs has become popular due to its excellent accuracy.
However, in systems such as self-driving cars and drones, object detection must be both accurate and timely, which is challenging given the high computational demand of lidar object detection DNNs. Furthermore, these DNNs lack the capability to dynamically trade accuracy for reduced execution time (i.e., anytime computing). This adaptability is crucial because deadline constraints can change with the operational environment and the internal status of the system.
Prior research on anytime computing for camera-based object detection DNNs is not applicable to lidar-based detection due to architectural differences. To address this challenge, this thesis proposes novel techniques such as Anytime-Lidar and VALO (Versatile Anytime Lidar Object Detection), which enable lidar-based object detection DNNs to make effective tradeoffs between latency and accuracy. Finally, the thesis aims to integrate the proposed anytime object detection techniques into unmanned aerial vehicles and to introduce a system-level scheduler capable of managing multiple anytime-capable tasks.