Real-time Driving Context Understanding using Deep Grid Net: A
Granular Approach
Numerous self-driving car algorithms rely on grid maps for motion planning, obstacle avoidance, or environment perception. Obtained from fused sensory information, occupancy grids (OGs) are nowadays among the most popular solutions used in series production in the automotive industry. In this paper, we extend Deep Grid Net (DGN) [18], a deep learning (DL) system designed for understanding the context in which an autonomous car is driving. We consider this paper a granular approach to the DGN method, owing to the improvements added to the original research [18]. DGN incorporates a learned driving environment representation based on OGs obtained from raw real-world Lidar data and constructed on top of Dempster-Shafer (DS) theory. Our system predicts in real time whether the vehicle is driving on the highway, on county roads, inside a city, in a parking lot, or is stuck in a traffic jam. The predicted driving context is further used for switching between different autonomous driving strategies implemented within EB robinos, Elektrobit's Autonomous Driving (AD) software platform. We propose a neuroevolutionary approach to search for the optimal hyperparameter set of DGN. Genetic algorithms (GAs) were selected due to their demonstrated ability to evolve deep neural networks with improved accuracy and processing speed. The performance of the proposed deep network has been evaluated against similar competing driving-context estimation classifiers.
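As a sketch of how the OG cells built on DS theory can fuse evidence from successive measurements, the following illustrates Dempster's rule of combination over a minimal two-hypothesis frame (free/occupied). The mass values and function names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: fusing two per-cell evidence masses with Dempster's rule
# of combination over the frame {F (free), O (occupied)}. A mass function
# is a dict mapping frozensets of hypotheses to belief mass.

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0  # K: total mass assigned to the empty intersection
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    # Normalize by the non-conflicting mass (1 - K)
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Two hypothetical Lidar scans giving evidence about the same grid cell;
# frozenset("FO") is the "unknown" hypothesis {F, O}.
scan1 = {frozenset("F"): 0.6, frozenset("O"): 0.1, frozenset("FO"): 0.3}
scan2 = {frozenset("F"): 0.5, frozenset("O"): 0.2, frozenset("FO"): 0.3}
fused = dempster_combine(scan1, scan2)
```

Agreeing evidence concentrates mass on "free" while shrinking the "unknown" mass, which is how repeated scans sharpen the occupancy estimate for a cell.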
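The neuroevolutionary hyperparameter search can be sketched as a plain genetic algorithm over a discrete search space. The search space, fitness function, and parameter names below are hypothetical stand-ins for DGN's actual hyperparameters and its validation accuracy, which would be obtained by training the network.

```python
import random

# Hypothetical hyperparameter search space; names are illustrative only.
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "num_filters": [16, 32, 64],
    "kernel_size": [3, 5, 7],
}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each gene comes from one of the two parents.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.3):
    # Re-draw each gene from the search space with probability `rate`.
    return {k: random.choice(SPACE[k]) if random.random() < rate else v
            for k, v in ind.items()}

def evolve(fitness, pop_size=10, generations=5, elite=2):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:elite]  # elitist selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - elite)]
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness standing in for the validation accuracy of a trained DGN.
def fitness(ind):
    return -abs(ind["learning_rate"] - 1e-3) - abs(ind["num_filters"] - 32)

best = evolve(fitness)
```

In the actual system, evaluating `fitness` would be the expensive step (train and validate one network per individual), which is why the population and generation counts stay small.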