Abstract
The long-tail effect is characterized by the highly frequent occurrence of normal scenarios and the scarce appearance of extreme long-tail scenarios. Although many computer vision methods already achieve satisfactory performance in most normal scenarios, it remains challenging for existing systems to accurately perceive long-tail scenarios, which hinders the practical deployment of computer vision systems. In this paper, we first propose a theoretical framework named Long-tail Regularization (LoTR) for analyzing and tackling long-tail problems in the visual perception of autonomous driving. We then present a Parallel Vision Actualization System (PVAS) that searches for challenging long-tail scenarios and produces large-scale long-tail driving scenarios for autonomous vehicles. In addition, we describe how PVAS is performed in the Intelligent Vehicle Future Challenge of China (IVFC), the longest-running autonomous driving competition worldwide. Results over the past decade demonstrate that PVAS can effectively alleviate the impact of the long-tail effect.
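The paper's own LoTR formulation is not reproduced here; as a rough, hedged illustration of the general idea behind frequency-aware regularization of rare scenarios, the sketch below reweights per-sample losses by scenario frequency using the effective-number scheme of Cui et al. (2019). The function name, the weighting scheme, and all parameters are assumptions for illustration only, not the method of this paper.

```python
# Illustrative sketch only: one generic way a "long-tail regularization" term
# could be realized. All names and the weighting scheme are assumptions, not
# the LoTR formulation from the paper.
import numpy as np

def longtail_weighted_loss(per_sample_loss, scenario_freq, beta=0.999):
    """Reweight per-sample losses so rare (long-tail) scenarios count more.

    per_sample_loss : array of losses, one per training sample
    scenario_freq   : array of occurrence counts for each sample's scenario
    beta            : smoothing factor for effective-number reweighting
    """
    # Effective-number weights (Cui et al., 2019): rare scenarios get larger weights.
    weights = (1.0 - beta) / (1.0 - np.power(beta, scenario_freq))
    weights = weights / weights.sum() * len(weights)  # normalize to mean 1
    return float(np.mean(weights * per_sample_loss))

# Example: two common-scenario samples and one rare long-tail sample.
losses = np.array([0.2, 0.3, 1.5])
freqs = np.array([10000, 10000, 5])
print(longtail_weighted_loss(losses, freqs))  # rare sample dominates the loss
```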
| Original language | English |
| --- | --- |
| Journal | IEEE Transactions on Intelligent Vehicles |
| Early online date | 28 Jan 2022 |
| DOIs | |
| Publication status | Early online - 28 Jan 2022 |