Autonomous driving
New approach to mitigate the unknown environmental impact on perception
Autonomous systems are increasingly being used in safety-critical applications such as autonomous driving, where the environment and the operating conditions are of crucial importance. Machine Learning components, e.g., Deep Neural Networks (DNNs), serve as enablers and must therefore themselves continue to operate reliably despite changing environmental conditions. Fraunhofer IKS provides a solution that handles such diverse and continuously changing environmental conditions.
In general, an autonomous system must be aware of its environment (e.g., pedestrians and other vehicles) and of its actual operational context (e.g., weather conditions). Autonomous systems increasingly rely on Machine Learning (ML) components such as Deep Neural Networks (DNNs) for perception tasks, but these components can be unreliable: DNNs lack interpretability and are sensitive to changes in the operational context. This reduces the autonomous system’s safety and reliability under varying operating conditions.
Conference Paper, 2023
Adaptively Managing Reliability of Machine Learning Perception under Changing Operating Conditions
Abstract –
Autonomous systems are deployed in various contexts, which makes the role of the surrounding environment and operational context increasingly vital, e.g., for autonomous driving. To account for these changing operating conditions, an autonomous system must adapt its behavior to maintain safe operation and a high level of autonomy. Machine Learning (ML) components are generally being introduced for perceiving an autonomous system’s environment, but their reliability strongly depends on the actual operating conditions, which are hard to predict.
Therefore, we propose a novel approach to learn the influence of the prevalent operating conditions and use this knowledge to optimize the reliability of the perception through self-adaptation. Our proposed approach is evaluated in a perception case study for autonomous driving.
We demonstrate that our approach is able to improve perception under varying operating conditions, in contrast to the state of the art. Besides the advantage of interpretability, our results show the superior reliability of the ML-based perception.
Depending on the operating conditions, however, the system can be (self-)adapted to use different architectural or behavioral configurations to preserve safety and reliability.
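
To make this adaptation step concrete, here is a minimal, purely illustrative Python sketch. The configuration names, sensors, and hand-written rules are assumptions for illustration only, not the Fraunhofer IKS implementation; the learned variant described further below replaces the rules with reliability estimates per operating context.

# Illustrative sketch (assumed names): choose a perception configuration
# from the currently detected operating conditions.
from dataclasses import dataclass

@dataclass
class OperatingConditions:
    weather: str       # e.g. "clear", "rain", "snow", "fog"
    illumination: str  # e.g. "day", "night"

# Hypothetical configurations: which sensors and which DNN the pipeline relies on.
CONFIGURATIONS = {
    "camera_primary": {"sensors": ["camera", "lidar"], "model": "dnn_rgb"},
    "lidar_primary":  {"sensors": ["lidar", "radar"],  "model": "dnn_pointcloud"},
    "radar_fallback": {"sensors": ["radar"],           "model": "dnn_radar"},
}

def select_configuration(ctx: OperatingConditions) -> dict:
    # Hand-written fallback rules; a learned approach (sketched further below)
    # replaces these rules with reliability estimates per operating context.
    if ctx.weather in ("snow", "fog"):
        name = "radar_fallback"
    elif ctx.weather == "rain" or ctx.illumination == "night":
        name = "lidar_primary"
    else:
        name = "camera_primary"
    return CONFIGURATIONS[name]

print(select_configuration(OperatingConditions(weather="fog", illumination="day")))
# -> {'sensors': ['radar'], 'model': 'dnn_radar'}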
Self-adaptation to different operating conditions
This basic idea is taken up in a new approach from the Fraunhofer Institute for Cognitive Systems IKS: self-adaptation optimizes the reliability of an autonomous system based on the prevalent operating conditions.
The impact of operating conditions on different sensors is typically provided by domain experts, but for the associated DNNs this influence cannot be derived as easily. Different ML components for environment perception each have advantages and disadvantages depending on the operating situation, and these are not known in advance. The Fraunhofer IKS approach makes it possible to learn these respective advantages and disadvantages automatically. This is enabled by the Contextual Adaptive Software Framework, which Fraunhofer IKS has developed in-house and which allows any ML model to be used as a black box, in a plug-and-play manner.
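
The mechanics of the framework are not spelled out here; as a rough, assumed sketch of the principle, the selector below records observed reliability scores per operating context for each black-box model and, at run time, picks the model with the best learned score. The class name, the reliability metric, and the context representation are all hypothetical.

# Assumed, simplified sketch of the principle: learn, per operating context,
# how reliable each black-box ML model is, then select accordingly at run time.
from collections import defaultdict

class ContextualModelSelector:
    def __init__(self, models):
        # models: dict mapping a name to a black-box callable (any ML model).
        self.models = models
        self.score_sum = defaultdict(float)   # (context, name) -> summed reliability
        self.count = defaultdict(int)         # (context, name) -> number of samples

    def update(self, context, name, reliability):
        # Record one observed reliability score (e.g. a detection quality metric
        # measured offline or during validation) for a model under a given context.
        self.score_sum[(context, name)] += reliability
        self.count[(context, name)] += 1

    def select(self, context):
        # Pick the model with the highest learned mean reliability for this context.
        def mean(name):
            n = self.count[(context, name)]
            return self.score_sum[(context, name)] / n if n else 0.0
        return max(self.models, key=mean)

    def perceive(self, context, sensor_data):
        # Plug-and-play: the chosen model is invoked as an opaque callable.
        return self.models[self.select(context)](sensor_data)

This sketch omits what the full approach additionally covers, such as multi-modal sensor fusion and behavioral configurations.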
The new Fraunhofer IKS approach was successfully evaluated in an application example for autonomous driving: we optimized a multi-modal sensor data fusion pipeline to yield a ten-fold improvement in perception reliability despite changing operating conditions such as rain, snow, and fog.