Future Vision of Dynamic Certification Schemes for Autonomous Systems
As software becomes increasingly pervasive in critical domains such as autonomous driving, new challenges arise that necessitate rethinking established system engineering approaches. The gradual takeover of all critical driving functions by autonomous driving software adds to the complexity of certifying these systems. In particular, certification procedures do not fully keep pace with the dynamism and unpredictability of future autonomous systems, and they may not fully guarantee compliance with the requirements imposed on these systems. In this paper, we identify several issues with current certification strategies that could pose serious safety risks. As examples, we highlight the inadequate reflection of software changes in constantly evolving systems and the lack of support for the inter-system cooperation necessary for managing coordinated movements. Other shortcomings include the narrow focus of awarded certificates, which neglect aspects such as the ethical behavior of autonomous software systems. The contribution of this paper is threefold. First, we analyze the existing international standards used in certification processes against the requirements derived from dynamic software ecosystems and from autonomous systems themselves, and we identify their shortcomings. Second, we outline six suggestions for rethinking certification to foster comprehensive solutions to the identified problems. Third, we introduce a conceptual Multi-Layer Trust Governance Framework to establish a robust governance structure for autonomous ecosystems and their associated processes, including envisioned future certification schemes. The framework comprises three layers, which together support the safe and ethical operation of autonomous systems.