Quantifying Assurance in Learning-enabled Systems

06/18/2020
by Erfan Asaadi, et al.

Dependability assurance of systems embedding machine learning (ML) components—so-called learning-enabled systems (LESs)—is a key step toward their use in safety-critical applications. In emerging standardization and guidance efforts, there is a growing consensus on the value of using assurance cases for that purpose. This paper develops a quantitative notion of assurance that an LES is dependable, as a core component of its assurance case, extending our prior work, which applied to ML components. Specifically, we characterize LES assurance in the form of assurance measures: a probabilistic quantification of confidence that an LES possesses system-level properties associated with functional capabilities and dependability attributes. We illustrate the utility of assurance measures by applying them to a real-world autonomous aviation system, describing their role both in i) guiding high-level, runtime risk-mitigation decisions and ii) serving as a core component of the associated dynamic assurance case.
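To make the idea of an assurance measure concrete, the sketch below shows one way a "probabilistic quantification of confidence" in a system-level property could be computed from trial evidence. This is not the paper's method; it is a minimal Beta-Bernoulli illustration, and the function name, prior, and requirement threshold are all assumptions chosen for the example.

```python
from scipy.stats import beta

def assurance_measure(passes, failures, requirement=0.99, prior=(1.0, 1.0)):
    """Illustrative (hypothetical) assurance measure: the posterior
    probability that the LES's property-satisfaction rate exceeds
    `requirement`, given pass/fail evidence from trials and a
    Beta(prior) conjugate prior over the unknown rate."""
    a, b = prior
    posterior = beta(a + passes, b + failures)  # Beta-Bernoulli update
    return posterior.sf(requirement)            # P(rate > requirement | evidence)

# Example: 480 successful trials and 2 failures, assessed against a
# hypothetical 0.99 satisfaction-rate requirement.
print(f"Assurance: {assurance_measure(480, 2):.3f}")
```

Under this toy model, the measure rises as passing evidence accumulates and falls after failures, which is the kind of signal a runtime risk-mitigation policy or dynamic assurance case could consume.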
