A Calculus for Amortized Expected Runtimes
We develop a weakest-precondition-style calculus à la Dijkstra for reasoning about amortized expected runtimes of randomized algorithms with access to dynamic memory: the aert calculus. Our calculus is truly quantitative, i.e., instead of Boolean-valued predicates it manipulates real-valued functions. En route to the aert calculus, we study the ert calculus of Kaminski et al. [2018] for reasoning about expected runtimes, extended with capabilities for handling dynamic memory, thus enabling compositional and local reasoning about randomized data structures. This extension employs runtime separation logic, which was foreshadowed by Matheja [2020] and then implemented in Isabelle/HOL by Haslbeck [2021]. Going beyond Haslbeck's results, we prove soundness of the so-extended ert calculus with respect to an operational Markov decision process model featuring countably branching nondeterminism, give intuitive explanations, and provide proof rules enabling separation-logic-style verification of upper bounds on expected runtimes. We then build the so-called potential method for amortized analysis into the calculus, thus obtaining the aert calculus. Since changes in potential can be negative, the aert calculus must be able to handle signed random variables. A particularly pleasing feature of our solution is that, unlike e.g. Kozen [1985], we obtain a loop rule for our signed random variables, and, unlike e.g. Kaminski and Katoen [2017], the aert calculus makes do without involved technical machinery for tracking the integrability of the random variables. Finally, we present case studies, including a formal analysis of a randomized delete-insert-find-any set data structure [Brodal et al. 1996].
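To give a flavor of the calculus being extended, the following is a sketch of the core transformer rules of the ert calculus of Kaminski et al. [2018] in one common presentation (reconstructed here for illustration, not copied from the present paper): ert[C](t) maps a runtime continuation t, a function from program states to nonnegative reals, to the expected runtime of running C and then incurring t, with [b] denoting the Iverson bracket.

```latex
% Core ert rules (sketch, amsmath assumed); lfp = least fixed point
\begin{align*}
  \mathrm{ert}[\mathtt{skip}](t)                 &= 1 + t \\
  \mathrm{ert}[x := e](t)                        &= 1 + t[x/e] \\
  \mathrm{ert}[C_1 ; C_2](t)                     &= \mathrm{ert}[C_1]\bigl(\mathrm{ert}[C_2](t)\bigr) \\
  \mathrm{ert}[\{C_1\}\,[p]\,\{C_2\}](t)         &= 1 + p \cdot \mathrm{ert}[C_1](t)
                                                    + (1-p) \cdot \mathrm{ert}[C_2](t) \\
  \mathrm{ert}[\mathtt{while}\,(b)\,\{C\}](t)    &= \mathrm{lfp}\ X.\;
                                                    1 + [b] \cdot \mathrm{ert}[C](X) + [\neg b] \cdot t
\end{align*}
```

The paper's extension keeps this continuation-passing shape while the runtime functions additionally speak about the heap, in the style of separation logic.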
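As background on the potential method (standard textbook material, not specific to this paper): one assigns a potential Φ to each data-structure state D_i and defines the amortized cost of the i-th operation as its actual cost plus the change in potential, so that the per-operation changes telescope over any operation sequence.

```latex
% Classical potential-method identity; c_i = actual cost, c_i^am = amortized cost
\[
  c_i^{\mathrm{am}} \;=\; c_i + \Phi(D_i) - \Phi(D_{i-1})
  \qquad\Longrightarrow\qquad
  \sum_{i=1}^{n} c_i \;=\; \sum_{i=1}^{n} c_i^{\mathrm{am}} + \Phi(D_0) - \Phi(D_n)
  \;\le\; \sum_{i=1}^{n} c_i^{\mathrm{am}}
  \quad\text{if } \Phi(D_n) \ge \Phi(D_0).
\]
```

Note that the per-operation change Φ(D_i) − Φ(D_{i−1}) may well be negative, which is precisely why internalizing this method into a runtime calculus forces the aert calculus to handle signed random variables.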