Oracle Complexity and Nontransitivity in Pattern Recognition

10/16/2000
by Vadim Bulitko, et al.

Various mathematical models of recognition processes are known. In this paper we model a pattern recognition algorithm as an oracle computation on a Turing machine. This point of view appears useful both in pattern recognition and in recursion theory. Applying recursion theory to pattern recognition reveals a connection between the problem of comparing recognition algorithms and the complexity of oracle computations: in many cases only the number of sign computations matters, that is, the volume of oracle information an algorithm needs. The problem of choosing among recognition algorithms can therefore be formulated as a complexity-optimization problem for oracle computations. Furthermore, after introducing a certain "natural" preference relation on a set of recognizing algorithms, we show that it is nontransitive, a phenomenon related to the well-known nontransitivity paradox in probability theory.

Keywords: Pattern Recognition, Recursion Theory, Nontransitivity, Preference Relation
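The nontransitivity the abstract reports can be illustrated with a toy construction in the spirit of the Condorcet / nontransitive-dice paradox. The Python sketch below uses hypothetical per-instance oracle-query counts; the numbers, the algorithm names, and the majority-of-instances rule are illustrative assumptions, not the paper's actual preference relation. It shows how comparing algorithms pairwise by which one queries the oracle less often on a majority of instances can produce a preference that cycles.

    # Hypothetical per-instance oracle-query counts for three recognizing
    # algorithms A, B, C on three test instances. Illustrative numbers only;
    # they are not taken from the paper.
    query_counts = {
        "A": [1, 3, 2],
        "B": [2, 1, 3],
        "C": [3, 2, 1],
    }

    def preferred(x, y):
        """x is preferred to y if x needs strictly fewer oracle queries
        than y on a majority of instances (an assumed comparison rule)."""
        wins = sum(cx < cy for cx, cy in zip(query_counts[x], query_counts[y]))
        return wins > len(query_counts[x]) / 2

    for x, y in [("A", "B"), ("B", "C"), ("C", "A")]:
        print(f"{x} preferred to {y}: {preferred(x, y)}")
    # Prints:
    # A preferred to B: True
    # B preferred to C: True
    # C preferred to A: True   <- the preference relation cycles

Each algorithm beats the next on two of the three instances, so every pairwise majority comparison is decisive, yet no algorithm is best overall, which is exactly the structure of the probabilistic nontransitivity paradox the abstract mentions.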
