WHY CLASSICAL MODELS FOR PATTERN RECOGNITION ARE NOT PATTERN RECOGNITION MODELS
Lev Goldfarb and Jaroslav Hook
Faculty of Computer Science, University of New Brunswick, Fredericton, Canada
(International Conference on Advances in Pattern Recognition, ed. Sameer Singh, Springer, 1998, pp. 405–414)
ABSTRACT. In this paper we outline a simple explanation of why, in our view, the classical, vector-space-based models for pattern recognition (including artificial neural networks) are fundamentally inadequate for the task. This explanation is based on a radically new understanding of the nature of inductive learning processes, which became possible only after a careful analysis of the axiomatic foundations of a new inductive learning model, proposed by the first author in 1990, that overcomes the above limitations. The new model, the evolving transformation system (ETS), emerged from a thirteen-year effort to find a mathematical framework that would unify the two main, structurally different, approaches to pattern recognition: the vector-space-based approach and the syntactic approach. The decisive deficiency of the classical vector-space-based pattern recognition model, as it turns out, is the intrinsic inability of the underlying mathematical model, the normed vector space, to accommodate, during the learning process in a realistic environment, the discovery of the corresponding class distance function under a symbolic pattern representation, which is more general than the numeric one. Typically, such symbolic distance functions have very little in common with the very restricted, "Euclidean", class of distance functions that the algebraic structure of the vector space unavoidably attaches to the numeric form of pattern representation. In other words, the class of symbolic distance functions is incomparably larger than the class consistent with the vector-space structure, and so the discovery and construction of the appropriate class distance function during the learning process simply cannot proceed in the vector-space setting.
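To make the abstract's contrast concrete, here is a minimal illustrative sketch (ours, not the authors'; all function names are hypothetical). It compares a norm-induced Euclidean distance on fixed-length numeric vectors with a weighted edit distance on strings, a typical symbolic distance function: its operation weights are free parameters that an inductive learner could adjust to fit a class, whereas distances induced by vector-space norms form, in the authors' terms, a very restricted "Euclidean" class.

    # Illustrative sketch in Python (not from the paper): a norm-induced
    # distance on numeric vectors versus a weighted edit distance on strings.
    import math

    def euclidean_distance(x, y):
        """Distance induced by the Euclidean norm on fixed-length vectors."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

    def weighted_edit_distance(s, t, w_ins=1.0, w_del=1.0, w_sub=1.0):
        """Weighted edit distance on strings. The operation weights are free
        parameters; adjusting them reshapes the distance itself, a degree of
        freedom no norm-induced vector-space distance provides."""
        m, n = len(s), len(t)
        d = [[0.0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            d[i][0] = d[i - 1][0] + w_del
        for j in range(1, n + 1):
            d[0][j] = d[0][j - 1] + w_ins
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                sub = 0.0 if s[i - 1] == t[j - 1] else w_sub
                d[i][j] = min(d[i - 1][j] + w_del,    # delete s[i-1]
                              d[i][j - 1] + w_ins,    # insert t[j-1]
                              d[i - 1][j - 1] + sub)  # substitute
        return d[m][n]

    print(euclidean_distance([0.0, 1.0], [3.0, 5.0]))        # 5.0
    print(weighted_edit_distance("abab", "abba"))            # 2.0
    print(weighted_edit_distance("abab", "abba", w_sub=0.3)) # 0.6

The last call shows how re-weighting a single edit operation changes the geometry of the symbol space: the kind of learned, class-specific distance that, the abstract argues, cannot arise within a fixed vector-space structure.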