MAJID MOJIRSHEIBANI, School of Mathematics and Statistics, Carleton University, Ottawa, Ontario K1S 5B6, Canada

Combined estimation and probabilistic classification
There has recently been growing interest in combining different classifiers in order to develop more effective classification rules with higher predictive power. The individual classifiers could be, for example, tree classifiers, partitioning rules, Fisher's linear discriminant function, or nearest neighbour classifiers, to name a few. Combining classifiers may also be viewed as a partial answer to the question: given a few classification rules, which one should the user choose if the main concern is a low error rate? Quite often, in a given situation, one classifier performs better than another; the reason can be directly related to the nature of the underlying parent distributions of the classes involved.
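To fix ideas, here is a minimal sketch of one of the simplest combining schemes, majority voting over the individual rules. This is purely illustrative and is not the procedure proposed in the talk; the threshold rules and data below are hypothetical.

```python
# Illustrative only: a plain majority-vote combiner for two-class rules,
# NOT the combining procedure proposed in the talk.

def majority_vote(classifiers, x):
    """Predict the label receiving the most votes among the individual rules."""
    votes = [clf(x) for clf in classifiers]
    return max(set(votes), key=votes.count)

# Three hypothetical threshold rules for a one-dimensional two-class problem
# (labels 0 and 1); they disagree only on the interval (0.4, 0.6).
rule_a = lambda x: 1 if x > 0.4 else 0   # aggressive threshold
rule_b = lambda x: 1 if x > 0.6 else 0   # conservative threshold
rule_c = lambda x: 1 if x > 0.5 else 0   # intermediate threshold

print(majority_vote([rule_a, rule_b, rule_c], 0.55))  # rules a and c vote 1
```

With an odd number of binary classifiers, no ties can occur; more elaborate schemes weight or calibrate the individual votes rather than counting them equally.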
In this talk we will propose a new combining procedure that is quite simple to use in practice. We will also show that, under certain conditions, the proposed combined classifier is asymptotically, in the almost-sure sense, at least as good as any one of the individual classifiers.