Seminar abstract

Cost-sensitive Learning in Multicategory Nets of Single-Layer Perceptrons

Sarunas Raudys
Professor
Data Analysis Department
Institute of Mathematics and Informatics in Vilnius, Lithuania

Abstract:

The standard cost function of multicategory (multiclass) perceptrons does not minimize the classification error rate. In addition, multicategory nets do not allow one to take into account arbitrarily chosen costs of incorrect pairwise classification. A further development of Rumelhart's loss function for training a net of perceptrons with K outputs is suggested, in which the pairwise misclassification cost matrix can be incorporated directly. The complexity of the network remains the same, and computing the gradient of the loss function does not require additional calculations. Minimization of the loss requires a smaller number of training epochs. As an alternative, a way to design the net of perceptrons on the basis of K(K-1)/2 binary decisions is investigated. Particular attention is paid to training data imbalance and to the accuracy of the methods used to fuse the K(K-1)/2 binary (pairwise) classifications. It was found that fusion based on the Kullback–Leibler (K–L) distance and the Wu–Lin–Weng (WLW) method results in approximately the same performance when sample sizes are relatively small. This observation is explained by the theoretically known fact that excessive minimization of an inexact criterion can become harmful. Comprehensive comparative investigations of six real-world pattern recognition (PR) problems demonstrated that SLP-based pairwise classifiers are comparable to, and often even outperform, support-vector-based classifiers in moderate-dimensional situations. Colored noise injection, used to design pseudo-validation sets, proves to be a powerful tool for alleviating finite-sample problems in moderate-dimensional pattern recognition tasks.
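
As a rough illustration of the kind of loss being discussed, the sketch below shows a cost-weighted sum-of-squares (Rumelhart-style) loss for a K-output single-layer perceptron in which a pairwise misclassification cost matrix re-weights the output errors. This is a minimal sketch under stated assumptions: the NumPy implementation, the function names, and the exact weighting scheme are illustrative choices, not the speaker's formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_sensitive_loss_and_grad(W, b, X, y, C):
    """Cost-weighted sum-of-squares loss for a K-output single-layer perceptron.

    W : (K, d) weights, b : (K,) biases
    X : (n, d) inputs,  y : (n,) integer labels in {0, ..., K-1}
    C : (K, K) pairwise misclassification cost matrix (C[t, j] >= 0, C[t, t] = 0)
    """
    n, K = X.shape[0], W.shape[0]
    O = sigmoid(X @ W.T + b)           # (n, K) perceptron outputs
    T = np.eye(K)[y]                   # (n, K) one-hot targets
    # The error on output j is weighted by the cost of confusing the true
    # class y_i with class j; the target output keeps unit weight.
    wts = C[y] + np.eye(K)[y]          # (n, K)
    loss = np.sum(wts * (O - T) ** 2) / n
    # Gradient: the same backward pass as the unweighted loss, merely scaled
    # by wts, so the cost matrix adds no extra computation.
    delta = (2.0 / n) * wts * (O - T) * O * (1.0 - O)
    return loss, delta.T @ X, delta.sum(axis=0)
```

A plain gradient step (W -= eta * gW, b -= eta * gb) then trains the net; with all off-diagonal costs set to one, the loss reduces to the usual sum-of-squares criterion.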

Bio:

Professor Sarunas Raudys's research interests are multivariate analysis, data mining, statistical pattern recognition, artificial neural networks, machine learning, evolvable multi-agent systems, and artificial life. Presently he is a professor in the Department of Informatics, Vilnius University, and leader of its Computational Intelligence group. He graduated from Kaunas Technical University in 1963, worked in a computer design bureau, and afterwards at the Institute of Mathematics and Informatics, Vilnius, Lithuania. He spent considerable time in Moscow and Novosibirsk collaborating with leading USSR scientists, including the Kolmogorov laboratory at Moscow State University. After the collapse of the USSR, he worked at several universities and research institutions in the USA, Canada, France, the Netherlands, and Japan. He worked in statistical pattern recognition and multivariate statistical analysis, where he investigated the relations between the complexity of classification algorithms, learning sample size, and generalization error. Since 1990 he has been interested in artificial neural network theory, where he showed that, while training perceptrons, one can obtain a number of well-known and popular statistical classifiers and regression methods. He suggested a way to integrate statistics and neural networks in the training process. Presently he has started analysing the collective learning of populations of autonomous perceptrons (intelligent agents) functioning in changing environments. PDF files of Sarunas Raudys's most recent papers can be found on his university web page: http://www.mif.vu.lt/katedros/cs/Staff/VisiI.htm.

© LAMDA, 2022