
Book, English, 295 pages, format (W × H): 161 mm × 241 mm, weight: 620 g

Series: Advances in Computer Vision and Pattern Recognition

Raudys

Statistical and Neural Classifiers

An Integrated Approach to Design
Edition: 2001
ISBN: 978-1-85233-297-6
Publisher: Springer


Automatic (machine) recognition, description, classification, and grouping of patterns are important problems in a variety of engineering and scientific disciplines such as biology, psychology, medicine, marketing, computer vision, artificial intelligence, and remote sensing. Given a pattern, its recognition/classification may consist of one of two tasks: (1) supervised classification (also called discriminant analysis), in which the input pattern is assigned to one of several predefined classes, and (2) unsupervised classification (also called clustering), in which no pattern classes are defined a priori and patterns are grouped into clusters based on their similarity. Interest in pattern recognition has recently been renewed by emerging applications that are not only challenging but also computationally demanding (e.g., bioinformatics, data mining, document classification, and multimedia database retrieval).

Among the various frameworks in which pattern recognition has traditionally been formulated, the statistical approach has been the most intensively studied and used in practice. More recently, neural network techniques and methods imported from statistical learning theory have received increased attention. Neural networks and statistical pattern recognition are two closely related disciplines that share several common research issues. Neural networks have not only provided a variety of novel or supplementary approaches to pattern recognition tasks, but have also offered architectures onto which many well-known statistical pattern recognition algorithms can be mapped for efficient (hardware) implementation. Conversely, neural networks can benefit from well-known results in statistical pattern recognition.
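The two tasks named above can be illustrated with a minimal sketch, assuming scikit-learn is available; the dataset and the particular estimators used here are illustrative choices, not taken from the book.

```python
# Minimal sketch of the two classification tasks described above.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)

# (1) Supervised classification (discriminant analysis):
# each input pattern is assigned to one of several predefined classes.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("supervised accuracy:", lda.score(X_test, y_test))

# (2) Unsupervised classification (clustering):
# no classes are given a priori; patterns are grouped by similarity.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", [int((kmeans.labels_ == k).sum()) for k in range(3)])
```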

Order Raudys, Statistical and Neural Classifiers, now!

Target audience


Research


Authors/Editors


Further information & material


1. Quick Overview
1.1 The Classifier Design Problem
1.2 Single Layer and Multilayer Perceptrons
1.3 The SLP as the Euclidean Distance and the Fisher Linear Classifiers
1.4 The Generalisation Error of the EDC and the Fisher DF
1.5 Optimal Complexity — The Scissors Effect
1.6 Overtraining in Neural Networks
1.7 Bibliographical and Historical Remarks
2. Taxonomy of Pattern Classification Algorithms
2.1 Principles of Statistical Decision Theory
2.2 Four Parametric Statistical Classifiers
2.3 Structures of the Covariance Matrices
2.4 The Bayes Predictive Approach to Design Optimal Classification Rules
2.5 Modifications of the Standard Linear and Quadratic DF
2.6 Nonparametric Local Statistical Classifiers
2.7 Minimum Empirical Error and Maximal Margin Linear Classifiers
2.8 Piecewise-Linear Classifiers
2.9 Classifiers for Categorical Data
2.10 Bibliographical and Historical Remarks
3. Performance and the Generalisation Error
3.1 Bayes, Conditional, Expected, and Asymptotic Probabilities of Misclassification
3.2 Generalisation Error of the Euclidean Distance Classifier
3.3 Most Favourable and Least Favourable Distributions of the Data
3.4 Generalisation Errors for Modifications of the Standard Linear Classifier
3.5 Common Parameters in Different Competing Pattern Classes
3.6 Minimum Empirical Error and Maximal Margin Classifiers
3.7 Parzen Window Classifier
3.8 Multinomial Classifier
3.9 Bibliographical and Historical Remarks
4. Neural Network Classifiers
4.1 Training Dynamics of the Single Layer Perceptron
4.2 Non-linear Decision Boundaries
4.3 Training Peculiarities of the Perceptrons
4.4 Generalisation of the Perceptrons
4.5 Overtraining and Initialisation
4.6 Tools to Control Complexity
4.7 The Co-Operation of the Neural Networks
4.8 Bibliographical and Historical Remarks
5. Integration of Statistical and Neural Approaches
5.1 Statistical Methods or Neural Nets?
5.2 Positive and Negative Attributes of Statistical Pattern Recognition
5.3 Positive and Negative Attributes of Artificial Neural Networks
5.4 Merging Statistical Classifiers and Neural Networks
5.5 Data Transformations for the Integrated Approach
5.6 The Statistical Approach in Multilayer Feed-forward Networks
5.7 Concluding and Bibliographical Remarks
6. Model Selection
6.1 Classification Errors and their Estimation Methods
6.2 Simplified Performance Measures
6.3 Accuracy of Performance Estimates
6.4 Feature Ranking and the Optimal Number of Features
6.5 The Accuracy of the Model Selection
6.6 Additional Bibliographical Remarks
Appendices
A.1 Elements of Matrix Algebra
A.2 The First Order Tree Type Dependence Model
A.3 Temporal Dependence Models
A.4 Pikelis Algorithm for Evaluating Means and Variances of the True, Apparent and Ideal Errors in Model Selection
A.5 Matlab Codes (the Non-Linear SLP Training, the First Order Tree Dependence Model, and Data Whitening Transformation)
References
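Section 1.3 of the overview treats the single layer perceptron, the Euclidean distance classifier (EDC) and the Fisher linear discriminant as members of one family of linear rules g(x) = w'x + w0 that differ only in how the weight vector w is chosen. The NumPy sketch below illustrates that relationship on synthetic Gaussian data; it is an illustrative reconstruction under those assumptions, not the Matlab code from Appendix A.5.

```python
# Sketch: EDC and Fisher linear discriminant as two choices of the
# weight vector w in the same linear rule g(x) = w'x + w0.
import numpy as np

rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))  # class 1 sample
X2 = rng.normal(loc=[2.0, 1.0], scale=1.0, size=(200, 2))  # class 2 sample

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Simple pooled covariance estimate (average of the per-class covariances).
S = 0.5 * (np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False))

# EDC: w proportional to the difference of class means (identity covariance assumed).
w_edc = m1 - m2
# Fisher: w proportional to the pooled-covariance-whitened mean difference.
w_fisher = np.linalg.solve(S, m1 - m2)

def classify(X, w, m1, m2):
    # Threshold halfway between the projected class means;
    # positive discriminant -> class 1, otherwise class 2.
    w0 = -0.5 * w @ (m1 + m2)
    return (X @ w + w0 > 0).astype(int)

X = np.vstack([X1, X2])
y = np.hstack([np.ones(len(X1)), np.zeros(len(X2))])
for name, w in [("EDC", w_edc), ("Fisher", w_fisher)]:
    acc = (classify(X, w, m1, m2) == y).mean()
    print(f"{name} training accuracy: {acc:.3f}")
```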


