Wechsler Neural Networks for Perception

Computation, Learning, and Architectures
1st edition, 2014
E-book, English, 384 pages, Web PDF
ISBN: 978-1-4832-6279-6
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: 1 - PDF Watermark




Neural Networks for Perception, Volume 2: Computation, Learning, and Architectures explores the computational and adaptation problems that arise in the use of neuronal systems, together with the hardware architectures capable of implementing neural networks for perception and of coping with the complexity inherent in massively distributed computation. The book addresses theoretical and practical issues related to the feasibility of both explaining human perception and implementing machine perception in terms of neural network models.

The text is organized into two sections. The first, on computation and learning, covers learning visual behaviors, the elementary theory of the basic backpropagation neural network architecture, and computation and learning in the context of neural network capacity. The second section, on hardware architectures, describes the architectures and possible applications of recent neurocomputing models; among the subjects treated are the Cohen-Grossberg model of associative memory, hybrid optical/digital architectures for neurocomputing, and electronic circuits for adaptive synapses. Neuroscientists, computer scientists, engineers, and researchers in artificial intelligence will find the book useful.




Further Information & Material


1;Front Cover;1
2;Computation, Learning, and Architectures;4
3;Copyright Page;5
4;Table of Contents;8
5;Dedication;6
6;Contents of Volume 1;10
7;Contributors;14
8;Foreword;16
9;PART III: Computation and Learning;22
9.1;III. Introduction;24
9.2;Chapter III.1 Learning Visual Behaviors;29
9.2.1;1 Introduction;29
9.2.2;2 Situated Learning Systems;34
9.2.3;3 Deictic Representations;40
9.2.4;4 Perceptual Aliasing;46
9.2.5;5 Results;49
9.2.6;6 Discussion and Conclusions;55
9.2.7;References;58
9.3;Chapter III.2 Nonparametric Regression Analysis Using Self-Organizing Topological Maps;61
9.3.1;I. Introduction;61
9.3.2;II. Self-Organizing Topological Maps;64
9.3.3;III. Constrained Topological Mapping;66
9.3.4;IV. Simulation Results;69
9.3.5;V. Summary and Discussion;72
9.3.6;References;74
9.4;Chapter III.3 Theory of the Backpropagation Neural Network;86
9.4.1;I Introduction;86
9.4.2;II Backpropagation Neural Network Architecture;88
9.4.3;III Backpropagation Error Surfaces;90
9.4.4;IV Function Approximation with Backpropagation;96
9.4.5;V Conclusions;104
9.4.6;Appendix;106
9.4.7;References;110
9.5;Chapter III.4 Hopfield Model and Optimization Problems;115
9.5.1;1 Introduction;115
9.5.2;2 The Traveling Salesman Problem;116
9.5.3;3 TSP Simulations;118
9.5.4;4 Scaling of TSP Solutions;123
9.5.5;5 The Clustering Problem;126
9.5.6;6 Concluding Remarks;129
9.5.7;References;130
9.6;Chapter III.5 DAM, Regression Analysis, and Attentive Recognition;132
9.6.1;I. Introduction;132
9.6.2;II. The Moore-Penrose Associative Memory;134
9.6.3;III. Relationship between DAM and Multiple Linear Regression;136
9.6.4;IV. Selective and Focused Active Recognition;137
9.6.5;V. Experimental Results;140
9.6.6;VI. Conclusion;142
9.6.7;References;143
9.7;Chapter III.6 INTELLIGENCE CODE MACHINE;149
9.7.1;I. Introduction;149
9.7.2;II. Formulation of the Intelligence Code;151
9.7.3;III. Examples;162
9.7.4;IV. Conclusion;166
9.7.5;Acknowledgment;166
9.7.6;References;167
9.8;Chapter III.7 Cycling Logarithmically Converging Networks That Flow Information to Behave (Perceive) and Learn;168
9.8.1;I. Introduction;168
9.8.2;II. Logarithmically Converging Parallel-Hierarchical;170
9.8.3;III. Connectionist Networks Defined;172
9.8.4;IV. Adding Back-Links and Nodes to Feed-Forward Nets, to Handle Processes Other than Bottom-Up;174
9.8.5;V. The Variety of Functions That Cycling Back-Links Can Handle;177
9.8.6;VI. Learning, by Passing Feedback Over the Same Back Links Used for Behaving;181
9.8.7;VII. Handling Flows of Information from Many Sources That Continue Over Time;187
9.8.8;VIII. Feedback Must Always Be Perceived and Evaluated, Through the Same or Different Sensors;189
9.8.9;IX. Discussion and Conclusions;190
9.8.10;Acknowledgements;191
9.8.11;References;191
9.9;Chapter III.8 Computation and Learning in the Context of Neural Network Capacity;194
9.9.1;I A Tale of Three Queries;194
9.9.2;II The Neural Model;196
9.9.3;III What Problems Can Neural Networks Compute?;198
9.9.4;IV Capacity;202
9.9.5;V Learning;211
9.9.6;VI Feedforward Networks;214
9.9.7;VII Feedback Networks;219
9.9.8;VIII Conclusions and Acknowledgements;223
9.9.9;IX Bibliographical Notes;224
9.9.10;References;226
10;PART IV: Architectures;230
10.1;IV. Introduction;232
10.2;Chapter IV.1 Competitive and Cooperative Multimode Dynamics in Photorefractive Ring Circuits;235
10.2.1;I. Introduction;235
10.2.2;II. Photorefractive Two-Beam Coupling;239
10.2.3;III. Dynamics of a Ring Circuit with Photorefractive Gain and Loss Elements;245
10.2.4;IV. The Photorefractive Flip-Flop;255
10.2.5;V. Dynamics of Competitive Optical Networks: Winner-Takes-All Dynamics and the Voter's Paradox;261
10.2.6;VI. Conclusion;271
10.2.7;Acknowledgments;271
10.2.8;References;271
10.3;Chapter IV.2 HYBRID NEURAL NETWORKS AND ALGORITHMS;274
10.3.1;I. INTRODUCTION;274
10.3.2;II. HYBRID OPTICAL/DIGITAL MULTIFUNCTIONAL NN ARCHITECTURE;279
10.3.3;III. ASSOCIATIVE PROCESSOR (AP NNs);282
10.3.4;IV. OPTIMIZATION NNs;287
10.3.5;V. SYMBOLIC CORRELATOR AND PRODUCTION SYSTEM NNs;292
10.3.6;VI. ADAPTIVE LEARNING NEURAL NET;293
10.3.7;VII. SUMMARY AND CONCLUSION;299
10.3.8;REFERENCES;300
10.4;Chapter IV.3 The Use of Fixed Holograms for Massively-Interconnected, Low-Power Neural Networks;303
10.4.1;1 Introduction;304
10.4.2;2 Basic Concepts;306
10.4.3;3 Weighted Interconnections of N² to 1;309
10.4.4;4 Parallel N⁴ Weighted Interconnections and Crosstalk;312
10.4.5;5 Diffraction and Crosstalk Considerations;315
10.4.6;6 Applications;323
10.4.7;7 Discussions and Conclusions;326
10.4.8;References;328
10.5;Chapter IV.4 Electronic Circuits for Adaptive Synapses;331
10.5.1;I. Introduction;331
10.5.2;II. Discrete Valued Weight Implementations;332
10.5.3;III. Continuous Valued Weight Implementations;348
10.5.4;IV. Conclusion;352
10.5.5;References;353
10.6;Chapter IV.5 Neural Network Computations On A Fine Grain Array Processor;356
10.6.1;I. Introduction;356
10.6.2;II. Bit Serial Array Processors;358
10.6.3;III. General Networks;362
10.6.4;IV. Translation Invariant Networks;366
10.6.5;V. Conclusions;378
10.6.6;References;379
11;Index;382


