Intelligence Emerging: Adaptivity and Search in Evolving Neural Systems
Overview
Emergence—the formation of global patterns from solely local interactions—is a frequent and fascinating theme in the scientific literature, both popular and academic. In this book, Keith Downing undertakes a systematic investigation of the widespread (if often vague) claim that intelligence is an emergent phenomenon. Downing focuses on neural networks, both natural and artificial, and how their adaptability in three time frames—phylogenetic (evolutionary), ontogenetic (developmental), and epigenetic (lifetime learning)—underlies the emergence of cognition. Integrating the perspectives of evolutionary biology, neuroscience, and artificial intelligence, Downing provides a series of concrete examples of neurocognitive emergence. In doing so, he offers a new motivation for the expanded use of bio-inspired concepts in artificial intelligence (AI), in the subfield known as Bio-AI.
One of Downing's central claims is that two key concepts from traditional AI, search and representation, are also central to understanding emergent intelligence. He first offers introductory chapters on five core concepts: emergent phenomena, formal search processes, representational issues in Bio-AI, artificial neural networks (ANNs), and evolutionary algorithms (EAs). Intermediate chapters delve deeper into search, representation, and emergence in ANNs, EAs, and evolving brains. Finally, advanced chapters on evolving artificial neural networks and information-theoretic approaches to assessing emergence in neural systems synthesize earlier topics to provide some perspective, predictions, and pointers for the future of Bio-AI.
Product Details
| ISBN-13: | 9780262536844 |
|---|---|
| Publisher: | MIT Press |
| Publication date: | 05/29/2015 |
| Series: | The MIT Press |
| Edition description: | New Edition |
| Pages: | 504 |
| Product dimensions: | 7.00(w) x 9.00(h) x 0.88(d) |
| Age Range: | 18 Years |
Table of Contents
Preface xiii
Acknowledgments xix
1 Introduction 1
1.1 Exploration and Emergence 1
1.2 Intelligence 4
1.3 Adaptivity 6
1.4 I Am, Therefore I Think 8
1.5 Evolving Neural Networks 12
1.6 Deciphering Knowledge and Gauging Complexity 13
1.7 Simon's Great Ant 14
1.8 Search in Carbon and Silicon 17
1.9 Traveling Light 19
2 Emergence 21
2.1 Complex Adaptive Systems 21
2.2 Chaos and Complexity 25
2.3 Levels of Emergence 29
2.3.1 Deacon's Autogen 32
2.4 Emergence and Search 36
2.5 Emergent Intelligence 39
3 Search: The Core of AI 43
3.1 Design Search 43
3.2 Classic AI Search Techniques 44
3.2.1 Best-First Search 48
3.2.2 Steiner Trees Using Best-First Search 52
3.2.3 Search among Whole Solutions 54
3.2.4 Local Search for Steiner Trees 56
3.3 Steiner Brains 61
3.4 Biological Search and Problem Solving 63
4 Representations for Search and Emergence 67
4.1 Knowledge Representations 67
4.2 Models of Intelligence 68
4.3 Representation and Reasoning 71
4.4 Bio-Inspired Representations for Local Search 74
4.5 A Representation Puzzle 77
4.6 A Common Framework for Emergent AI 80
4.7 Representations for Evolving Artificial Neural Networks 82
4.8 Representational Trade-offs 84
5 Evolutionary Algorithms 85
5.1 Darwinian Evolution 85
5.2 Artificial Evolution in a Computer 87
5.3 Spy versus Spy 92
5.4 The Main Classes of Evolutionary Algorithms 94
5.4.1 Evolutionary Strategies 94
5.4.2 Evolutionary Programming 95
5.4.3 Genetic Algorithms 96
5.4.4 Genetic Programming 97
5.4.5 Convergence of EA Types 98
5.5 Let Evolution Figure It Out 98
6 Artificial Neural Networks 101
6.1 The Mental Matrix 101
6.2 The Physiology of Information Transmission 102
6.3 The Physiological Basis of Learning 106
6.3.1 Basic Biochemistry of Synaptic Change 106
6.3.2 Long-Term Potentiation 110
6.4 Abstractions and Models 111
6.4.1 Spatial Abstractions 112
6.4.2 Functional Abstractions 115
6.4.3 Temporal Abstractions 121
6.4.4 The Spike Response Model (SRM) 124
6.4.5 The Izhikevich Model 126
6.4.6 Continuous-Time Recurrent Neural Networks 126
6.5 Net Picking 129
7 Knowledge Representation in Neural Networks 131
7.1 Activity, Connectivity, and Correlation 131
7.2 Representations in Firing Patterns 133
7.2.1 The Syntax of Neural Information 134
7.2.2 Pattern Completion in Distributed Memories 135
7.2.3 Sequential Pattern Retrieval 136
7.2.4 Connecting Patterns: The Semantics of Neural Information 137
7.2.5 Hierarchical Cluster Analysis 142
7.3 Representations in Synaptic Matrices 144
7.3.1 Principal Component Analysis 150
7.4 Neuroarchitectures to Realize Effective Distributed Coding 155
7.4.1 Pattern Processing with Dense Distributed Codes 158
7.4.2 Cortical-Hippocampal Interactions 162
7.4.3 Third-Order Emergence in the Brain 163
7.5 The Fallacy of Untethered Knowledge 163
8 Search and Representation in Evolutionary Algorithms 165
8.1 Search as Resource Allocation 165
8.2 Evolutionary Search 166
8.3 Exploration versus Exploitation in Evolutionary Search 171
8.4 Representations for Evolutionary Algorithms 175
8.4.1 Data-Oriented Genotypes 178
8.4.2 Program-Oriented Genotypes 192
8.5 Bio-AI's Head Designer 197
9 Evolution and Development of the Brain 199
9.1 Neurons in a Haystack 199
9.2 The Neuromeric Model and Hox Genes 200
9.3 Neural Darwinism and Displacement Theory 203
9.4 Facilitated Variation 206
9.4.1 Modularity 207
9.4.2 Weak Linkage 208
9.4.3 Exploratory Growth 211
9.4.4 Emerging Topological Maps 214
9.4.5 Deconstraining Evolution 217
9.5 Representation and Search in Neural Evolution and Development 219
9.6 The Bio-Inspiration 221
9.6.1 Genetic Regulatory Networks 221
9.6.2 Conserved Core Processes 223
9.6.3 Weak Linkage and Tags 224
9.6.4 Interacting Adaptive Mechanisms 226
9.6.5 Solving Network Problems with Fruit Flies 227
9.7 Appreciating Add-Hox Design 229
10 Learning via Synaptic Tuning 231
10.1 A Good Hebb Start 231
10.2 Supervised Learning in Artificial Neural Networks 231
10.3 Hebbian Learning Models 235
10.3.1 Weight Normalization 237
10.4 Unsupervised ANNs for AI 239
10.4.1 Hopfield Networks 239
10.4.2 Basic Computations for Hopfield Networks 242
10.4.3 Search in Hopfield Networks 243
10.4.4 Hopfield Search in the Brain 246
10.4.5 Competitive Networks 247
10.4.6 Self-Organizing Maps 251
10.5 Spikes and Plasticity 254
10.5.1 Prediction 254
10.5.2 Spike-Timing-Dependent Plasticity (STDP) 256
10.5.3 Emergent Predictive Circuits 262
10.6 Place Cells and Prediction in the Hippocampus 267
10.7 Neural Support for Cognitive Incrementalism 271
10.8 Hebb Still Rules 273
11 Trial-and-Error Learning in Neural Networks 275
11.1 Credit Assignment 275
11.2 Reinforcement Learning Theory 278
11.2.1 The State Space for RL 279
11.2.2 Three RL Methods and Their Backup Routines 283
11.2.3 Temporal Difference (TD) Learning 286
11.2.4 Actor-Critic Methods 289
11.3 Reinforcement Learning and Neural Networks 291
11.3.1 Reinforcement Learning via Artificial Neural Networks 291
11.3.2 TD-Gammon 293
11.3.3 Artificial Neural Networks That Adapt Using Reinforcement Learning 295
11.3.4 The Emergence of RL Mechanisms 298
11.3.5 Reinforcement Learning in the Basal Ganglia 302
11.3.6 An Actor-Critic Model of the Basal Ganglia 305
11.4 Experience Needed 308
12 Evolving Artificial Neural Networks 311
12.1 Classifying EANNs 312
12.2 Early EANN Systems 313
12.3 Direct Encodings 320
12.3.1 Competing Conventions Problem (CCP) 320
12.3.2 SANE 321
12.3.3 NEAT 324
12.3.4 Speciation for Complexification 327
12.3.5 Cartesian Genetic Programming 330
12.4 Indirect Encodings 331
12.4.1 Cellular Encoding 334
12.4.2 G2L 335
12.4.3 HyperNEAT 339
12.5 Evolvability of EANNs 343
12.5.1 Duplication and Differentiation 343
12.5.2 Weak Linkage 346
12.5.3 Modularity 349
12.5.4 Exploratory Growth 353
12.6 The Model of Evolving Neural Aggregates 356
12.7 Picking Prophets 363
13 Recognizing Emergent Intelligence 365
13.1 Bits and Brains 365
13.2 Formalizing Diverse Correlations 366
13.3 Information and Probability 368
13.3.1 Entropy 372
13.3.2 Information as Entropy Reduction 373
13.3.3 Mutual Information 375
13.3.4 Conditional Entropy 378
13.3.5 Venn Diagrams for Information Theory 379
13.3.6 Conditional Mutual Information 380
13.4 Information Flow through Time 382
13.4.1 Entropy Rate 383
13.4.2 Transfer Entropy 384
13.5 Information Gain, Loss, and Flow 385
13.6 Quantifying Neural Complexity 393
13.6.1 TSE Complexity 394
13.6.2 Degeneracy and Redundancy 396
13.6.3 Matching Complexity 400
13.6.4 Effective Information and Information Integration 401
13.6.5 Entropy and Self-Organization 403
13.7 Converging Complexities 406
14 Conclusion 409
14.1 Circuits in Cycles in Loops 409
14.2 Sources and Seeds of Complexity 412
14.3 Representations 413
14.4 Nurturing Nature in Our Machines 415
A Evolutionary Algorithm Sidelights 419
A.1 Essential Components of an Evolutionary Algorithm 419
A.2 Setting EA Parameters 422
A.3 Fitness Assessment 424
A.4 Natural and Artificial Selection 426
A.4.1 Fitness Variance 426
A.4.2 Selection Pressure 427
A.5 Selection Strategies for Evolutionary Algorithms 428
A.5.1 Adult Selection 429
A.5.2 Parent Selection 429
A.5.3 Supplements to Selection 435
B Neural Network Sidelights 437
B.1 Neural Networks as Complex Mappings 437
B.2 k-m Predictability 443
B.3 Combinatorics of Expansion Recoding 446
B.4 Interference Reduction via Sparse Coding 446
B.5 Weight Instabilities 448
B.6 Redundancy and Degeneracy 450
Notes 453
References 457
Index 471
What People are Saying About This
In this deep and broad book, Downing takes up the challenge of explaining how intelligence emerges, in the evolutionary, developmental, and learning senses. Drawing together evidence from neuroscience, computational neuroscience, classical AI, and connectionism, Downing focuses on search and representation as defining activities of intelligence, and as critical prerequisites to emergence in both the human brain and in computational models. This is a detailed and rich approach that will serve as an example of careful thought and scholarship in this area. Moreover, Downing's work will provide a strong technical foundation for continued scientific analysis of the emergence of intelligence.
Jeff Shrager, Senior Fellow, CommerceNet; Consulting Professor of Symbolic Systems, Stanford University
Early on in the book Downing formulates his ambition: to bring emergence to the forefront and to keep it there. The result is an impressively wide-ranging and engaging exploration of complex adaptive systems in three time frames: evolution, development, and learning. I warmly recommend this book to anyone with a serious interest in embodied cognition and bio-inspired artificial intelligence.
Tom Ziemke, Professor of Cognitive Science, University of Skövde and Linköping University, Sweden
In this refreshing and forward-thinking exploration of natural and artificial intelligence, Keith Downing focuses on the emergence of adaptive complexity in learning, development, and evolution. While the discussion is rooted in traditional concepts of search and representation, the book charts an ambitious path for biologically inspired artificial intelligence in the future.
Lee Spector, Professor of Computer Science, Hampshire College; author of Automatic Quantum Computer Programming: A Genetic Programming Approach; Editor-in-Chief of Genetic Programming and Evolvable Machines