
An Introductory Course in Computational Neuroscience [Hardback]

4.44/5 (18 ratings by Goodreads)
Paul Miller (Brandeis University)
  • Format: Hardback, 408 pages, height x width: 229x178 mm, 127 b&w illus.
  • Series: Computational Neuroscience Series
  • Publication date: 02-Oct-2018
  • Publisher: MIT Press
  • ISBN-10: 0262038250
  • ISBN-13: 9780262038256
  • Price: 70.32 €
  • Delivery time is 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.

A textbook for students with limited background in mathematics and computer coding, emphasizing computer tutorials that guide readers in producing models of neural behavior.

This introductory text teaches students to understand, simulate, and analyze the complex behaviors of individual neurons and brain circuits. It is built around computer tutorials that guide students in producing models of neural behavior, with the associated MATLAB code freely available online. From these models students learn how individual neurons function and how, when connected, neurons cooperate in a circuit. The book demonstrates through simulated models how oscillations, multistability, post-stimulus rebounds, and chaos can arise within either single neurons or circuits, and it explores their roles in the brain.

The book first presents essential background in neuroscience, physics, mathematics, and Matlab, with explanations illustrated by many example problems. Subsequent chapters cover the neuron and spike production; single spike trains and the underlying cognitive processes; conductance-based models; the simulation of synaptic connections; firing-rate models of large-scale circuit operation; dynamical systems and their components; synaptic plasticity; and techniques for analysis of neuron population datasets, including principal components analysis, hidden Markov modeling, and Bayesian decoding.
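
To give a flavor of the kind of model these tutorials develop (the book's own MATLAB code is available online; the sketch below is a generic illustration rather than the book's code, with arbitrarily chosen parameter values), a leaky integrate-and-fire neuron can be simulated in a few lines of MATLAB using the forward Euler method:

% Minimal leaky integrate-and-fire (LIF) neuron, integrated with the
% forward Euler method. All parameter values are illustrative.
dt = 1e-4;                 % time step (s)
t = 0:dt:0.5;              % time vector (s)
E_L = -70e-3;              % leak reversal potential (V)
V_th = -50e-3;             % spike threshold (V)
V_reset = -80e-3;          % reset potential (V)
R_m = 100e6;               % membrane resistance (Ohm)
tau_m = 10e-3;             % membrane time constant (s)
I_app = 0.21e-9;           % applied current (A)

V = zeros(size(t));        % membrane potential at each time step
V(1) = E_L;                % start at the leak potential

for i = 2:length(t)        % forward Euler update of dV/dt
    dVdt = (E_L - V(i-1) + R_m*I_app)/tau_m;
    V(i) = V(i-1) + dt*dVdt;
    if V(i) > V_th         % threshold crossing: spike and reset
        V(i) = V_reset;
    end
end

plot(t, V*1e3)             % membrane potential trace in mV
xlabel('Time (s)')
ylabel('V_m (mV)')

With these particular values the steady-state potential in the absence of a threshold would be E_L + R_m*I_app = -49 mV, just above the -50 mV threshold, so the model spikes repeatedly; currents below 0.2 nA leave it silent. Sweeping I_app and counting spikes yields the f-I curve that the book's Tutorial 2.1 asks students to produce.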

Accessible to undergraduates in life sciences with limited background in mathematics and computer coding, the book can be used in a “flipped” or “inverted” teaching approach, with class time devoted to hands-on work on the computer tutorials. It can also be a resource for graduate students in the life sciences who wish to gain computing skills and a deeper knowledge of neural function and neural circuits.

Series Foreword
Acknowledgments
Preface
1 Preliminary Material
1.1 Introduction
1.1.1 The Cell, the Circuit, and the Brain
1.1.2 Physics of Electrical Circuits
1.1.3 Mathematical Preliminaries
1.1.4 Writing Computer Code
1.2 The Neuron, the Circuit, and the Brain
1.2.1 The Cellular Level
1.2.2 The Circuit Level
1.2.3 The Regional Level
1.3 Physics of Electrical Circuits
1.3.1 Terms and Properties
1.3.2 Pumps, Reservoirs, and Pipes
1.3.3 Some Peculiarities of the Electrical Properties of Neurons
1.4 Mathematical Background
1.4.1 Ordinary Differential Equations
1.4.2 Vectors, Matrices, and Their Basic Operations
1.4.3 Probability and Bayes' Theorem
1.5 Introduction to Computing and MATLAB
1.5.1 Basic Commands
1.5.2 Arrays
1.5.3 Allocation of Memory
1.5.4 Using the Colon (:) Symbol
1.5.5 Saving Your Work
1.5.6 Plotting Graphs
1.5.7 Vector and Matrix Operations in MATLAB
1.5.8 Conditionals
1.5.9 Loops
1.5.10 Functions
1.5.11 Some Operations Useful for Modeling Neurons
1.5.12 Good Coding Practice
1.6 Solving Ordinary Differential Equations (ODEs)
1.6.1 Forward Euler Method
1.6.2 Simulating ODEs with MATLAB
1.6.3 Solving Coupled ODEs with Multiple Variables
1.6.4 Solving ODEs with Nested for Loops
1.6.5 Comparing Simulation Methods
1.6.6 Euler-Maruyama Method: Forward Euler with White Noise
2 The Neuron and Minimal Spiking Models
2.1 The Nernst Equilibrium Potential
2.2 An Equivalent Circuit Model of the Neural Membrane
2.2.1 Depolarization versus Hyperpolarization
2.3 The Leaky Integrate-and-Fire Model
2.3.1 Specific versus Absolute Properties of the Cell
2.3.2 Firing Rate as a Function of Current (f-I Curve) of the Leaky Integrate-and-Fire Model
2.4 Tutorial 2.1: The f-I Curve of the Leaky Integrate-and-Fire Neuron
2.5 Extensions of the Leaky Integrate-and-Fire Model
2.5.1 Refractory Period
2.5.2 Spike-Rate Adaptation (SRA)
2.6 Tutorial 2.2: Modeling the Refractory Period
2.7 Further Extensions of the Leaky Integrate-and-Fire Model
2.7.1 Exponential Leaky Integrate-and-Fire (ELIF) Model
2.7.2 Two-Variable Models: The Adaptive Exponential Leaky Integrate-and-Fire (AELIF) Neuron
2.7.3 Limitations of the LIF Formalism
2.8 Tutorial 2.3: Models Based on Extensions of the LIF Neuron
2.9 Appendix: Calculation of the Nernst Potential
3 Analysis of Individual Spike Trains
3.1 Responses of Single Neurons
3.1.1 Receptive Fields
3.1.2 Time-Varying Responses and the Peristimulus Time Histogram (PSTH)
3.1.3 Neurons as Linear Filters and the Linear-Nonlinear Model
3.1.4 Spike-Triggered Average
3.1.5 White-Noise Stimuli for Receptive Field Generation
3.1.6 Spatiotemporal Receptive Fields
3.2 Tutorial 3.1: Generating Receptive Fields with Spike-Triggered Averages
3.3 Spike-Train Statistics
3.3.1 Coefficient of Variation (CV) of Interspike Intervals
3.3.2 Fano Factor
3.3.3 The Homogeneous Poisson Process: A Random Point Process for Artificial Spike Trains
3.3.4 Comments on Analyses and Use of Dummy Data
3.4 Tutorial 3.2: Statistical Properties of Simulated Spike Trains
3.5 Receiver-Operating Characteristic (ROC)
3.5.1 Producing the ROC Curve
3.5.2 Optimal Position of the Threshold
3.5.3 Uncovering the Underlying Distributions from Binary Responses: Recollection versus Familiarity
3.6 Tutorial 3.3: Receiver-Operating Characteristic of a Noisy Neuron
3.7 Appendix A: The Poisson Process
3.7.1 The Poisson Distribution
3.7.2 Expected Value of the Mean of a Poisson Process
3.7.3 Fano Factor of the Poisson Process
3.7.4 The Coefficient of Variation (CV) of the ISI Distribution of a Poisson Process
3.7.5 Selecting from a Probability Distribution: Generating ISIs for the Poisson Process
3.8 Appendix B: Stimulus Discriminability
3.8.1 Optimal Value of Threshold
3.8.2 Calculating the Probability of an Error
3.8.3 Generating a Z-Score from a Probability
4 Conductance-Based Models
4.1 Introduction to the Hodgkin-Huxley Model
4.1.1 Positive versus Negative Feedback
4.1.2 Voltage Clamp versus Current Clamp
4.2 Simulation of the Hodgkin-Huxley Model
4.2.1 Two-State Systems
4.2.2 Full Set of Dynamical Equations for the Hodgkin-Huxley Model
4.2.3 Dynamical Behavior of the Hodgkin-Huxley Model: A Type-II Neuron
4.3 Tutorial 4.1: The Hodgkin-Huxley Model as an Oscillator
4.4 The Connor-Stevens Model: A Type-I Model
4.5 Calcium Currents and Bursting
4.5.1 Thalamic Rebound and the T-Type Calcium Channel
4.6 Tutorial 4.2: Postinhibitory Rebound
4.7 Modeling Multiple Compartments
4.7.1 The Pinsky-Rinzel Model of an Intrinsic Burster
4.7.2 Simulating the Pinsky-Rinzel Model
4.7.3 A Note on Multicompartmental Modeling with Specific Conductances versus Absolute Conductances
4.7.4 Model Complexity
4.8 Hyperpolarization-Activated Currents (Ih) and Pacemaker Control
4.9 Dendritic Computation
4.10 Tutorial 4.3: A Two-Compartment Model of an Intrinsically Bursting Neuron
5 Connections between Neurons
5.1 The Synapse
5.1.1 Electrical Synapses
5.1.2 Chemical Synapses
5.2 Modeling Synaptic Transmission through Chemical Synapses
5.2.1 Spike-Induced Transmission
5.2.2 Graded Release
5.3 Dynamical Synapses
5.3.1 Short-Term Synaptic Depression
5.3.2 Short-Term Synaptic Facilitation
5.3.3 Modeling Dynamical Synapses
5.4 Tutorial 5.1: Synaptic Responses to Changes in Inputs
5.5 The Connectivity Matrix
5.5.1 General Types of Connectivity Matrices
5.5.2 Cortical Connections: Sparseness and Structure
5.5.3 Motifs
5.6 Tutorial 5.2: Detecting Circuit Structure and Nonrandom Features within a Connectivity Matrix
5.7 Oscillations and Multistability in Small Circuits
5.8 Central Pattern Generators
5.8.1 The Half-Center Oscillator
5.8.2 The Triphasic Rhythm
5.8.3 Phase Response Curves
5.9 Tutorial 5.3: Bistability and Oscillations from Two LIF Neurons
5.10 Appendix: Synaptic Input Produced by a Poisson Process
5.10.1 Synaptic Saturation
5.10.2 Synaptic Depression
5.10.3 Synaptic Facilitation
5.10.4 Notes on Combining Mechanisms
6 Firing-Rate Models and Network Dynamics
6.1 Firing-Rate Models
6.2 Simulating a Firing-Rate Model
6.2.1 Meaning of a Unit and Dale's Principle
6.3 Recurrent Feedback and Bistability
6.3.1 Bistability from Positive Feedback
6.3.2 Limiting the Maximum Firing Rate Reached
6.3.3 Dynamics of Synaptic Response
6.3.4 Dynamics of Synaptic Depression and Facilitation
6.3.5 Integration and Parametric Memory
6.4 Tutorial 6.1: Bistability and Oscillations in a Firing-Rate Model with Feedback
6.5 Decision-Making Circuits
6.5.1 Decisions by Integration of Evidence
6.5.2 Decision-Making Performance
6.5.3 Decisions as State Transitions
6.5.4 Biasing Decisions
6.6 Tutorial 6.2: Dynamics of a Decision-Making Circuit in Two Modes of Operation
6.7 Oscillations from Excitatory and Inhibitory Feedback
6.8 Tutorial 6.3: Frequency of an Excitatory-Inhibitory Coupled Unit Oscillator and PING
6.9 Orientation Selectivity and Contrast Invariance
6.9.1 Ring Models
6.10 Ring Attractors for Spatial Memory and Head Direction
6.10.1 Dynamics of the Ring Attractor
6.11 Tutorial 6.4: Orientation Selectivity in a Ring Model
7 An Introduction to Dynamical Systems
7.1 What Is a Dynamical System?
7.2 Single Variable Behavior and Fixed Points
7.2.1 Bifurcations
7.2.2 Requirement for Oscillations
7.3 Models with Two Variables
7.3.1 Nullclines and Phase-Plane Analysis
7.3.2 The Inhibition-Stabilized Network
7.3.3 How Inhibitory Feedback to Inhibitory Neurons Impacts Stability of States
7.4 Tutorial 7.1: The Inhibition-Stabilized Circuit
7.5 Attractor State Itinerancy
7.5.1 Bistable Percepts
7.5.2 Noise-Driven Transitions in a Bistable System
7.6 Quasistability and Relaxation Oscillators: The FitzHugh-Nagumo Model
7.7 Heteroclinic Sequences
7.8 Chaos
7.8.1 Chaotic Systems and Lack of Predictability
7.8.2 Examples of Chaotic Neural Circuits
7.9 Criticality
7.9.1 Power-Law Distributions
7.9.2 Requirements for Criticality
7.9.3 A Simplified Avalanche Model with a Subset of the Features of Criticality
7.10 Tutorial 7.2: Diverse Dynamical Systems from Similar Circuit Architectures
7.11 Appendix: Proof of the Scaling Relationship for Avalanche Sizes
8 Learning and Synaptic Plasticity
8.1 Hebbian Plasticity
8.1.1 Modeling Hebbian Plasticity
8.2 Tutorial 8.1: Pattern Completion and Pattern Separation via Hebbian Learning
8.3 Spike-Timing Dependent Plasticity (STDP)
8.3.1 Model of STDP
8.3.2 Synaptic Competition via STDP
8.3.3 Sequence Learning via STDP
8.3.4 Triplet STDP
8.3.5 A Note on Spike-Timing Dependent Plasticity
8.3.6 Mechanisms of Spike-Timing Dependent Synaptic Plasticity
8.4 More Detailed Empirical Models of Synaptic Plasticity
8.5 Tutorial 8.2: Competition via STDP
8.6 Homeostasis
8.6.1 Firing-Rate Homeostasis
8.6.2 Homeostasis of Synaptic Inputs
8.6.3 Homeostasis of Intrinsic Properties
8.7 Supervised Learning
8.7.1 Conditioning
8.7.2 Reward Prediction Errors and Reinforcement Learning
8.7.3 The Weather-Prediction Task
8.7.4 Calculations Required in the Weather-Prediction Task
8.8 Tutorial 8.3: Learning the Weather-Prediction Task in a Neural Circuit
8.9 Eyeblink Conditioning
8.10 Tutorial 8.4: A Model of Eyeblink Conditioning
8.11 Appendix A: Rate-Dependent Plasticity via STDP between Uncorrelated Poisson Spike Trains
8.12 Appendix B: Rate-Dependence of Triplet STDP between Uncorrelated Poisson Spike Trains
9 Analysis of Population Data
9.1 Principal Component Analysis (PCA)
9.1.1 PCA for Sorting of Spikes
9.1.2 PCA for Analysis of Firing Rates
9.1.3 PCA in Practice
9.1.4 The Procedure of PCA
9.2 Tutorial 9.1: Principal Component Analysis of Firing-Rate Trajectories
9.3 Single-Trial versus Trial-Averaged Analyses
9.4 Change-Point Detection
9.4.1 Computational Note
9.5 Hidden Markov Modeling (HMM)
9.6 Tutorial 9.2: Change-Point Detection for a Poisson Process
9.7 Decoding Position from Multiple Place Fields
9.8 Appendix A: How PCA Works: Choosing a Direction to Maximize the Variance of the Projected Data
9.8.1 Carrying out PCA without a Built-in Function
9.9 Appendix B: Determining the Probability of Change Points for a Poisson Process
9.9.1 Optimal Rate
9.9.2 Evaluating the Change Point, Method 1
9.9.3 Evaluating the Change Point, Method 2
References
Index