1 From Logic to Cognitive Science  1
  1.1 The Beginnings of Artificial Neural Networks  1
  1.2 The XOR Problem  5
  1.3 From Cognitive Science to Deep Learning  8
  1.4 Neural Networks in the General AI Landscape  11
  1.5 Philosophical and Cognitive Aspects  12
  References  15

2 Mathematical and Computational Prerequisites  17
  2.1 Derivations and Function Minimization  17
  2.2 Vectors, Matrices and Linear Programming  25
  2.3 Probability Distributions  32
  2.4 Logic and Turing Machines  39
  2.5 Writing Python Code  41
  2.6 A Brief Overview of Python Programming  43
  References  49

3 Machine Learning Basics  51
  3.1 Elementary Classification Problem  51
  3.2 Evaluating Classification Results  57
  3.3 A Simple Classifier: Naive Bayes  59
  3.4 A Simple Neural Network: Logistic Regression  61
  3.5 Introducing the MNIST Dataset  68
  3.6 Learning Without Labels: K-Means  70
  3.7 Learning Different Representations: PCA  72
  3.8 Learning Language: The Bag of Words Representation  75
  References  77

4 Feedforward Neural Networks  79
  4.1 Basic Concepts and Terminology for Neural Networks  79
  4.2 Representing Network Components with Vectors and Matrices  82
  4.3 The Perceptron Rule  84
  4.4 The Delta Rule  87
  4.5 From the Logistic Neuron to Backpropagation  89
  4.6 Backpropagation  93
  4.7 A Complete Feedforward Neural Network  102
  References  105

5 Modifications and Extensions to a Feed-Forward Neural Network  107
  5.1 The Idea of Regularization  107
  5.2 L1 and L2 Regularization  109
  5.3 Learning Rate, Momentum and Dropout  111
  5.4 Stochastic Gradient Descent and Online Learning  116
  5.5 Problems for Multiple Hidden Layers: Vanishing and Exploding Gradients  118
  References  119

6 Convolutional Neural Networks  121
  6.1 A Third Visit to Logistic Regression  121
  6.2 Feature Maps and Pooling  125
  6.3 A Complete Convolutional Network  127
  6.4 Using a Convolutional Network to Classify Text  130
  References  132

7 Recurrent Neural Networks  135
  7.1 Sequences of Unequal Length  135
  7.2 The Three Settings of Learning with Recurrent Neural Networks  136
  7.3 Adding Feedback Loops and Unfolding a Neural Network  139
  7.4 Elman Networks  140
  7.5 Long Short-Term Memory  142
  7.6 Using a Recurrent Neural Network for Predicting Following Words  145
  References  152
|
8 Autoencoders  153
  8.1 Learning Representations  153
  8.2 Different Autoencoder Architectures  156
  8.3 Stacking Autoencoders  158
  8.4 Recreating the Cat Paper  161
  References  163
|
9 Neural Language Models  165
  9.1 Word Embeddings and Word Analogies  165
  9.2 CBOW and Word2vec  166
  9.3 Word2vec in Code  168
  9.4 Walking Through the Word-Space: An Idea That Has Eluded Symbolic AI  171
  References  173

10 An Overview of Different Neural Network Architectures  175
  10.1 Energy-Based Models  175
  10.2 Memory-Based Models  178
  10.3 The Kernel of General Connectionist Intelligence: The bAbI Dataset  181
  References  182
|
11 Conclusion  185
  11.1 An Incomplete Overview of Open Research Questions  185
  11.2 The Spirit of Connectionism and Philosophical Ties  186
  References  187

Index  189