1 Getting Started with Scientific Python  1
1.1 Installation and Setup  2
1.2 Numpy  4
1.2.1 Numpy Arrays and Memory  6
1.2.2 Numpy Matrices  9
1.2.3 Numpy Broadcasting  10
1.2.4 Numpy Masked Arrays  13
1.2.5 Floating-Point Numbers  13
1.2.6 Numpy Optimizations and Prospectus  17
1.3 Matplotlib  17
1.3.1 Alternatives to Matplotlib  19
1.3.2 Extensions to Matplotlib  20
1.4 IPython  20
1.5 Jupyter Notebook  22
1.6 Scipy  24
1.7 Pandas  25
1.7.1 Series  25
1.7.2 Dataframe  27
1.8 Sympy  30
1.9 Interfacing with Compiled Libraries  32
1.10 Integrated Development Environments  33
1.11 Quick Guide to Performance and Parallel Programming  34
1.12 Other Resources  37
References  38
2 Probability  39
2.1 Introduction  39
2.1.1 Understanding Probability Density  40
2.1.2 Random Variables  41
2.1.3 Continuous Random Variables  46
2.1.4 Transformation of Variables Beyond Calculus  49
2.1.5 Independent Random Variables  51
2.1.6 Classic Broken Rod Example  53
2.2 Projection Methods  55
2.2.1 Weighted Distance  57
2.3 Conditional Expectation as Projection  58
2.3.1 Appendix  64
2.4 Conditional Expectation and Mean Squared Error  65
2.5 Worked Examples of Conditional Expectation and Mean Square Error Optimization  68
2.5.1 Example  69
2.5.2 Example  72
2.5.3 Example  75
2.5.4 Example  78
2.5.5 Example  79
2.5.6 Example  82
2.6 Useful Distributions  83
2.6.1 Normal Distribution  83
2.6.2 Multinomial Distribution  84
2.6.3 Chi-square Distribution  86
2.6.4 Poisson and Exponential Distributions  89
2.6.5 Gamma Distribution  90
2.6.6 Beta Distribution  91
2.6.7 Dirichlet-Multinomial Distribution  93
2.7 Information Entropy  95
2.7.1 Information Theory Concepts  96
2.7.2 Properties of Information Entropy  98
2.7.3 Kullback–Leibler Divergence  99
2.7.4 Cross-Entropy as Maximum Likelihood  100
2.8 Moment Generating Functions  101
2.9 Monte Carlo Sampling Methods  104
2.9.1 Inverse CDF Method for Discrete Variables  105
2.9.2 Inverse CDF Method for Continuous Variables  107
2.9.3 Rejection Method  108
2.10 Sampling Importance Resampling  113
2.11 Useful Inequalities  115
2.11.1 Markov's Inequality  115
2.11.2 Chebyshev's Inequality  116
2.11.3 Hoeffding's Inequality  118
References  120
3 Statistics  123
3.1 Introduction  123
3.2 Python Modules for Statistics  124
3.2.1 Scipy Statistics Module  124
3.2.2 Sympy Statistics Module  125
3.2.3 Other Python Modules for Statistics  126
3.3 Types of Convergence  126
3.3.1 Almost Sure Convergence  126
3.3.2 Convergence in Probability  129
3.3.3 Convergence in Distribution  131
3.3.4 Limits and Probabilities  132
3.4 Estimation Using Maximum Likelihood  133
3.4.1 Setting Up the Coin-Flipping Experiment  135
3.4.2 Delta Method  145
3.5 Hypothesis Testing and P-Values  147
3.5.1 Back to the Coin-Flipping Example  149
3.5.2 Receiver Operating Characteristic  152
3.5.3 P-Values  154
3.5.4 Test Statistics  155
3.5.5 Testing Multiple Hypotheses  163
3.5.6 Fisher Exact Test  163
3.6 Confidence Intervals  166
3.7 Linear Regression  169
3.7.1 Extensions to Multiple Covariates  178
3.8 Maximum A-Posteriori  183
3.9 Robust Statistics  188
3.10 Bootstrapping  195
3.10.1 Parametric Bootstrap  200
3.11 Gauss-Markov  201
3.12 Nonparametric Methods  205
3.12.1 Kernel Density Estimation  205
3.12.2 Kernel Smoothing  207
3.12.3 Nonparametric Regression Estimators  213
3.12.4 Nearest Neighbors Regression  214
3.12.5 Kernel Regression  218
3.12.6 Curse of Dimensionality  219
3.12.7 Nonparametric Tests  221
3.13 Survival Analysis  228
3.13.1 …  231
References  236
4 Machine Learning  237
4.1 Introduction  237
4.2 Python Machine Learning Modules  237
4.3 Theory of Learning  241
4.3.1 Introduction to Theory of Machine Learning  244
4.3.2 Theory of Generalization  249
4.3.3 Worked Example for Generalization/Approximation Complexity  250
4.3.4 Cross-Validation  256
4.3.5 Bias and Variance  260
4.3.6 Learning Noise  265
4.4 Decision Trees  268
4.4.1 Random Forests  275
4.4.2 …  277
4.5 Boosting Trees  281
4.5.1 Gradient Boosting  281
4.6 Logistic Regression  285
4.7 Generalized Linear Models  295
4.8 Regularization  300
4.8.1 Ridge Regression  304
4.8.2 Lasso Regression  309
4.9 Support Vector Machines  311
4.9.1 Kernel Tricks  315
4.10 Dimensionality Reduction  317
4.10.1 Independent Component Analysis  321
4.11 Clustering  325
4.12 Ensemble Methods  329
4.12.1 Bagging  329
4.12.2 Boosting  331
4.13 Deep Learning  334
4.13.1 Introduction to Tensorflow  343
4.13.2 Understanding Gradient Descent  350
4.13.3 Image Processing Using Convolutional Neural Networks  363
References  379
Notation  381
Index  383