1. The Challenge of Complex Systems ... 1 (35)
   1.1 What Are Complex Systems? ... 1 (4)
   1.2 How to Deal with Complex Systems ... 5 (2)
   1.3 ... 7 (3)
   1.4 ... 10 (1)
   1.5 Aiming at Universality ... 11 (3)
      1.5.1 ... 11 (1)
      1.5.2 Statistical Physics ... 12 (1)
      1.5.3 ... 13 (1)
   1.6 ... 14 (19)
      1.6.1 Shannon Information: Meaning Exorcised ... 15 (1)
      1.6.2 Effects of Information ... 16 (7)
      1.6.3 Self-Creation of Meaning ... 23 (6)
      1.6.4 How Much Information Do We Need to Maintain an Ordered State? ... 29 (4)
   1.7 The Second Foundation of Synergetics ... 33 (3)
|
2. From the Microscopic to the Macroscopic World ... 36 (17)
   2.1 Levels of Description ... 36 (1)
   2.2 ... 37 (3)
   2.3 Fokker-Planck Equation ... 40 (1)
   2.4 Exact Stationary Solution of the Fokker-Planck Equation for Systems in Detailed Balance ... 41 (3)
      2.4.1 ... 41 (1)
      2.4.2 The Required Structure of the Fokker-Planck Equation and Its Stationary Solution ... 42 (2)
   2.5 ... 44 (1)
   2.6 Reduction of Complexity, Order Parameters and the Slaving Principle ... 45 (4)
      2.6.1 Linear Stability Analysis ... 46 (1)
      2.6.2 Transformation of Evolution Equations ... 47 (1)
      2.6.3 The Slaving Principle ... 48 (1)
   2.7 Nonequilibrium Phase Transitions ... 49 (2)
   2.8 ... 51 (2)
|
3. ...and Back Again: The Maximum Information Principle (MIP) ... 53 (12)
   3.1 ... 53 (4)
   3.2 ... 57 (1)
   3.3 Information Entropy and Constraints ... 58 (5)
   3.4 ... 63 (2)
|
4. An Example from Physics: Thermodynamics ... 65 (4)
|
5. Application of the Maximum Information Principle to Self-Organizing Systems ... 69 (5)
   5.1 ... 69 (1)
   5.2 Application to Self-Organizing Systems: Single Mode Laser ... 69 (2)
   5.3 Multimode Laser Without Phase Relations ... 71 (1)
   5.4 Processes Periodic in Order Parameters ... 72 (2)
|
6. The Maximum Information Principle for Nonequilibrium Phase Transitions: Determination of Order Parameters, Enslaved Modes, and Emerging Patterns ... 74 (7)
   6.1 ... 74 (1)
   6.2 ... 74 (2)
   6.3 Determination of Order Parameters, Enslaved Modes, and Emerging Patterns ... 76 (1)
   6.4 ... 77 (1)
   6.5 ... 78 (1)
   6.6 Relation to the Landau Theory of Phase Transitions. Guessing of Fokker-Planck Equations ... 79 (2)
|
7. Information, Information Gain, and Efficiency of Self-Organizing Systems Close to Their Instability Points ... 81 (34)
   7.1 ... 81 (1)
   7.2 The Slaving Principle and Its Application to Information ... 82 (1)
   7.3 ... 82 (1)
   7.4 An Example: Nonequilibrium Phase Transitions ... 83 (1)
   7.5 Soft Single-Mode Instabilities ... 84 (1)
   7.6 Can We Measure the Information and the Information Gain? ... 85 (2)
      7.6.1 ... 85 (1)
      7.6.2 Information and Information Gain ... 86 (1)
   7.7 Several Order Parameters ... 87 (1)
   7.8 Explicit Calculation of the Information of a Single Order Parameter ... 88 (7)
      7.8.1 The Region Well Below Threshold ... 89 (1)
      7.8.2 The Region Well Above Threshold ... 90 (3)
      7.8.3 ... 93 (1)
      7.8.4 ... 94 (1)
   7.9 Exact Analytical Results on Information, Information Gain, and Efficiency of a Single Order Parameter ... 95 (7)
      7.9.1 The Instability Point ... 97 (1)
      7.9.2 The Approach to Instability ... 98 (1)
      7.9.3 ... 99 (1)
      7.9.4 The Injected Signal ... 100 (1)
      7.9.5 ... 101 (1)
   7.10 The S-Theorem of Klimontovich ... 102 (5)
      7.10.1 Region 1: Below Laser Threshold ... 104 (1)
      7.10.2 Region 2: At Threshold ... 104 (1)
      7.10.3 Region 3: Well Above Threshold ... 105 (2)
   7.11 The Contribution of the Enslaved Modes to the Information Close to Nonequilibrium Phase Transitions ... 107 (8)
|
8. Direct Determination of Lagrange Multipliers ... 115 (10)
   8.1 Information Entropy of Systems Below and Above Their Critical Point ... 115 (2)
   8.2 Direct Determination of Lagrange Multipliers Below, At and Above the Critical Point ... 117 (8)
|
9. Unbiased Modeling of Stochastic Processes: How to Guess Path Integrals, Fokker-Planck Equations and Langevin-Ito Equations ... 125 (10)
   9.1 One-Dimensional State Vector ... 125 (2)
   9.2 Generalization to a Multidimensional State Vector ... 127 (3)
   9.3 Correlation Functions as Constraints ... 130 (2)
   9.4 The Fokker-Planck Equation Belonging to the Short-Time Propagator ... 132 (1)
   9.5 Can We Derive Newton's Law from Experimental Data? ... 133 (2)
10. Application to Some Physical Systems ... 135 (5)
   10.1 Multimode Lasers with Phase Relations ... 135 (1)
   10.2 The Single-Mode Laser Including Polarization and Inversion ... 136 (2)
   10.3 Fluid Dynamics: The Convection Instability ... 138 (2)
11. Transitions Between Behavioral Patterns in Biology, An Example: Hand Movements ... 140 (13)
   11.1 Some Experimental Facts ... 140 (1)
   11.2 How to Model the Transition ... 141 (6)
   11.3 Critical Fluctuations ... 147 (4)
   11.4 ... 151 (2)
12. Pattern Recognition. Unbiased Guesses of Processes: Explicit Determination of Lagrange Multipliers ... 153 (42)
   12.1 ... 153 (6)
   12.2 An Algorithm for Pattern Recognition ... 159 (2)
   12.3 The Basic Construction Principle of a Synergetic Computer ... 161 (2)
   12.4 Learning by Means of the Information Gain ... 163 (2)
   12.5 Processes and Associative Action ... 165 (4)
   12.6 Explicit Determination of the Lagrange Multipliers of the Conditional Probability. General Approach for Discrete and Continuous Processes ... 169 (5)
   12.7 Approximation and Smoothing Schemes. Additive Noise ... 174 (7)
   12.8 An Explicit Example: Brownian Motion ... 181 (3)
   12.9 Approximation and Smoothing Schemes. Multiplicative (and Additive) Noise ... 184 (1)
   12.10 Explicit Calculation of Drift and Diffusion Coefficients. Examples ... 185 (2)
   12.11 Process Modelling, Prediction and Control, Robotics ... 187 (2)
   12.12 Non-Markovian Processes. Connection with Chaos Theory ... 189 (6)
      12.12.1 Checking the Markov Property ... 189 (1)
      12.12.2 Time Series Analysis ... 190 (5)
13. Information Compression in Cognition: The Interplay between Shannon and Semantic Information ... 195 (8)
   13.1 Information Compression: A General Formula ... 195 (2)
   13.2 Pattern Recognition as Information Compression: Use of Symmetries ... 197 (2)
   13.3 ... 199 (2)
   13.4 Reinterpretation of the Results of Sects. 13.1-13.3 ... 201 (2)
14. Quantum Systems ... 203 (13)
   14.1 Why Quantum Theory of Information? ... 203 (2)
   14.2 The Maximum Information Principle ... 205 (6)
   14.3 Order Parameters, Enslaved Modes and Patterns ... 211 (3)
   14.4 Information of Order Parameters and Enslaved Modes ... 214 (2)
15. Quantum Information ... 216 (6)
   15.1 Basic Concepts of Quantum Information. Q-bits ... 216 (2)
   15.2 Phase and Decoherence ... 218 (1)
   15.3 Representation of Numbers ... 219 (1)
   15.4 ... 220 (1)
   15.5 ... 221 (1)
16. Quantum Computation ... 222 (20)
   16.1 ... 222 (1)
   16.2 ... 223 (4)
   16.3 Calculation of the Period of a Sequence by a Quantum Computer ... 227 (2)
   16.4 Coding, Decoding and Breaking Codes ... 229 (1)
      16.4.1 A Little Mathematics ... 230 (1)
      16.4.2 RSA Coding and Decoding ... 230 (1)
      16.4.3 Shor's Approach, Continued ... 231 (2)
   16.5 The Physics of Spin 1/2 ... 233 (2)
   16.6 Quantum Theory of a Spin in Mutually Perpendicular Magnetic Fields, One Constant and One Time Dependent ... 235 (6)
   16.7 Quantum Computation and Self-Organization ... 241 (1)
17. Concluding Remarks and Outlook ... 242 (2)
References ... 244 (7)
Subject Index ... 251