
Information and Self-Organization: A Macroscopic Approach to Complex Systems 3rd enlarged ed. 2006 [Hardback]

4.33/5 (11 ratings by Goodreads)
  • Format: Hardback, 258 pages, height x width: 235x155 mm, weight: 1270 g, XIV, 258 p., 1 Hardback
  • Series: Springer Series in Synergetics
  • Publication date: 26-Jun-2006
  • Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. K
  • ISBN-10: 3540330216
  • ISBN-13: 9783540330219
  • Hardback
  • Price: 46,91 €*
  • * this is the final price, i.e., no additional discounts apply
  • List price: 55,19 €
  • Save 15%
  • Delivery takes 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.
  • Delivery time: 4-6 weeks
This book presents the concepts needed to deal with self-organizing complex systems from a unifying point of view that uses macroscopic data. The various meanings of the concept "information" are discussed and a general formulation of the maximum information (entropy) principle is used. With the aid of results from synergetics, adequate objective constraints for a large class of self-organizing systems are formulated and examples are given from physics, life and computer science. The relationship to chaos theory is examined and it is further shown that, based on possibly scarce and noisy data, unbiased guesses about processes of complex systems can be made and the underlying deterministic and random forces determined. This allows for probabilistic predictions of processes, with applications to numerous fields in science, technology, medicine and economics. The extensions of the third edition are essentially devoted to an introduction to the meaning of information in the quantum context. Indeed, quantum information science and technology is presently one of the most active fields of research at the interface of physics, technology and information sciences and has already established itself as one of the major future technologies for processing and communicating information on any scale. This book addresses graduate students and nonspecialist researchers wishing to get acquainted with the concept of information from a scientific perspective in more depth. It is suitable as a textbook for advanced courses or for self-study.
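The maximum information (entropy) principle mentioned in the blurb can be illustrated with a minimal sketch: among all distributions over a set of states consistent with a macroscopic constraint (here, a prescribed mean "energy"), the principle selects the one of maximal Shannon entropy, which takes the Gibbs form p_i ∝ exp(-λE_i) with the Lagrange multiplier λ fixed by the constraint. This is an illustrative example, not code from the book; the energies and the target mean are arbitrary choices.

```python
import math

# Discrete states with assumed energies E_i, and a macroscopic datum:
# the required mean energy (both values are arbitrary for illustration).
energies = [0.0, 1.0, 2.0, 3.0]
target_mean = 1.2

def gibbs(lam):
    """Maximum-entropy distribution p_i ∝ exp(-lam * E_i)."""
    w = [math.exp(-lam * e) for e in energies]
    z = sum(w)
    return [x / z for x in w]

def mean_energy(lam):
    return sum(p * e for p, e in zip(gibbs(lam), energies))

# Fix the Lagrange multiplier by solving mean_energy(lam) = target_mean.
# mean_energy is monotonically decreasing in lam, so bisection works.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = gibbs(lam)
entropy = -sum(pi * math.log(pi) for pi in p)
```

Any other distribution satisfying the same mean-energy constraint has strictly lower entropy, which is the sense in which the Gibbs form is the "unbiased guess" given only macroscopic data.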

Reviews

From the reviews of the third edition:

"This enlarged edition of Information and Self-Organization addresses the concept of information in depth: ranging from Shannon information, from which all semantics has been exorcised, to the effects of information on receivers and the self-creation of meaningthat is, toward semantic information . Nevertheless, both the qualitative lessons and quantitative analysis presented in the book very useful for artificial life researchers." (Mikhail Prokopenko, Artificial Life, Vol. 15, 2009)

1. The Challenge of Complex Systems
1(35)
1.1 What Are Complex Systems?
1(4)
1.2 How to Deal with Complex Systems
5(2)
1.3 Model Systems
7(3)
1.4 Self-Organization
10(1)
1.5 Aiming at Universality
11(3)
1.5.1 Thermodynamics
11(1)
1.5.2 Statistical Physics
12(1)
1.5.3 Synergetics
13(1)
1.6 Information
14(19)
1.6.1 Shannon Information: Meaning Exorcised
15(1)
1.6.2 Effects of Information
16(7)
1.6.3 Self-Creation of Meaning
23(6)
1.6.4 How Much Information Do We Need to Maintain an Ordered State?
29(4)
1.7 The Second Foundation of Synergetics
33(3)
2. From the Microscopic to the Macroscopic World
36(17)
2.1 Levels of Description
36(1)
2.2 Langevin Equations
37(3)
2.3 Fokker-Planck Equation
40(1)
2.4 Exact Stationary Solution of the Fokker-Planck Equation for Systems in Detailed Balance
41(3)
2.4.1 Detailed Balance
41(1)
2.4.2 The Required Structure of the Fokker-Planck Equation and Its Stationary Solution
42(2)
2.5 Path Integrals
44(1)
2.6 Reduction of Complexity, Order Parameters and the Slaving Principle
45(4)
2.6.1 Linear Stability Analysis
46(1)
2.6.2 Transformation of Evolution Equations
47(1)
2.6.3 The Slaving Principle
48(1)
2.7 Nonequilibrium Phase Transitions
49(2)
2.8 Pattern Formation
51(2)
3. ...and Back Again: The Maximum Information Principle (MIP)
53(12)
3.1 Some Basic Ideas
53(4)
3.2 Information Gain
57(1)
3.3 Information Entropy and Constraints
58(5)
3.4 Continuous Variables
63(2)
4. An Example from Physics: Thermodynamics
65(4)
5. Application of the Maximum Information Principle to Self-Organizing Systems
69(5)
5.1 Introduction
69(1)
5.2 Application to Self-Organizing Systems: Single Mode Laser
69(2)
5.3 Multimode Laser Without Phase Relations
71(1)
5.4 Processes Periodic in Order Parameters
72(2)
6. The Maximum Information Principle for Nonequilibrium Phase Transitions: Determination of Order Parameters, Enslaved Modes, and Emerging Patterns
74(7)
6.1 Introduction
74(1)
6.2 General Approach
74(2)
6.3 Determination of Order Parameters, Enslaved Modes, and Emerging Patterns
76(1)
6.4 Approximations
77(1)
6.5 Spatial Patterns
78(1)
6.6 Relation to the Landau Theory of Phase Transitions. Guessing of Fokker-Planck Equations
79(2)
7. Information, Information Gain, and Efficiency of Self-Organizing Systems Close to Their Instability Points
81(34)
7.1 Introduction
81(1)
7.2 The Slaving Principle and Its Application to Information
82(1)
7.3 Information Gain
82(1)
7.4 An Example: Nonequilibrium Phase Transitions
83(1)
7.5 Soft Single-Mode Instabilities
84(1)
7.6 Can We Measure the Information and the Information Gain?
85(2)
7.6.1 Efficiency
85(1)
7.6.2 Information and Information Gain
86(1)
7.7 Several Order Parameters
87(1)
7.8 Explicit Calculation of the Information of a Single Order Parameter
88(7)
7.8.1 The Region Well Below Threshold
89(1)
7.8.2 The Region Well Above Threshold
90(3)
7.8.3 Numerical Results
93(1)
7.8.4 Discussion
94(1)
7.9 Exact Analytical Results on Information, Information Gain, and Efficiency of a Single Order Parameter
95(7)
7.9.1 The Instability Point
97(1)
7.9.2 The Approach to Instability
98(1)
7.9.3 The Stable Region
99(1)
7.9.4 The Injected Signal
100(1)
7.9.5 Conclusions
101(1)
7.10 The S-Theorem of Klimontovich
102(5)
7.10.1 Region 1: Below Laser Threshold
104(1)
7.10.2 Region 2: At Threshold
104(1)
7.10.3 Region 3: Well Above Threshold
105(2)
7.11 The Contribution of the Enslaved Modes to the Information Close to Nonequilibrium Phase Transitions
107(8)
8. Direct Determination of Lagrange Multipliers
115(10)
8.1 Information Entropy of Systems Below and Above Their Critical Point
115(2)
8.2 Direct Determination of Lagrange Multipliers Below, At and Above the Critical Point
117(8)
9. Unbiased Modeling of Stochastic Processes: How to Guess Path Integrals, Fokker-Planck Equations and Langevin-Ito Equations
125(10)
9.1 One-Dimensional State Vector
125(2)
9.2 Generalization to a Multidimensional State Vector
127(3)
9.3 Correlation Functions as Constraints
130(2)
9.4 The Fokker-Planck Equation Belonging to the Short-Time Propagator
132(1)
9.5 Can We Derive Newton's Law from Experimental Data?
133(2)
10. Application to Some Physical Systems
135(5)
10.1 Multimode Lasers with Phase Relations
135(1)
10.2 The Single-Mode Laser Including Polarization and Inversion
136(2)
10.3 Fluid Dynamics: The Convection Instability
138(2)
11. Transitions Between Behavioral Patterns in Biology, An Example: Hand Movements
140(13)
11.1 Some Experimental Facts
140(1)
11.2 How to Model the Transition
141(6)
11.3 Critical Fluctuations
147(4)
11.4 Some Conclusions
151(2)
12. Pattern Recognition. Unbiased Guesses of Processes: Explicit Determination of Lagrange Multipliers
153(42)
12.1 Feature Selection
153(6)
12.2 An Algorithm for Pattern Recognition
159(2)
12.3 The Basic Construction Principle of a Synergetic Computer
161(2)
12.4 Learning by Means of the Information Gain
163(2)
12.5 Processes and Associative Action
165(4)
12.6 Explicit Determination of the Lagrange Multipliers of the Conditional Probability. General Approach for Discrete and Continuous Processes
169(5)
12.7 Approximation and Smoothing Schemes. Additive Noise
174(7)
12.8 An Explicit Example: Brownian Motion
181(3)
12.9 Approximation and Smoothing Schemes. Multiplicative (and Additive) Noise
184(1)
12.10 Explicit Calculation of Drift and Diffusion Coefficients. Examples
185(2)
12.11 Process Modelling, Prediction and Control, Robotics
187(2)
12.12 Non-Markovian Processes. Connection with Chaos Theory
189(6)
12.12.1 Checking the Markov Property
189(1)
12.12.2 Time Series Analysis
190(5)
13. Information Compression in Cognition: The Interplay between Shannon and Semantic Information
195(8)
13.1 Information Compression: A General Formula
195(2)
13.2 Pattern Recognition as Information Compression: Use of Symmetries
197(2)
13.3 Deformations
199(2)
13.4 Reinterpretation of the Results of Sects. 13.1-13.3
201(2)
14. Quantum Systems
203(13)
14.1 Why Quantum Theory of Information?
203(2)
14.2 The Maximum Information Principle
205(6)
14.3 Order Parameters, Enslaved Modes and Patterns
211(3)
14.4 Information of Order Parameters and Enslaved Modes
214(2)
15. Quantum Information
216(6)
15.1 Basic Concepts of Quantum Information. Q-bits
216(2)
15.2 Phase and Decoherence
218(1)
15.3 Representation of Numbers
219(1)
15.4 Register
220(1)
15.5 Entanglement
221(1)
16. Quantum Computation
222(20)
16.1 Classical Gates
222(1)
16.2 Quantum Gates
223(4)
16.3 Calculation of the Period of a Sequence by a Quantum Computer
227(2)
16.4 Coding, Decoding and Breaking Codes
229(1)
16.4.1 A Little Mathematics
230(1)
16.4.2 RSA Coding and Decoding
230(1)
16.4.3 Shor's Approach, Continued
231(2)
16.5 The Physics of Spin 1/2
233(2)
16.6 Quantum Theory of a Spin in Mutually Perpendicular Magnetic Fields, One Constant and One Time Dependent
235(6)
16.7 Quantum Computation and Self-Organization
241(1)
17. Concluding Remarks and Outlook
242(2)
References
244(7)
Subject Index
251