Subspace Learning of Neural Networks [Hardback]

Jian Cheng Lv (Sichuan University, China), Zhang Yi (Sichuan University, China), Jiliu Zhou (Sichuan University, China)
  • Format: Hardback, 256 pages, height x width: 234x156 mm, weight: 500 g, 5 tables, black and white; 84 illustrations, black and white
  • Series: Automation and Control Engineering
  • Publication date: 29-Sep-2010
  • Publisher: CRC Press Inc
  • ISBN-10: 1439815356
  • ISBN-13: 9781439815359
  • Hardback
  • Price: 171.76 €
"Using real-life examples to illustrate the performance of learning algorithms and instructing readers how to apply them to practical applications, this work offers a comprehensive treatment of subspace learning algorithms for neural networks. The authors summarize a decade of high quality research offering a host of practical applications. They demonstrate ways to extend the use of algorithms to fields such as encryption communication, data mining, computer vision, and signal and image processing to name just a few. The brilliance of the work lies with how it coherently builds a theoretical understanding of the convergence behavior of subspace learning algorithms through a summary of chaotic behaviors"--

Provided by publisher.
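For orientation, below is a minimal sketch (in Python/NumPy) of Oja's single-unit PCA learning rule, the prototypical subspace learning algorithm analyzed in Chapter 2. The variable names, learning rate, and synthetic data are illustrative assumptions and are not taken from the book; the book's DDT analysis studies the corresponding averaged iteration w(k+1) = w(k) + eta*(C*w(k) - (w(k)^T C w(k)) w(k)), where C is the covariance matrix.

    import numpy as np

    # Minimal illustrative sketch of Oja's single-unit PCA learning rule.
    # All names, the learning rate, and the synthetic data are assumptions
    # made for this example; they are not taken from the book.

    rng = np.random.default_rng(0)

    # Zero-mean synthetic data with a clearly dominant principal direction.
    A = np.array([[3.0, 0.5],
                  [1.0, 0.2]])
    X = rng.standard_normal((5000, 2)) @ A.T

    eta = 0.01                          # small constant learning rate
    w = rng.standard_normal(2)          # random initial weight vector
    w /= np.linalg.norm(w)

    # Stochastic discrete-time (SDT) form: one sample per update.
    for x in X:
        y = w @ x                       # neuron output y(k) = w(k)^T x(k)
        w = w + eta * y * (x - y * w)   # Oja's rule

    # The learned w should align (up to sign) with the dominant eigenvector
    # of the sample covariance matrix C.
    C = X.T @ X / len(X)
    eigvals, eigvecs = np.linalg.eigh(C)
    print("learned direction:   ", w / np.linalg.norm(w))
    print("dominant eigenvector:", eigvecs[:, -1])

With a small constant learning rate, the weight vector converges (up to sign) to the dominant eigenvector of C, which is the kind of convergence behavior the book establishes rigorously via invariant sets in Chapter 2.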

1 Introduction  1(12)
1.1 Introduction  1(7)
1.1.1 Linear Neural Networks  1(2)
1.1.2 Subspace Learning Algorithms  3(1)
1.1.2.1 PCA Learning Algorithms  3(2)
1.1.2.2 MCA Learning Algorithms  5(1)
1.1.2.3 ICA Learning Algorithms  6(1)
1.1.3 Outline of This Book  7(1)
1.2 Methods for Convergence Analysis  8(2)
1.2.1 DCT Method  8(1)
1.2.2 DDT Method  9(1)
1.3 Relationship between SDT Algorithm and DDT Algorithm  10(1)
1.4 Some Notations and Preliminaries  11(1)
1.4.1 Covariance Matrix  11(1)
1.4.2 Simulation Data  11(1)
1.5 Conclusion  12(1)
2 PCA Learning Algorithms with Constant Learning Rates  13(38)
2.1 Introduction  13(1)
2.2 Preliminaries  14(1)
2.3 Oja's Algorithm  15(18)
2.3.1 Oja's Algorithm and the DCT Method  16(1)
2.3.2 DDT Formulation  16(1)
2.3.3 Invariant Sets and Convergence Results  17(10)
2.3.4 Simulation and Discussions  27(1)
2.3.4.1 Illustration of Invariant Sets  27(1)
2.3.4.2 Selection of Initial Vectors  28(1)
2.3.4.3 Selection of Learning Rate  29(4)
2.3.5 Conclusion  33(1)
2.4 Xu's LMSER Learning Algorithm  33(14)
2.4.1 Formulation and Preliminaries  33(1)
2.4.2 Invariant Set and Ultimate Bound  34(3)
2.4.3 Convergence Analysis  37(8)
2.4.4 Simulations and Discussions  45(2)
2.4.5 Conclusion  47(1)
2.5 Comparison of Oja's Algorithm and Xu's Algorithm  47(2)
2.6 Conclusion  49(2)
3 PCA Learning Algorithms with Adaptive Learning Rates  51(24)
3.1 Introduction  51(1)
3.2 Adaptive Learning Rate  51(2)
3.3 Oja's PCA Algorithms with Adaptive Learning Rate  53(2)
3.4 Convergence Analysis of Oja's Algorithms with Adaptive Learning Rate  55(10)
3.4.1 Boundedness  55(3)
3.4.2 Global Convergence  58(7)
3.5 Simulation and Discussion  65(9)
3.6 Conclusion  74(1)
4 GHA PCA Learning Algorithm  75(30)
4.1 Introduction  75(1)
4.2 Problem Formulation and Preliminaries  75(3)
4.3 Convergence Analysis  78(17)
4.3.1 Outline of Proof  78(1)
4.3.2 Case 1: i = 1  79(5)
4.3.3 Case 2: 1 < i ≤ p  84(11)
4.4 Simulation and Discussion  95(8)
4.4.1 Example 1  95(1)
4.4.2 Example 2  95(3)
4.4.3 Example 3  98(5)
4.5 Conclusion  103(2)
5 MCA Learning Algorithms  105(18)
5.1 Introduction  105(1)
5.2 A Stable MCA Algorithm  106(1)
5.3 Dynamical Analysis  107(11)
5.4 Simulation Results  118(3)
5.5 Conclusion  121(2)
6 ICA Learning Algorithms  123(30)
6.1 Introduction  123(1)
6.2 Preliminaries and Hyvarinen-Oja's Algorithms  124(2)
6.3 Convergence Analysis  126(12)
6.3.1 Invariant Sets  127(3)
6.3.2 DDT Algorithms and Local Convergence  130(8)
6.4 Extension of the DDT Algorithms  138(1)
6.5 Simulation and Discussion  139(13)
6.5.1 Example 1  139(3)
6.5.2 Example 2  142(1)
6.5.3 Example 3  142(10)
6.6 Conclusion  152(1)
7 Chaotic Behaviors Arising from Learning Algorithms  153(16)
7.1 Introduction  153(1)
7.2 Invariant Set and Divergence  154(7)
7.3 Stability Analysis  161(2)
7.4 Chaotic Behavior  163(4)
7.5 Conclusion  167(2)
8 Determination of the Number of Principal Directions in a Biologically Plausible PCA Model  169(16)
8.1 Introduction  169(1)
8.2 The PCA Model and Algorithm  170(4)
8.2.1 The PCA Model  170(2)
8.2.2 Algorithm Implementation  172(2)
8.3 Properties  174(3)
8.4 Simulations  177(7)
8.4.1 Example 1  177(1)
8.4.2 Example 2  178(6)
8.5 Conclusion  184(1)
9 Multi-Block-Based MCA for Nonlinear Surface Fitting  185(12)
9.1 Introduction  185(1)
9.2 MCA Method  186(2)
9.2.1 Matrix Algebraic Approaches  186(1)
9.2.2 Improved Oja+'s MCA Neural Network  186(1)
9.2.3 MCA Neural Network for Nonlinear Surface Fitting  187(1)
9.3 Multi-Block-Based MCA  188(1)
9.4 Simulation Results  189(6)
9.4.1 Simulation 1: Ellipsoid  189(3)
9.4.2 Simulation 2: Saddle  192(3)
9.5 Conclusion  195(2)
10 An ICA Algorithm for Extracting Fetal Electrocardiogram  197(8)
10.1 Introduction  197(1)
10.2 Problem Formulation  198(1)
10.3 The Proposed Algorithm  199(1)
10.4 Simulation  200(3)
10.5 Conclusion  203(2)
11 Rigid Medical Image Registration Using PCA Neural Network  205(8)
11.1 Introduction  205(1)
11.2 Method  206(4)
11.2.1 Feature Extraction  206(1)
11.2.2 Computing the Rotation Angle  207(2)
11.2.3 Computing Translations  209(1)
11.3 Simulations  210(1)
11.3.1 MR-MR Registration  210(1)
11.3.2 CT-MR Registration  210(1)
11.4 Conclusion  211(2)
Bibliography 213
Jian Cheng Lv and Zhang Yi are affiliated with the Machine Intelligence Lab of the College of Computer Science at Sichuan University. Jiliu Zhou is affiliated with the College of Computer Science at Sichuan University.