
Robust Recognition via Information Theoretic Learning 2014 ed. [Paperback]

  • Format: Paperback / softback, XI + 110 pages, 29 illustrations (25 in color, 4 in black and white), height x width: 235x155 mm, weight: 454 g
  • Series: SpringerBriefs in Computer Science
  • Publication date: 09-Sep-2014
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3319074156
  • ISBN-13: 9783319074153
  • Paperback
  • Price: 46,91 €*
  • * this is the final price, i.e., no additional discounts apply
  • Standard price: 55,19 €
  • Save 15%
This SpringerBrief is a comprehensive review of information theoretic methods for robust recognition. A wide variety of information theoretic methods have been proposed over the past decade, across a large range of computer vision applications; this work brings them together and presents the theory, optimization, and usage of information entropy. The authors adopt a new information theoretic concept, correntropy, as a robust measure and apply it to robust face recognition and object recognition problems. For computational efficiency, the brief introduces the additive and multiplicative forms of half-quadratic optimization to efficiently minimize entropy-based objectives, and a two-stage sparse representation framework for large-scale recognition problems. It also describes the strengths and deficiencies of different robust measures in solving robust recognition problems.
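As a rough illustration of why correntropy works as a robust measure (a minimal sketch; the vectors and the kernel width `sigma` below are illustrative choices, not taken from the book): empirical correntropy averages a Gaussian kernel over elementwise differences, so a single gross outlier saturates its kernel term instead of dominating the objective, unlike squared error.

```python
import math

def correntropy(x, y, sigma=1.0):
    # Empirical correntropy: average Gaussian kernel of the elementwise
    # errors. A gross outlier drives its kernel term toward 0, so it
    # cannot dominate the estimate.
    return sum(math.exp(-(a - b) ** 2 / (2.0 * sigma ** 2))
               for a, b in zip(x, y)) / len(x)

def mse(x, y):
    # Mean squared error for comparison: one outlier dominates the sum.
    return sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)

target  = [0.0, 0.0, 0.0, 0.0]
clean   = [0.05, 0.10, -0.10, 0.00]
corrupt = [0.05, 0.10, -0.10, 50.0]   # one grossly corrupted entry

print(mse(clean, target), mse(corrupt, target))                  # MSE blows up
print(correntropy(clean, target), correntropy(corrupt, target))  # degrades gracefully
```

With the single corrupted entry, the MSE jumps by several orders of magnitude while correntropy only drops by roughly the weight of one sample, which is the intuition behind its use in robust face and object recognition.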

Introduction.- M-estimators and Half-quadratic Minimization.- Information Measures.- Correntropy and Linear Representation.- l1 Regularized Correntropy.- Correntropy with Nonnegative Constraint.
1 Introduction  1
1.1 Outline  2
2 M-Estimators and Half-Quadratic Minimization  3
2.1 M-Estimation  3
2.2 Half-Quadratic Minimization  4
2.2.1 Iterative Minimization  5
2.2.2 The Additive and Multiplicative Forms  6
2.3 Phase Transition Diagrams  11
2.4 Summary  11
3 Information Measures  13
3.1 Shannon Entropy  13
3.2 Renyi's Quadratic Entropy  18
3.2.1 Robust PCA  18
3.2.2 Robust Discriminant Analysis  24
3.3 Normalized Information Measures and Classification Evaluations  30
3.3.1 Confusion Matrix in Abstaining Classifications  32
3.3.2 Meta-measures in Classification Evaluations  33
3.3.3 Normalized Information (NI) Measures  34
3.4 Summary  44
4 Correntropy and Linear Representation  45
4.1 Correntropy  45
4.1.1 Properties of Correntropy  46
4.1.2 Correntropy Minimization  47
4.2 Correntropy Induced Metric (CIM)  48
4.2.1 CIM and l0-Norm  49
4.2.2 CIM and M-Estimation  50
4.3 Linear Representation  51
4.3.1 Linear Least Squares  51
4.3.2 Linear Representation Classification  52
4.4 Robust Linear Representation via Correntropy  54
4.4.1 l1 Estimator and Huber M-Estimator  55
4.4.2 Parameter Selection  57
4.4.3 Stability of Linear Representation Methods  58
4.5 Summary  60
5 l1 Regularized Correntropy  61
5.1 Sparse Signal Reconstruction  61
5.1.1 l1 Minimization  61
5.1.2 l1-Minimization via Half-Quadratic Optimization  62
5.1.3 Numerical Results  64
5.2 Robust Sparse Representation  67
5.2.1 Error Correction  67
5.2.2 Error Detection  67
5.3 Robust Sparse Representation via Correntropy  68
5.3.1 Error Correction  69
5.3.2 Error Detection  71
5.3.3 An Active Set Algorithm  73
5.4 Numerical Results  73
5.4.1 Sparse Representation Algorithms  74
5.4.2 Phase Transition Diagrams  78
5.4.3 Sunglasses Disguise  79
5.4.4 Parameter Setting, Sparsity, and Robustness  80
5.5 Summary  83
6 Correntropy with Nonnegative Constraint  85
6.1 Nonnegative Sparse Coding  85
6.2 Robust Nonnegative Sparse Representation  87
6.3 Two-Stage Recognition for Large-Scale Problems  90
6.3.1 Outlier Detection via Correntropy  91
6.3.2 Efficient Nonnegative Sparse Coding  94
6.3.3 Two-Stage Sparse Representation  95
6.4 Numerical Results  97
6.4.1 Sunglasses Disguise  98
6.4.2 Scarf Occlusion  100
6.4.3 Large-Scale Problems  101
6.5 Summary  102
References  103