E-book: Advances in Graph Neural Networks

  • Format: EPUB+DRM
  • Price: €65.42*
  • * This is the final price, i.e., no additional discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed
  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you must install free software to unlock and read it. To read this e-book you need to create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by one user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

This book provides a comprehensive introduction to the foundations and frontiers of graph neural networks. It introduces the basic concepts and definitions of graph representation learning and traces the development of advanced graph representation learning methods, with a focus on graph neural networks. The book gives researchers and practitioners an understanding of the fundamental issues as well as a launch point for exploring the latest trends in the field. The authors emphasize several frontier aspects of graph neural networks and use graph data to describe pairwise relations in real-world data from many domains, including social science, chemistry, and biology. Several frontiers of graph neural networks are introduced, enabling readers to acquire the relevant techniques through theoretical models and real-world applications.
1 Introduction  1(12)
1.1 Basic Concepts  1(7)
1.1.1 Graph Definitions and Properties  1(2)
1.1.2 Complex Graphs  3(3)
1.1.3 Computational Tasks on Graphs  6(2)
1.2 Development of Graph Neural Network  8(2)
1.2.1 History of Graph Representation Learning  8(1)
1.2.2 Frontier of Graph Neural Networks  9(1)
1.3 Organization of the Book  10(3)
2 Fundamental Graph Neural Networks  13(14)
2.1 Introduction  13(1)
2.2 Graph Convolutional Network  14(3)
2.2.1 Overview  14(1)
2.2.2 The GCN Method  15(2)
2.3 Inductive Graph Convolution Network  17(3)
2.3.1 Overview  17(1)
2.3.2 The GraphSAGE Method  18(2)
2.4 Graph Attention Network  20(3)
2.4.1 Overview  21(1)
2.4.2 The GAT Method  21(2)
2.5 Heterogeneous Graph Attention Network  23(4)
2.5.1 Overview  23(1)
2.5.2 The HAN Method  24(3)
3 Homogeneous Graph Neural Networks  27(34)
3.1 Introduction  27(1)
3.2 Adaptive Multi-channel Graph Convolutional Networks  28(8)
3.2.1 Overview  28(1)
3.2.2 Investigation  29(1)
3.2.3 The AM-GCN Method  30(4)
3.2.4 Experiments  34(2)
3.3 Beyond Low-Frequency Information in Graph Convolutional Networks  36(6)
3.3.1 Overview  36(1)
3.3.2 Investigation  37(1)
3.3.3 The FAGCN Method  38(2)
3.3.4 Experiments  40(2)
3.4 Graph Structure Estimation Neural Networks  42(8)
3.4.1 Overview  42(1)
3.4.2 The GEN Method  43(6)
3.4.3 Experiments  49(1)
3.5 Interpreting and Unifying GNNs with An Optimization Framework  50(8)
3.5.1 Overview  50(1)
3.5.2 Preliminary  51(2)
3.5.3 The GNN-LF/HF Method  53(3)
3.5.4 Experiments  56(2)
3.6 Conclusion  58(1)
3.7 Further Reading  58(3)
4 Heterogeneous Graph Neural Networks  61(26)
4.1 Introduction  61(1)
4.2 Heterogeneous Graph Propagation Network  62(7)
4.2.1 Overview  62(1)
4.2.2 The HPN Method  63(4)
4.2.3 Experiments  67(2)
4.3 Heterogeneous Graph Neural Network with Distance Encoding  69(7)
4.3.1 Overview  69(1)
4.3.2 The DHN Method  70(3)
4.3.3 Experiments  73(3)
4.4 Self-supervised HGNN with Co-contrastive Learning  76(8)
4.4.1 Overview  76(1)
4.4.2 The HeCo Method  76(5)
4.4.3 Experiments  81(3)
4.5 Conclusion  84(1)
4.6 Further Reading  85(2)
5 Dynamic Graph Neural Networks  87(22)
5.1 Introduction  87(1)
5.2 Micro- and Macro-dynamics  88(6)
5.2.1 Overview  88(1)
5.2.2 The M2DNE Method  89(3)
5.2.3 Experiments  92(2)
5.3 Heterogeneous Hawkes Process  94(6)
5.3.1 Overview  94(1)
5.3.2 The HPGE Method  95(3)
5.3.3 Experiments  98(2)
5.4 Dynamic Meta-Path  100(7)
5.4.1 Overview  100(1)
5.4.2 The DyMGNN Method  101(3)
5.4.3 Experiments  104(3)
5.5 Conclusion  107(1)
5.6 Further Reading  107(2)
6 Hyperbolic Graph Neural Networks  109(22)
6.1 Introduction  109(1)
6.2 Hyperbolic Graph Attention Network  110(6)
6.2.1 Overview  110(1)
6.2.2 The HAT Method  111(3)
6.2.3 Experiments  114(2)
6.3 Lorentzian Graph Convolutional Network  116(6)
6.3.1 Overview  116(1)
6.3.2 The LGCN Method  117(3)
6.3.3 Experiments  120(2)
6.4 Hyperbolic Heterogeneous Graph Representation  122(6)
6.4.1 Overview  122(2)
6.4.2 The HHNE Method  124(2)
6.4.3 Experiments  126(2)
6.5 Conclusion  128(1)
6.6 Further Reading  129(2)
7 Distilling Graph Neural Networks  131(22)
7.1 Introduction  131(1)
7.2 Prior-Enhanced Knowledge Distillation for GNNs  132(5)
7.2.1 Overview  132(1)
7.2.2 The CPF Method  132(4)
7.2.3 Experiments  136(1)
7.3 Temperature-Adaptive Knowledge Distillation for GNNs  137(7)
7.3.1 Overview  137(2)
7.3.2 The LTD Method  139(3)
7.3.3 Experiments  142(2)
7.4 Data-Free Adversarial Knowledge Distillation for GNNs  144(6)
7.4.1 Overview  144(1)
7.4.2 The DFAD-GNN Method  144(3)
7.4.3 Experiments  147(3)
7.5 Conclusion  150(1)
7.6 Further Reading  151(2)
8 Platforms and Practice of Graph Neural Networks  153(26)
8.1 Introduction  153(1)
8.2 Foundation  154(11)
8.2.1 Deep Learning Platforms  154(5)
8.2.2 Platforms of Graph Neural Networks  159(3)
8.2.3 GammaGL  162(3)
8.3 Practice of Graph Neural Networks on GammaGL  165(12)
8.3.1 Create Your Own Graph  166(1)
8.3.2 Create Message-Passing Network  167(1)
8.3.3 Advanced Mini-Batching  168(1)
8.3.4 Practice of GIN  169(2)
8.3.5 Practice of GraphSAGE  171(3)
8.3.6 Practice of HAN  174(3)
8.4 Conclusion  177(2)
9 Future Direction and Conclusion  179(6)
9.1 Future Direction  179(4)
9.1.1 Self-supervised Learning on Graphs  179(1)
9.1.2 Robustness  180(1)
9.1.3 Explainability  180(1)
9.1.4 Fairness  181(1)
9.1.5 Biochemistry  182(1)
9.1.6 Physics  182(1)
9.2 Conclusion  183(2)
References 185
Chuan Shi, Ph.D., is a Professor and Deputy Director of the Beijing Key Lab of Intelligent Telecommunications Software and Multimedia at the Beijing University of Posts and Telecommunications. He received his B.S. from Jilin University in 2001, his M.S. from Wuhan University in 2004, and his Ph.D. from the ICT of the Chinese Academy of Sciences in 2007. His research interests include data mining, machine learning, and evolutionary computing. He has published more than 100 papers in refereed journals and conferences.

Xiao Wang, Ph.D., is an Associate Professor in the School of Computer Science at the Beijing University of Posts and Telecommunications. He received his Ph.D. from the School of Computer Science and Technology at Tianjin University in 2016 and was a postdoctoral researcher in the Department of Computer Science and Technology at Tsinghua University. His current research interests include data mining, social network analysis, and machine learning. He has published more than 70 papers in refereed journals and conferences.

Cheng Yang, Ph.D., is an Associate Professor at the Beijing University of Posts and Telecommunications. He received his B.E. and Ph.D. from Tsinghua University in 2014 and 2019, respectively. His research interests include natural language processing and network representation learning. He has published more than 20 top-level papers in international journals and conferences, including ACM TOIS, EMNLP, IJCAI, and AAAI.