Preface  vii

1.1 Intelligent Multimedia Database  1

2.1.1 Structured Query Language SQL  15
2.1.2 Symbolical artificial intelligence and relational databases  16
2.2.2 Graphics and digital images  21
2.2.3 Digital audio and video  23
2.2.5 Multimedia extender  27
2.3 Content-Based Multimedia Retrieval  28
2.3.1 Semantic gap and metadata  31

3.1.1 Continuous Fourier transform  35
3.1.2 Discrete Fourier transform  37
3.1.3 Fast Fourier transform  40
3.1.4 Discrete cosine transform  43
3.1.5 Two dimensional transform  45
3.2.1 Short-term Fourier transform  53
3.2.2 Continuous wavelet transform  57
3.2.3 Discrete wavelet transform  61
3.2.4 Fast wavelet transform  70
3.2.5 Discrete wavelet transform and images  76
3.3 The Karhunen-Loeve Transform  78
3.3.1 The covariance matrix  79
3.3.2 The Karhunen-Loeve transform  83
3.3.3 Principal component analysis  84

4.1.3 Statistical encoding  93
4.2.2 Digital audio signal  99

5.1.4 Measurement of angle  111
5.1.5 Information and contour  112
5.5 Recognition by Components  123
5.6.1 Formant frequencies  125
5.8.1 Dynamic time warping  131
5.8.2 Dynamic programming  132

6 Low Dimensional Indexing  133
6.1 Hierarchical Structures  133
6.1.1 Example of a taxonomy  133
6.1.2 Origins of hierarchical structures  134
6.2.2 Decoupled search tree  139
6.3.4 High-dimensional space  153
6.4.3 Fractals and the Hausdorff dimension  167

7.1 Curse of Dimensionality  171
7.2 Approximate Nearest Neighbor  173
7.3 Locality-Sensitive Hashing  173
7.3.1 Binary locality-sensitive hashing  174
7.3.2 Projection-based LSH  176
7.3.3 Query complexity LSH  176
7.4 Johnson-Lindenstrauss Lemma  177

8 High Dimensional Indexing  181
8.2.1 1-Lipschitz property  183
8.2.2 Lower bounding approach  185
8.2.3 Projection operators  188
8.2.4 Projection onto one-dimensional subspace  189
8.3.2 Content-based image retrieval by image pyramid  200
8.3.3 The first principal component  203

9 Dealing with Text Databases  215
9.2.1 Low-level tokenization  217
9.2.2 High-level tokenization  218
9.3.3 Vector representation  220
9.4.4 Probability ranking principle  226
9.4.5 Binary independence model  226
9.4.6 Stochastic language models  230
9.5.1 Learning and forgetting  232

10 Statistical Supervised Machine Learning  239
10.1 Statistical Machine Learning  239
10.1.1 Supervised learning  239
10.3.2 Stochastic gradient descent  248
10.3.3 Continuous activation functions  248
10.4 Networks with Hidden Nonlinear Layers  249
10.4.2 Radial basis function network  252
10.4.3 Why does a feed-forward network with hidden nonlinear units work?  254
10.6 Support Vector Machine  256
10.6.1 Linear support vector machine  256
10.7.1 Map transformation cascade  258
10.7.2 Relation between deep learning and subspace tree  263

11.1 Constrained Hierarchies  269
11.3.1 Multimodal fusion and images  271
11.3.2 Stochastic language model approach  271
11.3.3 Dempster-Shafer theory  272

12.1 Database Architecture  275
12.1.1 Client-server system  275
12.2.1 Divide and conquer  276
12.3.1 Precision and recall  279

13 Multimedia Databases in Medicine  281
13.1.1 Health Level Seven  281
13.2 Electronic Health Record  282

Bibliography  291
Index  301