Translation, Brains and the Computer: A Neurolinguistic Solution to Ambiguity and Complexity in Machine Translation 2018 ed. [Hardback]

  • Format: Hardback, XVI + 241 pages, 55 black-and-white illustrations, height x width: 235x155 mm, weight: 559 g
  • Series: Machine Translation: Technologies and Applications 2
  • Publication date: 15-Jun-2018
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3319766287
  • ISBN-13: 9783319766287
  • Hardback
  • Price: 127,23 €*
  • * This is the final price, i.e., no additional discounts apply
  • List price: 149,69 €
  • Save 15%
  • Delivery takes 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.

This book is about machine translation (MT) and the classic problems associated with this language technology. It examines the causes of these problems and, for linguistic, rule-based systems, attributes them to language's ambiguity and complexity and their interplay in logic-driven processes. For non-linguistic, data-driven systems, it attributes translation shortcomings to the very absence of linguistics. It then proposes a demonstrable way to remedy these drawbacks in the form of a working translation model (Logos Model) that takes its inspiration from key assumptions about psycholinguistic and neurolinguistic function. The book suggests that this brain-based mechanism is effective precisely because it bridges linguistically driven and data-driven methodologies. It shows how simulation of this cerebral mechanism has freed this one MT model from the all-important, classic problem of complexity when coping with the ambiguities of language. Logos Model accomplishes this by a data-driven process that does not sacrifice linguistic knowledge but, like the brain, integrates linguistics within the data-driven process. As a consequence, the book suggests that the brain-like mechanism embedded in this model has the potential to contribute to further advances in machine translation in all its technological instantiations.
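The hybrid architecture the description alludes to, linguistic rules embedded inside a data-driven decision process, can be made concrete with a small illustration. The following Python sketch is purely hypothetical (it is not the book's actual Logos Model; the tag names, lexicon, and counts are all invented for illustration): a rule-based lexicon restricts each word to a few coarse semantico-syntactic tags, and corpus-style bigram counts then select among the analyses the rules allow.

from collections import defaultdict
from itertools import product

# Rule-based linguistic layer: each word maps to one or more coarse
# semantico-syntactic tags (hypothetical labels, loosely SAL-flavoured).
TAG_RULES = {
    "bank":    ["N-PLACE", "N-INSTITUTION"],  # ambiguous noun
    "deposit": ["V-TRANSFER", "N-MASS"],      # verb/noun ambiguity
    "money":   ["N-MASS"],
}

# Data-driven layer: toy counts over tag bigrams; in a real system these
# would be estimated from a training corpus. Unseen bigrams default to 1.
BIGRAM_COUNTS = defaultdict(lambda: 1, {
    ("N-INSTITUTION", "V-TRANSFER"): 12,
    ("V-TRANSFER", "N-MASS"): 9,
    ("N-PLACE", "N-MASS"): 2,
})

def best_tagging(words):
    """Return the highest-scoring tag sequence for the input words.

    The rules constrain the candidate space (linguistics); the counts
    decide among the survivors (data). Brute force suffices for a toy
    sentence; a real system would need an efficient search.
    """
    candidates = product(*(TAG_RULES[w] for w in words))
    def score(tags):
        return sum(BIGRAM_COUNTS[pair] for pair in zip(tags, tags[1:]))
    return max(candidates, key=score)

if __name__ == "__main__":
    print(best_tagging(["bank", "deposit", "money"]))
    # -> ('N-INSTITUTION', 'V-TRANSFER', 'N-MASS')

Run as a script, this prints ('N-INSTITUTION', 'V-TRANSFER', 'N-MASS') for the toy input: the rules bound the search space, keeping complexity in check, while the data arbitrates among the readings the rules permit.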

Reviews

Natural language processing is one of the most rapidly evolving areas of artificial intelligence, and is the subject of this excellent book. One of the important contributions of this valuable resource is its presentation and comparison of many current state-of-the-art machine translation systems available to the general public. Summing Up: Recommended. Advanced undergraduates through faculty and professionals. (J. Brzezinski, Choice, Vol. 56 (6), February, 2019)

Table of Contents
Part I

1 Introduction
  References
2 Background
  2.1 Logos Model Beginnings
  2.2 The Advent of Statistical MT
    2.2.1 Pattern-Based Processes in SMT and Logos Model
  2.3 Overview of Logos Model Translation Process
  2.4 Psycholinguistic and Neurolinguistic Assumptions
    2.4.1 First Assumption
    2.4.2 Second Assumption
    2.4.3 Third Assumption
  2.5 On Language and Grammar
    2.5.1 The Origin and Nature of Grammar
    2.5.2 Language, Grammar and Associative Memory
    2.5.3 In Principio Erat Verbum
  2.6 A Note About Neural MT (NMT)
  2.7 Conclusion
  Postscripts
    Postscript 2-A
    Postscript 2-B
    Postscript 2-C
  References
3 Language and Ambiguity: Psycholinguistic Perspectives
  3.1 Levels of Ambiguity
  3.2 Language Acquisition and Translation
    3.2.1 Linguistic Processes Involved in Second Language Acquisition
    3.2.2 On Learning French
  3.3 Psycholinguistic Bases of Language Skills
    3.3.1 Analytical Basis
    3.3.2 Empirical Basis
    3.3.3 Analogical Basis
  3.4 Practical Implications for MT
    3.4.1 Semantico-Syntactic Solutions to the Problem of Ambiguity in MT
  3.5 Psycholinguistics in a Machine
    3.5.1 Generate Target Translation
    3.5.2 OpenLogos and Logos Model
  3.6 Conclusion
  References
4 Language and Complexity: Neurolinguistic Perspectives
  4.1 On Cognitive Complexity
  4.2 A Role for Semantic Abstraction and Generalization
  4.3 Connectionism and Brain Simulation
  4.4 Logos Model As a Neural Network
  4.5 Language Processing in the Brain
    4.5.1 Cortical Circuits and Logos Model
    4.5.2 Hippocampus and Logos Model
  4.6 MT Performance and Underlying Competence
  4.7 Conclusion
  Postscripts
    Postscript 4-A
    Postscript 4-B
    Postscript 4-C
    Postscript 4-D
    Postscript 4-E
    Postscript 4-F
    Postscript 4-G
    Postscript 4-H
  References
5 Syntax and Semantics: Dichotomy Versus Integration
  5.1 Syntax Versus Semantics: Is There a Third, Semantico-Syntactic Perspective?
  5.2 Recent Views of the Cerebral Process
  5.3 Syntax and Semantics: How Do They Relate?
  5.4 Conclusion
  Postscripts
    Postscript 5-A
    Postscript 5-B
    Postscript 5-C
    Postscript 5-D
    Postscript 5-E
    Postscript 5-F
    Postscript 5-G
  References
6 Logos Model: Design and Performance
  6.1 The Translation Problem
    6.1.1 Five Fundamental Design Decisions
  6.2 How Do You Represent Natural Language?
    6.2.1 Effectiveness of SAL for Deterministic Parsing
  6.3 How Do You Store Linguistic Knowledge?
    6.3.1 General Remarks
    6.3.2 Logos Model Lexicon
    6.3.3 The Pattern-Rule Database
  6.4 How Do You Apply Stored Knowledge to the Input Stream?
    6.4.1 Modules RES1 and RES2 (R1 and R2)
    6.4.2 Module PARSE 1
    6.4.3 Module PARSE 2
    6.4.4 Module PARSE 3
    6.4.5 Module PARSE 4
  6.5 How Do You Effect Target Generation?
    6.5.1 Target Components
  6.6 How Do You Cope with Complexity?
    6.6.1 A Final Illustration
  6.7 Conclusion
  Postscripts
    Postscript 6-A
    Postscript 6-B
  References
7 Some Limits on Translation Quality
  7.1 First Example
  7.2 Second Example
  7.3 Other Translation Examples
  7.4 Balancing the Picture
  7.5 Conclusion
  References
8 Deep Learning MT and Logos Model
  8.1 Points of Similarity and Differences
  8.2 Deep Learning, Logos Model and the Brain
  8.3 On Learning
  8.4 The Hippocampus and Continual Learning
  8.5 Conclusion
  8.6 A Final Demonstration
  References

Part II

9 The SAL Representation Language
  9.1 Overview of SAL
  9.2 SAL Parts of Speech
    9.2.1 Open Classes (Table 9.1)
    9.2.2 Closed Classes (Table 9.2)
  9.3 SAL Nouns (WC 1)
    9.3.1 Aspective Nouns
    9.3.2 Concrete Nouns
    9.3.3 Animate Nouns
    9.3.4 Abstract Nouns
    9.3.5 Measure Nouns
    9.3.6 Place Nouns
    9.3.7 Mass Nouns
    9.3.8 Information and Time Nouns
  9.4 SAL Verbs (WC 2)
    9.4.1 The Intransitive-Transitive Verb Spectrum
    9.4.2 Intransitive Verbs
    9.4.3 Subjective Transitive Verbs
    9.4.4 Reciprocal Transitive Verbs
    9.4.5 Ditransitive Verbs
    9.4.6 Objective Transitive Verbs
    9.4.7 Pre-process Verbs
    9.4.8 Simple Preverbal Verbs
    9.4.9 Preverbal Complex Verbs
    9.4.10 Preverbal-Preclausal Verbs
    9.4.11 Preclausal Verbs
  9.5 SAL Adjectives (WC 4)
    9.5.1 Preclausal/Preverbal Adjectives
    9.5.2 Preverbal Adjectives
    9.5.3 Adverbial Adjectives
    9.5.4 Non-adverbial Adjectives
  9.6 SAL Adverbs (WC 3 and WC 6)
    9.6.1 Locative Adverbs Have the Following Supersets
    9.6.2 Non-locative Adverbs Have the Following Supersets
  Postscript
    Postscript 9-A