
Neural Information Processing: 25th International Conference, ICONIP 2018, Siem Reap, Cambodia, December 13-16, 2018, Proceedings, Part II, 2018 ed. [Paperback]

  • Format: Paperback / softback, XXII + 735 pages, height x width: 235x155 mm, weight: 1145 g, 264 illustrations (175 in color, 89 black and white)
  • Series: Lecture Notes in Computer Science 11302
  • Publication date: 18-Nov-2018
  • Publisher: Springer Nature Switzerland AG
  • ISBN-10: 3030041786
  • ISBN-13: 9783030041786
  • Paperback
  • Price: 46,91 €*
  • * This is the final price, i.e. no additional discounts apply
  • Standard price: 55,19 €
  • Save 15%
  • Delivery time is 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.

The seven-volume set LNCS 11301-11307 constitutes the proceedings of the 25th International Conference on Neural Information Processing, ICONIP 2018, held in Siem Reap, Cambodia, in December 2018.

The 401 full papers presented were carefully reviewed and selected from 575 submissions. The papers address the emerging topics of theoretical research, empirical studies, and applications of neural information processing techniques across different domains. The second volume, LNCS 11302, is organized in topical sections on other neural network models, stability analysis, optimization, and supervised learning.