
Ebook: Machine Learning Applications in Electronic Design Automation

Edited by Haoxing Ren and Jiang Hu
  • Format: PDF+DRM
  • Publication date: 01-Jan-2023
  • Publisher: Springer International Publishing AG
  • Language: English
  • ISBN-13: 9783031130748
  • Price: 118,37 €*
  • * this is the final price, i.e., no additional discounts apply
  • This ebook is intended for personal use only. Ebooks cannot be returned, and no refunds are issued for purchased ebooks.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you must install free software to unlock and read it. To read this ebook, you need to create an Adobe ID. More information here. The ebook can be read and downloaded on up to 6 devices (by one user with the same Adobe ID).

    Required software
    To read this ebook on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this ebook on a PC or Mac, you need Adobe Digital Editions (this is a free app designed specifically for ebooks. It is not the same as Adobe Reader, which you may already have on your computer.)

    You cannot read this ebook on an Amazon Kindle.

This book serves as a single-source reference to key machine learning (ML) applications and methods in digital and analog design and verification. Experts from academia and industry cover a wide range of the latest research on ML applications in electronic design automation (EDA), including analysis and optimization of digital design, analysis and optimization of analog design, as well as functional verification, FPGA and system-level designs, design for manufacturing (DFM), and design space exploration. The authors also cover key ML methods such as classical ML, deep learning models such as convolutional neural networks (CNNs), graph neural networks (GNNs), and generative adversarial networks (GANs), and optimization methods such as reinforcement learning (RL) and Bayesian optimization (BO). All of these topics are valuable to chip designers as well as EDA developers and researchers working in digital and analog design and verification.


Introduction
Analysis of Digital Design:
  • Routability Optimization for Industrial Designs at Sub-14nm Process Nodes Using Machine Learning
  • RouteNet: Routability Prediction for Mixed-size Designs Using Convolutional Neural Network
  • High Performance Graph Convolutional Networks with Applications in Testability Analysis
  • MAVIREC: ML-Aided Vectored IR-Drop Estimation and Classification
  • GRANNITE: Graph Neural Network Inference for Transferable Power Estimation
  • Machine Learning-Enabled High-Frequency Low-Power Digital Design Implementation at Advanced Process Nodes
Optimization of Digital Design:
  • Chip Placement with Deep Reinforcement Learning
  • DREAMPlace: Deep Learning Toolkit-Enabled GPU Acceleration for Modern VLSI Placement
  • TreeNet: Deep Point Cloud Embedding for Routing Tree Construction
  • Asynchronous Reinforcement Learning Framework for Net Order Exploration in Detailed Routing
  • Standard Cell Routing with Reinforcement Learning and Genetic Algorithm in Advanced Technology Nodes
  • PrefixRL: Optimization of Parallel Prefix Circuits Using Deep Reinforcement Learning
  • GAN-CTS: A Generative Adversarial Framework for Clock Tree Prediction and Optimization
Analysis and Optimization of Analog Design:
  • Machine Learning Techniques in Analog Layout Automation
  • Layout Symmetry Annotation for Analog Circuits with Graph Neural Networks
  • ParaGraph: Layout Parasitics and Device Parameter Prediction Using Graph Neural Network
  • GCN-RL Circuit Designer: Transferable Transistor Sizing with Graph Neural Networks and Reinforcement Learning
  • Parasitic-Aware Analog Circuit Sizing with Graph Neural Networks and Bayesian Optimization
Logic and Physical Verification:
  • Deep Predictive Coverage Collection / Dynamically Optimized Test Generation Using Machine Learning
  • Novelty-Driven Verification: Using Machine Learning to Identify Novel Stimuli and Close Coverage
  • Using Machine Learning Clustering to Find Large Coverage Holes
  • GAN-OPC: Mask Optimization with Lithography-Guided Generative Adversarial Nets
  • Layout Hotspot Detection with Feature Tensor Generation and Deep Biased Learning

Haoxing Ren (Mark) was born in Nanchang, China in 1976. He received two BS degrees, in Electrical Engineering and Finance, and an MS degree in Electrical Engineering from Shanghai Jiao Tong University, China, in 1996 and 1999, respectively; an MS in Computer Engineering from Rensselaer Polytechnic Institute in 2000; and a PhD in Computer Engineering from the University of Texas at Austin in 2006. From 2000 to 2015, he worked at IBM Microelectronics and the Thomas J. Watson Research Center (after 2006), developing physical design and logic synthesis tools and methodology for IBM microprocessor and ASIC designs. He received several IBM technical achievement awards, including the IBM Corporate Award for his work on improving microprocessor design productivity. After his 15-year tenure at IBM, he had a brief stint as a technical executive at a chip design start-up developing server-class CPUs based on IBM OpenPOWER technology. In 2016, Mark joined NVIDIA Research, where he currently leads the Design Automation research group, whose mission is to improve the quality and productivity of chip design through machine learning and GPU-accelerated tools. He has published many papers in the field of design automation, including several book chapters on logic synthesis and physical design. He also received best paper awards at the International Symposium on Physical Design (ISPD) in 2013, the Design Automation Conference (DAC) in 2019, and IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems in 2021.





Jiang Hu received the B.S. degree in optical engineering from Zhejiang University (China) in 1990, the M.S. degree in physics in 1997, and the Ph.D. degree in electrical engineering from the University of Minnesota in 2001. He worked with IBM Microelectronics from January 2001 to June 2002.





In 2002 he joined the electrical engineering faculty at Texas A&M University. His research interests include design automation of VLSI circuits and systems, computer architecture, hardware security, and machine learning applications. His honors include a best paper award at the ACM/IEEE Design Automation Conference in 2001, an IBM Invention Achievement Award in 2003, a best paper award at the IEEE/ACM International Conference on Computer-Aided Design in 2011, a best paper award at the IEEE International Conference on Vehicular Electronics and Safety in 2018, and a best paper award at the IEEE/ACM International Symposium on Microarchitecture in 2021. He has served as a technical program committee member for DAC, ICCAD, ISPD, ISQED, ICCD, DATE, ISCAS, ASP-DAC, and ISLPED. He was the general chair of the 2012 ACM International Symposium on Physical Design. He has served as an associate editor for IEEE Transactions on CAD and the ACM Transactions on Design Automation of Electronic Systems. He received the Humboldt Research Fellowship in 2012. He was named an IEEE Fellow in 2016.