Engineering Autonomous Vehicles and Robots: The DragonFly Modular-based Approach [Hardback]

  • Format: Hardback, 216 pages, height x width x depth: 249x172x18 mm, weight: 540 g
  • Series: Wiley - IEEE
  • Publication date: 26-Mar-2020
  • Publisher: Wiley-Blackwell
  • ISBN-10: 1119570565
  • ISBN-13: 9781119570561
  • Hardback
  • Price: 104,56 €*
  • * This is the final price, i.e., no additional discounts are applied.
  • Standard price: 123,02 €
  • Save 15%
  • Delivery time is 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.

Offers a step-by-step guide to building autonomous vehicles and robots, with source code and accompanying videos

The first book of its kind on the detailed steps for creating an autonomous vehicle or robot, this book provides an overview of the technology, introduces the key elements involved in developing autonomous vehicles, and offers newcomers an accessible entry point to both the basics of autonomous vehicles and the innovative, modular-based engineering approach called DragonFly.

Engineering Autonomous Vehicles and Robots: The DragonFly Modular-based Approach covers everything technical professionals need to know about CAN bus, chassis, sonar, radar, GNSS, computer vision, localization, perception, motion planning, and more. In particular, it covers computer vision for active perception and localization, as well as mapping and motion planning. The book offers several case studies on building an autonomous passenger pod, bus, and vending robot. It features a large amount of supplementary material, including the standard protocols and sample code for chassis, sonar, and radar; GPSD/NMEA protocols and GPS deployment methods are also provided. Most importantly, readers will learn the philosophy behind the DragonFly modular-based design approach, which empowers them to design and build their own autonomous vehicles and robots with flexibility and affordability.
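
To give a flavor of the kind of GPS protocol material the book points to, here is a minimal, illustrative sketch in Python (not taken from the book's supplementary code) that validates and parses an NMEA 0183 GGA sentence, the standard fix-data message a GNSS receiver emits. Field positions follow the public NMEA convention, and a well-formed sentence is assumed.

    from functools import reduce
    from operator import xor

    def nmea_checksum_ok(sentence: str) -> bool:
        # NMEA checksum: XOR of every character between '$' and '*',
        # compared against the two hex digits after '*'.
        body, _, checksum = sentence.strip().lstrip("$").partition("*")
        return f"{reduce(xor, (ord(c) for c in body), 0):02X}" == checksum.upper()

    def parse_gga(sentence: str) -> dict:
        # GGA fields: 2/3 = latitude ddmm.mmmm + hemisphere,
        # 4/5 = longitude dddmm.mmmm + hemisphere, 6 = fix quality, 7 = satellites.
        f = sentence.split(",")
        lat = int(f[2][:2]) + float(f[2][2:]) / 60.0
        lon = int(f[4][:3]) + float(f[4][3:]) / 60.0
        return {
            "lat": -lat if f[3] == "S" else lat,
            "lon": -lon if f[5] == "W" else lon,
            "fix_quality": int(f[6]),
            "satellites": int(f[7]),
        }

    sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
    if nmea_checksum_ok(sample):
        print(parse_gga(sample))  # lat ~48.1173 N, lon ~11.5167 E, 8 satellites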

  • Offers progressive guidance on building autonomous vehicles and robots
  • Provides detailed steps and code to create an autonomous machine at affordable cost and with a modular approach
  • Written by one of the pioneers in the field of building autonomous vehicles
  • Includes case studies, source code, and state-of-the-art research results
  • Accompanied by a website with supplementary material, including sample code for chassis, sonar, and radar; GPS deployment methods; and vision calibration methods

Engineering Autonomous Vehicles and Robots is an excellent book for students, researchers, and practitioners in the field of autonomous vehicles and robots.

Chapter 1: Affordable and Reliable Autonomous Driving through Modular Design
1. Introduction
2. High Cost of Autonomous Driving Technologies
  2.1 Sensing
  2.2 HD Map Creation and Maintenance
  2.3 Computing Systems
3. Achieving Affordability and Reliability
  3.1 Sensor Fusion
  3.2 Modular Design
  3.3 Extending Existing Digital Maps
4. Modular Design
  4.1 Communication System
  4.2 Chassis
  4.3 mmWave Radar and Sonar for Passive Perception
  4.4 GNSS for Localization
  4.5 Computer Vision for Active Perception and Localization
  4.6 Planning & Control
  4.7 Mapping
5. The Rest of the Book
6. Open Source Projects Used in this Book
References
Chapter 2: In-Vehicle Communication Systems
1. Introduction
2. CAN
3. FlexRay
  3.1 FlexRay Topology
  3.2 The FlexRay Communication Protocol
4. CANopen
  4.1 Object Dictionary
  4.2 Profile Family
  4.3 Data Transmission and Network Management
  4.4 Communication Models
  4.5 CANopenNode
References
Chapter 3: Chassis Technologies for Autonomous Robots and Vehicles
1. Introduction
2. Throttle-by-Wire
3. Brake-by-Wire
4. Steer-by-Wire
5. Open Source Car Control
  5.1 OSCC APIs
  5.2 Hardware
  5.3 Firmware
6. OpenCaret
  6.1 OSCC-Throttle
  6.2 OSCC-Brake
  6.3 OSCC-Steering
7. PerceptIn Chassis Software Adaptation Layer
References
Chapter 4: Passive Perception with Sonar and mmWave Radar
1. Introduction
2. The Fundamentals of mmWave Radar
  2.1 Range Measurement
  2.2 Velocity Measurement
  2.3 Angle Detection
3. mmWave Radar Deployment
4. Sonar Deployment
References
Chapter 5: Localization with RTK GNSS
1. Introduction
2. GNSS Technology Overview
3. Real-Time Kinematic (RTK) GNSS
4. RTK-GNSS NtripCaster Setup Steps
  4.1 Setup NtripCaster
  4.2 Start NtripCaster
5. Setting up NtripServer and NtripClient on Raspberry Pi
  5.1 Install the Raspberry Pi System
  5.2 Run RTKLIB-str2str on the Raspberry Pi
6. Setting up a Base Station and a GNSS Rover
  6.1 Base Station Hardware Setup
  6.2 Base Station Software Setup
  6.3 GNSS Rover Setup
7. FreeWave Radio Basic Configuration
References
Chapter 6: Computer Vision for Perception and Localization
1. Introduction
2. Building Computer Vision Hardware
  2.1 Seven Layers of Technologies
  2.2 Hardware Synchronization
  2.3 Computing
3. Calibration
  3.1 Intrinsic Parameters
  3.2 Extrinsic Parameters
  3.3 Kalibr
    3.3.1 Calibration Target
    3.3.2 Multiple Camera Calibration
    3.3.3 Camera IMU Calibration
    3.3.4 Multi-IMU and IMU Intrinsic Calibration
4. Localization with Computer Vision
  4.1 VSLAM Overview
  4.2 ORB-SLAM2
    4.2.1 Prerequisites
    4.2.2 Building ORB-SLAM2 Library
    4.2.3 Running Stereo Datasets
5. Perception with Computer Vision
  5.1 ELAS for Stereo Depth Perception
  5.2 Mask R-CNN for Object Instance Segmentation
6. The DragonFly Computer Vision Module
  6.1 DragonFly Localization Interface
  6.2 DragonFly Perception Interface
  6.3 DragonFly+
References
Chapter 7: Planning and Control
1. Introduction
2. Route Planning
  2.1 Weighted Directed Graph
  2.2 Dijkstra's Algorithm
  2.3 A* Algorithm
3. Behavioral Planning
  3.1 Markov Decision Process (MDP)
  3.2 Value Iteration Algorithm
  3.3 Partially Observable Markov Decision Process (POMDP)
  3.4 Solving POMDP
4. Motion Planning
  4.1 Rapidly-exploring Random Tree (RRT)
  4.2 RRT*
5. Feedback Control
  5.1 Proportional-Integral-Derivative (PID) Controller
  5.2 Model Predictive Control (MPC)
6. Iterative EM Planning System in Apollo
  6.1 Terminologies
    6.1.1 Path and Trajectory
    6.1.2 SL Coordinate System and Reference Line
    6.1.3 ST Graph
  6.2 Iterative EM Planning Algorithm
    6.2.1 Traffic Decider
    6.2.2 QP Path and QP Speed
7. PerceptIn's Planning and Control Framework
References
Chapter 8: Mapping
1. Introduction
2. Digital Maps
  2.1 Open Street Map (OSM)
    2.1.1 OSM Data Structures
    2.1.2 OSM Software Stack
  2.2 Java OpenStreetMap Editor (JOSM)
    2.2.1 Adding a Node or a Way
    2.2.2 Adding Tags
    2.2.3 Uploading to OSM
  2.3 Nominatim
    2.3.1 Nominatim Architecture
    2.3.2 Place Ranking in Nominatim
3. High-Definition (HD) Maps
  3.1 Characteristics of HD Maps
    3.1.1 High Precision
    3.1.2 Rich Geometric Information and Semantics
    3.1.3 Fresh Data
  3.2 Layers of HD Maps
    3.2.1 2D Orthographic Reflectivity Map
    3.2.2 Digital Elevation Model (DEM)
    3.2.3 Lane/Road Model
    3.2.4 Stationary Map
  3.3 HD Map Creation
    3.3.1 Data Collection
    3.3.2 Offline Generation of HD Maps
    3.3.3 Quality Control and Validation
    3.3.4 Update and Maintenance
    3.3.5 Problems of HD Maps
4. PerceptIn's π-Map
  4.1 Topological Map
  4.2 π-Map Creation
References
Chapter 9: Building DragonFly Pod and Bus
1. Introduction
2. Chassis Hardware Specifications
3. Sensor Configurations
4. Software Architecture
5. Mechanism
6. Data Structures
  6.1 Common Data Structures
  6.2 Chassis Data
  6.3 Localization Data
  6.4 Perception Data
  6.5 Planning Data
7. User Interface
References
Chapter 10: Enabling Commercial Autonomous Space Robotic Explorers
1. Introduction
2. Destination Mars
3. Mars Explorer Autonomy
  3.1 Localization
  3.2 Perception
  3.3 Path Planning
  3.4 The Curiosity Rover and Mars 2020 Explorer
4. Challenge: Onboard Computing Capability
5. Conclusion
References
Chapter 11: Edge Computing for Autonomous Vehicles
1. Introduction
2. Benchmarks
3. Computing System Architectures
4. Runtime
5. Middleware
6. Case Studies
References
Chapter 12: Innovations on the V2X Infrastructure
1. Introduction
2. Evolution of V2X Technology
3. Cooperative Autonomous Driving
4. Challenges
References
Chapter 13: Vehicular Edge Security
1. Introduction
2. Sensor Security
3. Operating System Security
4. Control System Security
5. V2X Security
6. Security for Edge Computing
References
SHAOSHAN LIU, PHD, is Founder and CEO of PerceptIn, a full-stack visual intelligence company aimed at delivering scalable, integrated hardware/software solutions for autonomous robotics systems. Liu holds a Ph.D. in Computer Engineering from the University of California, Irvine, and his research focuses on edge computing systems, robotics, and autonomous driving. He has over 40 publications and over 100 patents in autonomous systems. Liu is a Senior Member of the IEEE, an ACM Distinguished Speaker, an IEEE Computer Society Distinguished Visitor, and a co-founder of the IEEE Computer Society Special Technical Community on Autonomous Driving Technologies.