E-book: Engineering Autonomous Vehicles and Robots: The DragonFly Modular-based Approach

  • Format: PDF+DRM
  • Series: IEEE Press
  • Publication date: 02-Mar-2020
  • Publisher: Wiley-IEEE Press
  • Language: English
  • ISBN-13: 9781119570554
  • Price: 112,96 €*
  • * This is the final price, i.e., no additional discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software in order to unlock and read it. To read this e-book you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to six devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

Offers a step-by-step guide to building autonomous vehicles and robots, with source code and accompanying videos

The first book of its kind on the detailed steps for creating an autonomous vehicle or robot, this book provides an overview of the technology, introduces the key elements involved in developing autonomous vehicles, and offers an accessible entry point for readers new to the topic and to the innovative, modular-based engineering approach called DragonFly.

Engineering Autonomous Vehicles and Robots: The DragonFly Modular-based Approach covers everything that technical professionals need to know about: CAN bus, chassis, sonars, radars, GNSS, computer vision, localization, perception, motion planning, and more. In particular, it covers computer vision for active perception and localization, as well as mapping and motion planning. The book offers several case studies on the building of an autonomous passenger pod, bus, and vending robot. It features a large amount of supplementary material, including the standard protocols and sample code for chassis, sonar, and radar; the GPSD and NMEA protocols and GPS deployment methods are also covered. Most importantly, readers will learn the philosophy behind the DragonFly modular-based design approach, which empowers them to design and build their own autonomous vehicles and robots with flexibility and affordability.
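
To give a flavor of the protocol material mentioned above, here is a minimal, hypothetical sketch of parsing a standard NMEA 0183 GGA sentence in Python. It is not taken from the book or its companion site, and the helper names (nmea_checksum_ok, parse_gga) are illustrative only:

    import operator
    from functools import reduce

    def nmea_checksum_ok(sentence: str) -> bool:
        # Illustrative helper: XOR of all characters between '$' and '*'
        # must equal the two-digit hex checksum after '*'.
        body, _, checksum = sentence.lstrip("$").partition("*")
        return reduce(operator.xor, (ord(c) for c in body), 0) == int(checksum, 16)

    def parse_gga(sentence: str) -> dict:
        # GGA carries the essential fix data: UTC time, position, fix quality.
        fields = sentence.split("*")[0].split(",")
        lat_raw, lat_hem = fields[2], fields[3]
        lon_raw, lon_hem = fields[4], fields[5]
        # NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm;
        # convert the minutes part to decimal degrees.
        lat = int(lat_raw[:2]) + float(lat_raw[2:]) / 60.0
        lon = int(lon_raw[:3]) + float(lon_raw[3:]) / 60.0
        return {
            "utc_time": fields[1],
            "latitude": lat if lat_hem == "N" else -lat,
            "longitude": lon if lon_hem == "E" else -lon,
            "fix_quality": int(fields[6]),  # e.g. 1 = GPS fix, 4 = RTK fixed
            "num_satellites": int(fields[7]),
        }

    # Widely used textbook example sentence (48°07.038'N, 11°31.000'E).
    example = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
    if nmea_checksum_ok(example):
        print(parse_gga(example))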

  • Offers progressive guidance on building autonomous vehicles and robots
  • Provides detailed steps and codes to create an autonomous machine, at affordable cost, and with a modular approach
  • Written by one of the pioneers in the field of building autonomous vehicles
  • Includes case studies, source code, and state-of-the art research results
  • Accompanied by a website with supplementary material, including sample code for chassis/sonar/radar; GPS deployment methods; Vision Calibration methods

Engineering Autonomous Vehicles and Robots is an excellent book for students, researchers, and practitioners in the field of autonomous vehicles and robots.

Table of Contents

1 Affordable and Reliable Autonomous Driving Through Modular Design 1(12)
1.1 Introduction 1(1)
1.2 High Cost of Autonomous Driving Technologies 2(2)
1.2.1 Sensing 2(1)
1.2.2 HD Map Creation and Maintenance 3(1)
1.2.3 Computing Systems 3(1)
1.3 Achieving Affordability and Reliability 4(2)
1.3.1 Sensor Fusion 4(1)
1.3.2 Modular Design 5(1)
1.3.3 Extending Existing Digital Maps 5(1)
1.4 Modular Design 6(3)
1.4.1 Communication System 7(1)
1.4.2 Chassis 7(1)
1.4.3 mmWave Radar and Sonar for Passive Perception 8(1)
1.4.4 GNSS for Localization 8(1)
1.4.5 Computer Vision for Active Perception and Localization 8(1)
1.4.6 Planning and Control 8(1)
1.4.7 Mapping 9(1)
1.5 The Rest of the Book 9(1)
1.6 Open Source Projects Used in this Book 10(1)
References 11(2)
2 In-Vehicle Communication Systems 13(10)
2.1 Introduction 13(1)
2.2 CAN 13(3)
2.3 FlexRay 16(2)
2.3.1 FlexRay Topology 16(1)
2.3.2 The FlexRay Communication Protocol 17(1)
2.4 CANopen 18(4)
2.4.1 Object Dictionary 19(1)
2.4.2 Profile Family 19(1)
2.4.3 Data Transmission and Network Management 20(1)
2.4.4 Communication Models 21(1)
2.4.5 CANopenNode 21(1)
References 22(1)
3 Chassis Technologies for Autonomous Robots and Vehicles 23(12)
3.1 Introduction 23(1)
3.2 Throttle-by-Wire 23(2)
3.3 Brake-by-Wire 25(1)
3.4 Steer-by-Wire 25(1)
3.5 Open Source Car Control 26(3)
3.5.1 OSCC APIs 26(1)
3.5.2 Hardware 27(1)
3.5.3 Firmware 28(1)
3.6 OpenCaret 29(1)
3.6.1 OSCC Throttle 29(1)
3.6.2 OSCC Brake 29(1)
3.6.3 OSCC Steering 29(1)
3.7 PerceptIn Chassis Software Adaptation Layer 30(4)
References 34(1)
4 Passive Perception with Sonar and Millimeter Wave Radar 35(12)
4.1 Introduction 35(1)
4.2 The Fundamentals of mmWave Radar 35(3)
4.2.1 Range Measurement 36(1)
4.2.2 Velocity Measurement 37(1)
4.2.3 Angle Detection 38(1)
4.3 mmWave Radar Deployment 38(3)
4.4 Sonar Deployment 41(4)
References 45(2)
5 Localization with Real-Time Kinematic Global Navigation Satellite System 47(30)
5.1 Introduction 47(1)
5.2 GNSS Technology Overview 47(2)
5.3 RTK-GNSS 49(3)
5.4 RTK-GNSS NtripCaster Setup Steps 52(3)
5.4.1 Set up NtripCaster 52(2)
5.4.2 Start NtripCaster 54(1)
5.5 Setting Up NtripServer and NtripClient on Raspberry Pi 55(4)
5.5.1 Install the Raspberry Pi System 55(2)
5.5.2 Run RTKLIB-str2str on the Raspberry Pi 57(1)
5.5.2.1 Running NtripServer on the Base Station Side 57(1)
5.5.2.2 Running NtripClient on the GNSS Rover 58(1)
5.6 Setting Up a Base Station and a GNSS Rover 59(12)
5.6.1 Base Station Hardware Setup 59(1)
5.6.2 Base Station Software Setup 60(7)
5.6.3 GNSS Rover Setup 67(1)
5.6.3.1 Rover Hardware Setup 67(1)
5.6.3.2 Rover Software Setup 68(3)
5.7 FreeWave Radio Basic Configuration 71(4)
References 75(2)
6 Computer Vision for Perception and Localization 77(20)
6.1 Introduction 77(1)
6.2 Building Computer Vision Hardware 77(4)
6.2.1 Seven Layers of Technologies 78(2)
6.2.2 Hardware Synchronization 80(1)
6.2.3 Computing 80(1)
6.3 Calibration 81(4)
6.3.1 Intrinsic Parameters 81(1)
6.3.2 Extrinsic Parameters 82(1)
6.3.3 Kalibr 82(1)
6.3.3.1 Calibration Target 83(1)
6.3.3.2 Multiple Camera Calibration 83(1)
6.3.3.3 Camera IMU Calibration 84(1)
6.3.3.4 Multi-IMU and IMU Intrinsic Calibration 84(1)
6.4 Localization with Computer Vision 85(2)
6.4.1 VSLAM Overview 85(1)
6.4.2 ORB-SLAM2 86(1)
6.4.2.1 Prerequisites 86(1)
6.4.2.2 Building the ORB-SLAM2 Library 87(1)
6.4.2.3 Running Stereo Datasets 87(1)
6.5 Perception with Computer Vision 87(3)
6.5.1 ELAS for Stereo Depth Perception 88(1)
6.5.2 Mask R-CNN for Object Instance Segmentation 89(1)
6.6 The DragonFly Computer Vision Module 90(4)
6.6.1 DragonFly Localization Interface 90(2)
6.6.2 DragonFly Perception Interface 92(1)
6.6.3 DragonFly+ 93(1)
References 94(3)
7 Planning and Control 97(22)
7.1 Introduction 97(1)
7.2 Route Planning 97(3)
7.2.1 Weighted Directed Graph 98(1)
7.2.2 Dijkstra's Algorithm 99(1)
7.2.3 A* Algorithm 100(1)
7.3 Behavioral Planning 100(5)
7.3.1 Markov Decision Process 101(1)
7.3.2 Value Iteration Algorithm 102(1)
7.3.3 Partially Observable Markov Decision Process (POMDP) 103(1)
7.3.4 Solving POMDP 104(1)
7.4 Motion Planning 105(2)
7.4.1 Rapidly Exploring Random Tree 105(1)
7.4.2 RRT* 106(1)
7.5 Feedback Control 107(3)
7.5.1 Proportional-Integral-Derivative Controller 108(1)
7.5.2 Model Predictive Control 108(2)
7.6 Iterative EM Planning System in Apollo 110(6)
7.6.1 Terminologies 110(1)
7.6.1.1 Path and Trajectory 110(1)
7.6.1.2 SL Coordinate System and Reference Line 110(1)
7.6.1.3 ST Graph 111(1)
7.6.2 Iterative EM Planning Algorithm 112(1)
7.6.2.1 Traffic Decider 113(1)
7.6.2.2 QP Path and QP Speed 114(2)
7.7 PerceptIn's Planning and Control Framework 116(2)
References 118(1)
8 Mapping 119(16)
8.1 Introduction 119(1)
8.2 Digital Maps 119(6)
8.2.1 Open Street Map 120(1)
8.2.1.1 OSM Data Structures 120(1)
8.2.1.2 OSM Software Stack 121(1)
8.2.2 Java Open Street Map Editor 121(2)
8.2.2.1 Adding a Node or a Way 123(1)
8.2.2.2 Adding Tags 123(1)
8.2.2.3 Uploading to OSM 124(1)
8.2.3 Nominatim 124(1)
8.2.3.1 Nominatim Architecture 124(1)
8.2.3.2 Place Ranking in Nominatim 125(1)
8.3 High-Definition Maps 125(5)
8.3.1 Characteristics of HD Maps 126(1)
8.3.1.1 High Precision 126(1)
8.3.1.2 Rich Geometric Information and Semantics 126(1)
8.3.1.3 Fresh Data 126(1)
8.3.2 Layers of HD Maps 126(1)
8.3.2.1 2D Orthographic Reflectivity Map 127(1)
8.3.2.2 Digital Elevation Model 127(1)
8.3.2.3 Lane/Road Model 127(1)
8.3.2.4 Stationary Map 127(1)
8.3.3 HD Map Creation 127(1)
8.3.3.1 Data Collection 127(1)
8.3.3.2 Offline Generation of HD Maps 128(1)
8.3.3.2.1 Sensor Fusion and Pose Estimation 128(1)
8.3.3.2.2 Map Data Fusion and Data Processing 129(1)
8.3.3.2.3 3D Object Location Detection 129(1)
8.3.3.2.4 Semantics/Attributes Extraction 129(1)
8.3.3.3 Quality Control and Validation 129(1)
8.3.3.4 Update and Maintenance 129(1)
8.3.3.5 Problems of HD Maps 130(1)
8.4 PerceptIn's π-Map 130(3)
8.4.1 Topological Map 130(1)
8.4.2 π-Map Creation 131(2)
References 133(2)
9 Building the DragonFly Pod and Bus 135(26)
9.1 Introduction 135(1)
9.2 Chassis Hardware Specifications 135(1)
9.3 Sensor Configurations 136(2)
9.4 Software Architecture 138(4)
9.5 Mechanism 142(2)
9.6 Data Structures 144(14)
9.6.1 Common Data Structures 144(2)
9.6.2 Chassis Data 146(3)
9.6.3 Localization Data 149(1)
9.6.4 Perception Data 150(3)
9.6.5 Planning Data 153(5)
9.7 User Interface 158(2)
References 160(1)
10 Enabling Commercial Autonomous Space Robotic Explorers 161(10)
10.1 Introduction 161(1)
10.2 Destination Mars 162(1)
10.3 Mars Explorer Autonomy 163(5)
10.3.1 Localization 163(1)
10.3.2 Perception 164(1)
10.3.3 Path Planning 165(1)
10.3.4 The Curiosity Rover and Mars 2020 Explorer 165(3)
10.4 Challenge: Onboard Computing Capability 168(1)
10.5 Conclusion 169(1)
References 170(1)
11 Edge Computing for Autonomous Vehicles 171(12)
11.1 Introduction 171(1)
11.2 Benchmarks 172(1)
11.3 Computing System Architectures 173(2)
11.4 Runtime 175(2)
11.5 Middleware 177(1)
11.6 Case Studies 178(1)
References 179(4)
12 Innovations on the Vehicle-to-Everything Infrastructure 183(8)
12.1 Introduction 183(1)
12.2 Evolution of V2X Technology 183(3)
12.3 Cooperative Autonomous Driving 186(2)
12.4 Challenges 188(1)
References 189(2)
13 Vehicular Edge Security 191(8)
13.1 Introduction 191(1)
13.2 Sensor Security 191(1)
13.3 Operating System Security 192(1)
13.4 Control System Security 193(1)
13.5 V2X Security 193(1)
13.6 Security for Edge Computing 194(2)
References 196(3)
Index 199

SHAOSHAN LIU, PHD, is Founder and CEO of PerceptIn, a full-stack visual intelligence company aimed at making scalable hardware/software integrated solutions for autonomous robotics systems. Liu holds a PhD in Computer Engineering from the University of California, Irvine, and his research focuses on edge computing systems, robotics, and autonomous driving. Liu has over 40 publications and over 100 patents in autonomous systems. He is currently a Senior Member of the IEEE, an ACM Distinguished Speaker, an IEEE Computer Society Distinguished Visitor, and a co-founder of the IEEE Computer Society Special Technical Community on Autonomous Driving Technologies.