Foreword xvii
Preface xix
Acknowledgments xxi
About This Book xxiii
About The Authors xxviii
Author Online xxix
About The Cover Illustration xxx
Part 1 Getting Started 1

1 Meet the Oculus Rift 3
1.1 Why support the Rift? 4
  The call of virtual reality 4
  … 4
1.2 How is the Rift being used today? 5
1.3 Get to know the Rift hardware 9
  … 9
  … 14
  … 17
1.4 How the Rift works 17
  Using head tracking to change the point of view 20
  Rendering an immersive view 21
1.5 Setting up the Rift for development 26
1.6 Dealing with motion sickness 27
  … 29
Summary 29
Part 2 Using The Oculus C API 31

2 Creating your first Rift interactions 33
  … 34
  … 34
  … 35
  … 35
  … 36
  … 38
2.3 Getting input from the head tracker 39
  Reserving a pointer to the device manager and locating the headset 42
  … 43
  Reporting tracker data to the console 44
  … 44
  … 44
2.4 A framework for demo code: the GlfwApp base class 45
2.5 Rendering output to the display 47
  The constructor: accessing the Rift 50
  Creating the OpenGL window 51
  Rendering two rectangles, one for each eye 51
  … 52
Summary 53
|
3 Pulling data out of the Rift: working with the head tracker 55
3.1 … 56
  Enabling and resetting head tracking 56
  Receiving head tracker data 57
3.2 Receiving and applying the tracker data: an example 61
  Initial setup and binding 64
  … 65
  Applying the orientation to the rendered scene 65
3.3 Additional features: drift correction and prediction 66
  … 67
  … 67
  Using drift correction and prediction 70
Summary 71
|
4 Sending output to the Rift: working with the display 72
4.1 Targeting the Rift display 73
  Extended vs. Direct HMD mode 73
  Creating the OpenGL window: choosing the display mode 74
  Creating the OpenGL window: Extended Desktop mode 74
  Creating the OpenGL window: Direct HMD mode 77
  Full screen vs. windowed: extensions with glfwCreateWindow() 79
  Dispensing with the boilerplate 80
4.2 How the Rift display is different: why it matters to you 80
  Each eye sees a distinct half of the display panel 81
  How the lenses affect the view 83
4.3 Generating output for the Rift 85
4.4 Correcting for lens distortion 87
  The nature of the distortion 88
  SDK distortion correction support 90
  Example of distortion correction 90
Summary 98
|
5 Putting it all together: integrating head tracking and 3D rendering 100
5.1 … 102
5.2 Our sample scene in monoscopic 3D 104
5.3 … 106
  Verifying your scene by inspection 109
5.4 Rendering to the Rift 112
  Enhanced data for each eye 114
  … 116
  Setting up the SDK for distortion rendering 117
  The offscreen framebuffer targets 117
  The Oculus texture description 118
  Projection and modelview offset 120
  The Rift's rendering loop 121
5.5 … 121
  Implications of prediction 123
  Getting your matrices in order 123
Summary 124
|
6 Performance and quality 125
6.1 Understanding VR performance requirements 126
6.2 Detecting and preventing performance issues 127
6.3 Using timewarp: catching up to the user 129
  Using timewarp in your code 130
  … 130
  … 132
6.4 Advanced uses of timewarp 132
  When you're running early 132
  … 134
6.5 Dynamic framebuffer scaling 135
Summary 140
Part 3 Using Unity 141

7 Unity: creating applications that run on the Rift 143
7.1 Creating a basic Unity project for the Rift 145
  Use real-life scale for Rift scenes 145
  Creating an example scene 146
7.2 Importing the Oculus Unity 4 Integration package 147
7.3 Using the Oculus player controller prefab: getting a scene on the Rift, no scripting required 149
  Adding the OVRPlayerController prefab to your scene 149
  Doing a test run: the Unity editor workflow for Rift applications 150
  The OVRPlayerController prefab components 152
7.4 Using the Oculus stereo camera prefab: getting a scene on the Rift using your own character controller 153
  The OVRCameraRig prefab components 157
7.5 Using player data from the user's profile 160
  Ensuring the user has created a profile 160
7.6 Building your application as a full-screen standalone application 161
Summary 163

8 Unity: tailoring your application for the Rift 164
8.1 Creating a Rift-friendly UI 165
  Using the Unity GUI tools to create a UI 165
  … 171
8.2 Using Rift head tracking to interact with objects 171
  Setting up objects for detection 173
  Selecting and moving objects 174
  Using collision to put the selected object down 176
8.3 Easing the user into VR 178
  Knowing when the health and safety warning has been dismissed 178
  Re-centering the user's avatar 179
  … 180
8.4 Quality and performance considerations 180
  Measuring quality: looking at application frame rates 180
  … 181
  (Not) Mirroring to the display 183
  Using the Unity project quality settings 183
Summary 184
Part 4 The VR User Experience 185

9 UI design for VR 187
9.1 New UI paradigms for VR 189
  UI conventions that won't work in VR and why 190
  Can your world tell your story? 193
  Getting your user from the desktop to VR 198
  … 199
9.2 Designing 3D user interfaces 202
  … 203
  Guidelines for 3D scene and UI design 204
  The mouse is mightier than the sword 208
  Using the Rift as an input device 214
9.3 Animations and avatars 215
  Cockpits and torsos: context in the first person 216
  … 218
9.4 Tracking devices and gestural interfaces 220
  … 220
  … 224
Summary 227
|
10 Reducing motion sickness and discomfort 228
10.1 What does causing motion sickness and discomfort mean? 229
10.2 Strategies and guidelines for creating a comfortable VR environment 230
  Start with a solid foundation for your VR application 231
  Give your user a comfortable start 231
  The golden rule of VR comfort: the user is in control of the camera 232
  Rethink your camera work: new approaches for favorite techniques 233
  Make navigation as comfortable as possible: character movement and speed 237
  Design your world with VR constraints in mind 240
  Pay attention to ergonomics: eyestrain, neck strain, and fatigue 242
  Use sound to increase immersion and orient the user to action 245
  Don't forget your user: give the player the option of an avatar body 245
  Account for human variation 246
  Help your users help themselves 250
  Evaluate your content for use in the VR environment 250
  Experiment as much as possible 253
10.3 Testing your VR application for motion sickness potential 254
  Use standardized motion and simulator sickness questionnaires 254
  Test with a variety of users and as many as you can 254
  … 255
  Test with users who have set their personal profile 255
  … 255
  Test in different display modes 255
Summary 256
Part 5 Advanced Rift Integrations 259

11 Using the Rift with Java and Python 261
11.1 Using the Java bindings 262
  Meet our Java binding: JO'VR 264
  The Jocular-examples project 268
  … 270
  … 284
11.2 Using the Python bindings 286
  Meet our Python binding: PyOVR 287
  … 287
  The pyovr-examples project 287
  … 287
  … 297
11.3 Working with other languages 298
Summary 299
|
12 Case study: a VR shader editor 300
12.1 The starting point: Shadertoy 301
12.2 The destination: ShadertoyVR 303
12.3 Making the jump from 2D to 3D 303
  … 303
  … 305
  … 307
  … 307
  … 308
  Windowing and UI libraries 310
12.4 … 312
  Supporting the Rift in Qt 313
  Off-screen rendering and input processing 320
12.5 Dealing with performance issues 321
12.6 Building virtual worlds on the GPU 324
  Raycasting: building 3D scenes one pixel at a time 325
  Finding the ray direction in 2D 327
  Finding the ray direction in VR 328
  Handling the ray origin: stereopsis and head tracking 330
  Adapting an existing Shadertoy shader to run in ShadertoyVR 331
Summary 332
|
13 Augmenting virtual reality
13.1 Real-world images for VR: panoramic photography 334
  … 335
  … 336
  Photo spheres...in space! 338
13.2 Using live webcam video in the Rift 340
  Threaded frame capture from a live image feed 342
  … 346
  Proper scaling: webcam aspect ratio 347
  Proper ranging: field of view 348
  … 348
13.3 … 350
  Stereo vision in our example code 351
  Quirks of stereo video from inside the Rift 352
13.4 The Leap Motion hand sensor 353
  Developing software for the Leap Motion and the Rift 355
  The Leap, the Rift, and their respective coordinate systems 356
  Demo: integrating Leap and Rift 357
Summary 366
Appendix A Setting Up The Rift In A Development Environment 367
Appendix B Mathematics And Software Patterns For 3D Graphics 381
Appendix C Suggested Books And Resources 390
Appendix D Glossary 394
Index 398