OpenVR support for HTC Vive

The most interesting new technology released to the graphics community in 2016 is probably virtual reality hardware. Foremost, VR headsets like the Oculus Rift or HTC Vive became affordable, offer decent quality and are a truly astonishing experience when you try them for the first time. The future promises to get even more exciting as the hardware matures and display resolutions increase further. Augmented reality is about to enter the broader market as well, opening up an additional, wide field of new applications.

tinySG's scene editor, csgEdit, is capable of loading plugins and provides access to 3D scene data, the rendering core and its UI via an SDK. A new member in the family of plugins is a binding to OpenVR, the main VR programming interface for the HTC Vive (which interfaces with the Oculus Rift as well).
This article describes, from a bird's-eye perspective, the integration of OpenVR into tinySG, as well as some of the design decisions and integration pitfalls. You'll also find comments on performance and hardware configurations. If you are more interested in a programming tutorial on OpenVR, I'd recommend downloading the OpenVR SDK and taking a look at the sample programs.
OpenVR

OpenVR is an API for accessing VR hardware, developed mainly by Valve. It handles sending frames to the HMD, receiving positional data from the tracking system, and managing input data from controllers, a tracked camera or, in general, tracked devices. Furthermore, you can control a room-scale play area and utilise rendering features like overlays. OpenVR splits all these tasks across several C++ interface objects, like IVRSystem, IVRCompositor, IVRChaperone and IVROverlay.
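To give a rough idea of how these objects are obtained, the sketch below initialises the runtime and fetches the interface pointers. This is plain OpenVR usage, not tinySG-specific code, and error handling is kept to a minimum:

    #include <openvr.h>
    #include <cstdio>

    int main() {
        // Initialise the OpenVR runtime as a "scene" application, i.e. one
        // that renders and submits frames to the HMD:
        vr::EVRInitError initError = vr::VRInitError_None;
        vr::IVRSystem* hmd = vr::VR_Init(&initError, vr::VRApplication_Scene);
        if (initError != vr::VRInitError_None) {
            std::printf("VR_Init failed: %s\n",
                        vr::VR_GetVRInitErrorAsEnglishDescription(initError));
            return 1;
        }

        // The other interface objects are reached through global accessors:
        vr::IVRCompositor* compositor = vr::VRCompositor();  // frame submission
        vr::IVRChaperone*  chaperone  = vr::VRChaperone();   // play area bounds
        vr::IVROverlay*    overlay    = vr::VROverlay();     // 2D overlays

        // ... render loop would go here ...

        vr::VR_Shutdown();
        return 0;
    }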
Tracking / polling

OpenVR uses a polling approach for most things you want to do: events, controller state and device poses are all fetched actively by the application, typically once per frame.
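As a minimal illustration (independent of tinySG), an event and controller polling pass could look roughly like this:

    // Drain all pending OpenVR events for this frame:
    vr::VREvent_t event;
    while (vr::VRSystem()->PollNextEvent(&event, sizeof(event))) {
        switch (event.eventType) {
        case vr::VREvent_TrackedDeviceActivated:
            // a controller or tracker was switched on / connected
            break;
        case vr::VREvent_ButtonPress:
            // event.trackedDeviceIndex tells which device,
            // event.data.controller.button which button was pressed
            break;
        default:
            break;
        }
    }

    // Controller state (buttons, trigger, touchpad axes) is polled as well:
    vr::TrackedDeviceIndex_t controllerIndex =
        vr::VRSystem()->GetTrackedDeviceIndexForControllerRole(vr::TrackedControllerRole_RightHand);
    vr::VRControllerState_t state;
    if (vr::VRSystem()->GetControllerState(controllerIndex, &state, sizeof(state))) {
        // state.ulButtonPressed / state.rAxis[...] hold the current values
    }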
Submitting frames and lens distortion

The main render loop basically does the following (callback logic and camera stuff omitted...):

    while (vrSessionActive) {
        // Trigger tinySG scene rendering into FBOs/textures, for both eyes:
        vrViewer->RenderFrame();

        // Submit frame textures to OpenVR compositor:
        vrViewer->Present();

        // Poll all pending events from OpenVR and schedule events to other
        // threads for processing:
        HandleInput();

        // Fetch new poses for tracked devices. This function will block if the
        // loop runs ahead of the current frame's timestamp:
        vrViewer->UpdateHMDMatrixPose();
    }

The important thing to note is that all tracking poses should be as recent as possible to avoid any lag. You can even use exact timestamps and motion vectors to extrapolate the tracking of devices to the right point in time.
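For the extrapolation part, the OpenVR headers suggest computing the remaining time until the frame's photons actually reach the eye and asking for poses predicted to that moment. A sketch of that calculation (plain OpenVR, not tinySG's actual code) could look like this:

    // How long until the image currently being rendered is actually displayed?
    float secondsSinceLastVsync = 0.f;
    vr::VRSystem()->GetTimeSinceLastVsync(&secondsSinceLastVsync, nullptr);

    const float displayFrequency = vr::VRSystem()->GetFloatTrackedDeviceProperty(
        vr::k_unTrackedDeviceIndex_Hmd, vr::Prop_DisplayFrequency_Float);
    const float frameDuration  = 1.f / displayFrequency;
    const float vsyncToPhotons = vr::VRSystem()->GetFloatTrackedDeviceProperty(
        vr::k_unTrackedDeviceIndex_Hmd, vr::Prop_SecondsFromVsyncToPhotons_Float);

    const float predictedSecondsFromNow =
        frameDuration - secondsSinceLastVsync + vsyncToPhotons;

    // Ask for poses extrapolated to that point in time:
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    vr::VRSystem()->GetDeviceToAbsoluteTrackingPose(
        vr::TrackingUniverseStanding, predictedSecondsFromNow,
        poses, vr::k_unMaxTrackedDeviceCount);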
OpenVR takes care of fiddling around with the messy lens distortion of your HMD. Although it is possible to set up the distortion yourself, tinySG never made the effort, as the built-in distortion works pretty well.
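The distortion is applied by the compositor when the per-eye textures are submitted. What the Present() call above boils down to might look roughly like this (again a sketch against the plain OpenVR API; the texture IDs are assumed to come from tinySG's per-eye FBOs):

    // Hand the per-eye colour textures (OpenGL texture IDs) to the compositor,
    // which applies the lens distortion and displays them on the HMD:
    vr::Texture_t leftEye  = { (void*)(uintptr_t)leftEyeTextureId,
                               vr::TextureType_OpenGL, vr::ColorSpace_Gamma };
    vr::Texture_t rightEye = { (void*)(uintptr_t)rightEyeTextureId,
                               vr::TextureType_OpenGL, vr::ColorSpace_Gamma };

    vr::VRCompositor()->Submit(vr::Eye_Left,  &leftEye);
    vr::VRCompositor()->Submit(vr::Eye_Right, &rightEye);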
The ViveVR plugin

All OpenVR code is implemented in the ViveVR plugin for csgEdit. VR must avoid any lag at all cost and maintain a very high frame rate; 90 Hz is imperative to avoid simulator sickness. You just cannot afford to do any application tasks in between the render calls. The ViveVR plugin therefore creates a separate render thread running the minimalistic render loop shown above, focusing only on getting information from the tracking system, updating controller positions and the camera transformation accordingly, and rendering the scene as fast as possible. Everything else is done in other threads.
Synchronisation support between threads is limited, but flexible, in tinySG. The easiest way to synchronise application logic with the render thread is to provide a synchronous callback, executed within the VR render loop right after the render traversal is completed, and swap the scene's root node there (see the sketch below). This basically double-buffers all scene data.

An alternative would be to utilise tinySG's multi-CPU support, basically managing a separate set of OpenGL objects per GPU (even if there is only one GPU!). This way, you'd have only one copy of the scene graph in main memory, but each scene node maintains two sets of GPU objects (VAOs, textures, etc.), one for each (virtual) GPU. The application logic operates on the C++ scene objects and triggers GPU updates on buffers associated with, say, pipe 0, while the render loop operates on the other set associated with pipe 1. Then simply swap the pipe ID in the render loop to switch between GPU objects when all updates for the next frame are ready. This works because OpenGL contexts are shared in csgEdit.
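A minimal sketch of the root-swap idea is shown below; SceneNode, SetSceneRoot and the callback name are invented for illustration and do not correspond to the actual tinySG SDK:

    #include <atomic>
    #include <utility>

    // Two copies of the scene: one is rendered by the VR thread, the other is
    // modified by the application thread.
    SceneNode* renderRoot = nullptr;   // set up elsewhere
    SceneNode* updateRoot = nullptr;   // set up elsewhere
    std::atomic<bool> swapRequested{false};

    // Application thread: finish all updates on the hidden copy, then request
    // a swap. Real code must not touch updateRoot again until the swap happened.
    void FinishApplicationFrame() {
        swapRequested.store(true, std::memory_order_release);
    }

    // VR render thread: installed as the synchronous callback that runs right
    // after the render traversal of the current frame has completed.
    void PostTraversalCallback() {
        if (swapRequested.exchange(false, std::memory_order_acq_rel)) {
            std::swap(renderRoot, updateRoot);    // double-buffered scene data
            vrViewer->SetSceneRoot(renderRoot);   // hypothetical tinySG call
        }
    }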
An even simpler approach could traverse the scene twice and rely on OpenGL processing commands asynchronously: in the first traversal, just push data updates to GPU memory; in the second traversal, submit the actual render calls, in the hope that the data has arrived in the meantime. This may also be just good enough...
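Spelled out with invented traversal names, that idea boils down to:

    // Pass 1: push all pending data updates to GPU memory. Calls like
    // glBufferSubData or glTexSubImage2D return immediately; the driver
    // performs the copies asynchronously.
    scene->Traverse(TraversalMode::UploadOnly);   // hypothetical tinySG API

    // Pass 2: issue the actual draw calls, hoping the uploads queued in
    // pass 1 have finished by the time the GPU executes them.
    scene->Traverse(TraversalMode::DrawOnly);     // hypothetical tinySG API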
Other plugins using VR

A bunch of other plugins have emerged over the last two years that utilise the COM interface of ViveVR to display their content in a virtual, tracked environment:
tinyCoiffeur is a plugin designed to simulate hair. It provides tools like scissors, a brush or a dryer to style a haircut. While you can operate it with a spacemouse device on a normal 2D monitor, the real fun starts when entering a VR environment and taking the controllers into action: you see a virtual person in real-life dimensions sitting in front of you, have the simulation engine responding to your actions, and you can style her haircut using the Vive controllers. The best part is that nobody will complain even if you completely ruin the haircut :)
El Capitan is a demo building on top of images from a 3D VR experience by Mammut: #project360. tinySG takes 360° images of Mammut's climbing tours and imports them into a scene with multiple viewer positions, represented by cubemaps. A Python script listens for events from the Vive controllers and teleports the viewer between these locations. Technically, this application is interesting because there is only a single cubemap for each location, and not one per eye. The effect is nevertheless amazing, using just different projection matrices for the left and right eye to look into the cubemap.
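The per-eye matrices come straight from OpenVR. A rough sketch, where Mat4, ToMat4 and Inverse are assumed math helpers and headPose is the HMD's tracked device-to-world matrix:

    // Per-eye projection and eye-to-head offset, queried once from OpenVR:
    vr::HmdMatrix44_t projLeft  = vr::VRSystem()->GetProjectionMatrix(vr::Eye_Left,  0.1f, 100.f);
    vr::HmdMatrix44_t projRight = vr::VRSystem()->GetProjectionMatrix(vr::Eye_Right, 0.1f, 100.f);
    vr::HmdMatrix34_t eyeLeft   = vr::VRSystem()->GetEyeToHeadTransform(vr::Eye_Left);
    vr::HmdMatrix34_t eyeRight  = vr::VRSystem()->GetEyeToHeadTransform(vr::Eye_Right);

    // Each eye gets its own view-projection matrix from the tracked head pose,
    // while both eyes sample the very same cubemap:
    Mat4 viewProjLeft  = ToMat4(projLeft)  * Inverse(headPose * ToMat4(eyeLeft));
    Mat4 viewProjRight = ToMat4(projRight) * Inverse(headPose * ToMat4(eyeRight));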
Finally, ViveVR can capture any scene loaded into csgEdit
and send it to a connected headset for viewing in virtual reality.
VR hardware requirements

So, is your PC fast enough to run your own VR experiments? Before buying the Vive I was pretty concerned that my hardware would fail to render scenes without causing motion sickness. Valve provides a tool that evaluates your system's performance and reports whether it is fast enough or not. My PC with an AMD Hawaii chip and a hexacore Sandy Bridge CPU officially failed to meet the minimum requirements.
If you are in a similar situation, don't worry! While such an ancient hardware configuration stutters quite a bit when running scenes like the Vive home living room, it is still fast enough to experiment with the technology or to run simpler applications like The Lab. I also used a laptop with an nVidia GeForce 1050, and it worked well enough for small scenes.
Outlook

There is much more to explore. nVidia drivers offer an interesting OpenGL extension called simultaneous multi-projection, which basically avoids rendering the same scene twice, once for the left and once for the right eye's perspective. Instead, you simply set up the extension, provide two projection matrices, and deal with a single pass in the shader.

The same strategy should be possible even without an nVidia-specific extension: you could provide two projection matrices in a uniform buffer object, render your scene with an instance count of 2, and manage two render targets in parallel. Then simply direct the first instance to the left eye target and the second instance to the right eye target.
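A rough sketch of how this could look with plain OpenGL is given below. It assumes a framebuffer with a two-layer texture array attached as a layered colour target, relies on GL_ARB_shader_viewport_layer_array so the vertex shader may write gl_Layer, and the buffer/FBO handles are assumed to be created elsewhere; none of this is existing tinySG code:

    // Vertex shader: pick the projection matrix by instance ID and route the
    // instance to the matching layer of the layered render target.
    const char* stereoVertexShader = R"(
        #version 450
        #extension GL_ARB_shader_viewport_layer_array : require

        layout(std140, binding = 0) uniform EyeMatrices {
            mat4 viewProj[2];                  // [0] = left eye, [1] = right eye
        };
        layout(location = 0) in vec3 inPosition;

        void main() {
            gl_Layer    = gl_InstanceID;       // instance 0 -> left layer, 1 -> right layer
            gl_Position = viewProj[gl_InstanceID] * vec4(inPosition, 1.0);
        }
    )";

    // C++ side: upload both eye matrices into one UBO, then draw every mesh
    // once with an instance count of 2 instead of issuing two full passes.
    glBindBufferBase(GL_UNIFORM_BUFFER, 0, eyeMatrixUbo);
    glBindFramebuffer(GL_FRAMEBUFFER, layeredStereoFbo);
    glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr, 2);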
With upcoming mixed reality hardware, augmenting real environments with virtual content should become possible. And the upcoming Vive Pro is supposed to have a stereo camera, which might be used for similar effects. Interesting times these are...

Keep rendering,
Copyright by Christian Marten, 2018. Last change: 17.03.2018