tinySceneGraph

tinyScenegraph's v-ray bindings


Friends of tinySG,

With yafaray and embree, tinySG already had two ray tracer plugins years ago. However, the new v-ray plugin takes path tracing in tinyScenegraph scenes to the next level. Using a full-featured SDK makes things so much easier and more comfortable that it took me only about three weeks to integrate v-ray as a plugin into the tinySG scene editor, csgEdit.

Basic architecture

tinyScenegraph's plugin registry enables plugins to cooperate and communicate with each other. Plugins may provide their own user interfaces and access core scene control in csgEdit. This allows the v-ray plugin to access scene geometry, lighting information and material definitions of any loaded scene. It can also communicate with tinyCoiffeur to control hair/fur creation, define hair simulation parameters, trigger the simulator and fetch the results. This makes it fairly simple to enrich any scene with fur or hair without the hair being part of the originally loaded scene.
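
The core of this cooperation is event-based. Here is a much simplified sketch of the idea — the real registry does considerably more, and all names below are illustrative shorthand, not tinySG's actual API:

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Illustrative sketch of a plugin event registry: plugins subscribe to
// named scene events and get called back whenever the core (or another
// plugin) emits one. Class and method names are made up for this post.
class PluginRegistry {
public:
    using Handler = std::function<void(const std::string& payload)>;

    // A plugin registers interest in an event type, e.g. "scene.updated".
    void subscribe(const std::string& event, Handler h) {
        handlers_[event].push_back(std::move(h));
    }

    // Emitting an event notifies every subscriber of that event type.
    void emit(const std::string& event, const std::string& payload) const {
        auto it = handlers_.find(event);
        if (it == handlers_.end()) return;
        for (const auto& h : it->second) h(payload);
    }

private:
    std::map<std::string, std::vector<Handler>> handlers_;
};
```

In this picture, the v-ray plugin would subscribe to scene-update events and patch its v-ray scene inside the handler, while the tinyCoiffeur communication runs over the same mechanism.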

The v-ray plugin may open its own render window, synchronise its virtual camera with the camera of the regular viewer, pause rendering, assign resources such as CPU cores to v-ray and set other parameters specific to the ray tracer. The raytrace viewer can display several render elements, showing not only the final ray-traced image but also its components, such as diffuse or specular lighting, vertex normals, depth, transparency, etc. Picking inside the v-ray render window first resolves to v-ray scene nodes, and from there to the original tinySG scenegraph nodes. Any scene update on the tinySG side emits an event to all interested plugins; the v-ray plugin catches the event and live-updates the v-ray scene accordingly.
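
The picking path boils down to a bidirectional mapping between v-ray node ids and scenegraph nodes — the same mapping the live-update also needs, only traversed in the other direction. Again a simplified illustration with made-up names, not the actual plugin code:

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>

// Illustrative stand-in for a tinySG scenegraph node.
struct SGNode { std::string path; };

// Sketch of the two-step picking lookup: a pick in the render window
// yields a v-ray node id, which maps back to the originating tinySG
// node; a scene update on a tinySG node maps forward to the v-ray id
// that has to be refreshed.
class NodeLink {
public:
    void link(std::uint64_t vrayId, SGNode* sgNode) {
        toSG_[vrayId] = sgNode;
        toVray_[sgNode] = vrayId;
    }
    // Picking direction: v-ray node id -> scenegraph node.
    SGNode* pick(std::uint64_t vrayId) const {
        auto it = toSG_.find(vrayId);
        return it == toSG_.end() ? nullptr : it->second;
    }
    // Update direction: scenegraph node -> v-ray node id.
    bool vrayIdFor(SGNode* sgNode, std::uint64_t& out) const {
        auto it = toVray_.find(sgNode);
        if (it == toVray_.end()) return false;
        out = it->second;
        return true;
    }
private:
    std::unordered_map<std::uint64_t, SGNode*> toSG_;
    std::unordered_map<SGNode*, std::uint64_t> toVray_;
};
```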

New features

  • Hair / fur: v-ray adds a third way to render fur, next to the previously implemented OpenGL-based linestrip and textured-shell approaches. While the v-ray generated hair looks much more realistic, it is also considerably slower to compute.
    (Images: OpenGL-based fur, generated via instanced rendering and texture shells; OpenGL-based hair, generated with tessellation shaders; v-ray rendered fur, based on the tinyCoiffeur hair simulation.)

  • Global Illumination: Using a path tracer, all light arriving at a surface is taken into account, no matter whether it comes directly from light sources or is reflected/refracted off other objects in the scene. The images below were rendered in a Cornell box (red wall on the left, green wall on the right), using a fairly diffuse material:
    (Images: path tracing using reflections only; path tracing with GI enabled; GI render element only, i.e. the GI contribution.)

  • Reflections: Ray tracing allows for physically correct reflections without fake implementations, texture tricks, etc. It is easy to add mirrors to a scene, and metals look much more realistic than OpenGL approximations via environment maps or light probes.

  • Advanced features: Looking ahead, many advanced effects such as caustics, refraction, subsurface scattering or displacement mapping are within reach. The main goal is to implement a Blender-like shading tree editor so that these effects can be parameterised and combined easily. As of today, there is no equivalent in the OpenGL renderer, which is why such effects are hard to set up in csgEdit.
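
To give a feel for the "all light is taken into account" point: a path tracer estimates the light arriving at a surface point by Monte Carlo sampling directions over the hemisphere. The toy estimator below has nothing to do with v-ray's internals; it merely shows why such an integrator naturally collects light from all directions. For a constant incoming radiance L it converges to the analytic irradiance pi * L:

```cpp
#include <random>

// Toy Monte Carlo estimate of the diffuse irradiance at a surface
// point under a constant incoming radiance L. Directions are sampled
// uniformly over the hemisphere (pdf = 1 / (2*pi)); for uniform
// hemisphere sampling, cos(theta) is itself uniform on [0, 1], so no
// explicit direction vector is needed here.
double estimateIrradiance(double L, int samples) {
    const double kPi = 3.14159265358979323846;
    std::mt19937 rng(42);  // fixed seed: deterministic toy example
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    double sum = 0.0;
    for (int i = 0; i < samples; ++i) {
        double cosTheta = uni(rng);
        // Estimator: L * cos(theta) / pdf = L * cos(theta) * 2*pi.
        sum += L * cosTheta * 2.0 * kPi;
    }
    return sum / samples;  // converges to pi * L
}
```

In a real path tracer, L is of course not constant but obtained by recursively tracing rays — which is exactly why reflected and refracted light from other objects contributes automatically.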

Intermediate results, July 2022

Initial efforts focused on supporting a complete set of PBR material parameters and various types of light sources (point, sphere, spot, directional, area and dome). The plugin takes the geometry of various scene shape nodes, mainly indexed tri-, quad- and facesets, as well as material definitions from csgAdvancedMaterial nodes, and converts them into v-ray plugin instances.
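
One small, representative piece of that conversion is triangulating indexed quadsets, since a ray tracer typically consumes plain triangle lists. A simplified sketch, not the actual plugin code:

```cpp
#include <array>
#include <vector>

// Illustrative conversion step: turn an indexed quadset (four vertex
// indices per face) into the triangle index list a ray tracer consumes.
// Each quad (a, b, c, d) is fan-split into (a, b, c) and (a, c, d);
// this assumes planar, convex quads, which tessellated scene shapes
// usually provide.
std::vector<std::array<int, 3>>
quadsToTriangles(const std::vector<std::array<int, 4>>& quads) {
    std::vector<std::array<int, 3>> tris;
    tris.reserve(quads.size() * 2);
    for (const auto& q : quads) {
        tris.push_back({q[0], q[1], q[2]});
        tris.push_back({q[0], q[2], q[3]});
    }
    return tris;
}
```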

Once the basics were in place, fur came next, as it is the hardest part to render using classic OpenGL, and it is useful for many subtle effects besides rendering teddy bears.
To generate fur, the v-ray plugin teams up with the tinyCoiffeur hair simulator plugin to create the base hair geometry and add it to the v-ray scene. The hair geometry is then outfitted with a BRDFHair3 material.

tinyCoiffeur "fluff mode"

A new extension to tinyCoiffeur makes it possible to create quite realistic, fluffy fabrics. The fluff can be adjusted with respect to length, thickness, curliness, color and density, as shown in the images on the right.

tinyCoiffeur does not simulate the fluff based on physical material parameters as it does for regular hair/fur. Instead, it applies stochastic deformations to tessellated surface normals, honoring angle, length and thickness limits defined by the user.
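
The idea can be sketched roughly like this — a toy version with illustrative names, not the actual tinyCoiffeur code:

```cpp
#include <cmath>
#include <random>
#include <vector>

struct Vec3 { double x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static Vec3 normalize(const Vec3& v) {
    double l = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / l, v.y / l, v.z / l };
}

struct FluffStrand { Vec3 dir; double length; double thickness; };

// Toy "fluff mode": each tessellated (unit) surface normal is deflected
// by a random angle within the user limit, and the strand gets a random
// length and thickness inside the configured bounds. No physics involved.
std::vector<FluffStrand> growFluff(const std::vector<Vec3>& normals,
                                   double maxAngle, double minLen, double maxLen,
                                   double minThick, double maxThick,
                                   unsigned seed = 7) {
    const double kPi = 3.14159265358979323846;
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    std::vector<FluffStrand> out;
    out.reserve(normals.size());
    for (const Vec3& n : normals) {
        // Crude tangent frame around the normal.
        Vec3 helper = std::fabs(n.z) < 0.9 ? Vec3{0, 0, 1} : Vec3{1, 0, 0};
        Vec3 t = normalize(cross(n, helper));
        Vec3 b = cross(n, t);
        // Random deflection within the user-defined angle limit.
        double tilt = uni(rng) * maxAngle;
        double azim = uni(rng) * 2.0 * kPi;
        double ct = std::cos(tilt), st = std::sin(tilt);
        double ca = std::cos(azim), sa = std::sin(azim);
        Vec3 dir { ct * n.x + st * (ca * t.x + sa * b.x),
                   ct * n.y + st * (ca * t.y + sa * b.y),
                   ct * n.z + st * (ca * t.z + sa * b.z) };
        out.push_back({ dir,
                        minLen + uni(rng) * (maxLen - minLen),
                        minThick + uni(rng) * (maxThick - minThick) });
    }
    return out;
}
```

The real implementation additionally handles curliness and density, but the principle is the same: randomness within user-defined bounds instead of a physical simulation.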

Future plans

The v-ray plugin is really an intermediate step in a long-running transition to a Vulkan renderer that is supposed to become the new standard interactive renderer core. Once that one is available, the ray tracing hardware on modern GPUs immediately becomes accessible for all kinds of algorithms, not necessarily focused on rendering.

Keep rendering,
Christian



Acknowledgements and links:

  • All renderings were created with tinySG's scene editor, csgEdit, using the new v-ray plugin to render the images.
  • The bearskin hat was modeled in Blender, imported into tinySG, converted to a hair simulation using the tinyCoiffeur plugin and finally rendered with v-ray 5.14.
  • The blue dress, trenchcoat (without fur) and men's shirt datasets were simulated in Assyst Vidya, imported into tinySG, converted into v-ray native data and then rendered with v-ray.
  • All HDRI environment maps are by courtesy of Poly Haven.
  • All hair/fur geometry was generated with tinyCoiffeur and rendered with v-ray.
  • The human models are by courtesy of Sooii GmbH.



Copyright by Christian Marten, 2009-2022
Last change: 30.07.2022