tinySG meets voxels

Friends of tinySG,

some of the most interesting rendering algorithms have finally been ported from csg to tinySG: volume rendering and vector field visualisation. The images look quite different from the usual polygon-based visualisation you are familiar with from games or engineering/design software.
Datasets can be explored easily in tsgEdit via transfer functions and
intersection planes. The image to the right shows csgEdit's volume node
inspector. On its left side, controls determine which texture units are to
be used for volume data and transfer function lookup tables, as well as
which render mode to use. The middle section provides access to pre-defined
and previously saved transfer functions. On the right side, the drag editor
shows a histogram of the volume and allows control points of the transfer
function to be inserted or dragged.
Several built-in functions make it easy to quickly find interesting spots
in the volume for closer inspection.
Rendering just axis-aligned slice planes allows browsing through the voxels much like flipping through a flip-book. Though not supported by csgEdit, the renderer naturally supports arbitrary slice planes, too. Using the new Dicom import filter, it is possible to stack 2D medical images, generated by e.g. ultrasonic or magnetic resonance devices, into a 3D volume. The screenshot to the right shows a series of 39 MRT greyscale images, imported into a 3D texture and rendered by a csgVolume node with the "frozen fire" transfer preset.

Volumetric datasets become truly fascinating when viewed on large-scale stereographic projection systems. The eqcsg cluster layer for tinySG is capable of rendering these datasets on active as well as passive stereo, multi-projector display walls.

Upcoming applications include simulating atmospheric effects like volumetric fog or clouds, as well as visualising tolerances of engineering samples or CFD data. Hopefully, I'll find enough time and good datasets to give you an update on these ideas in the future. Continue reading with the journey inside the human body.
Keep rendering,

Acknowledgements: The Skull, Engine and Bonsai datasets were downloaded from some web page in the past. Unfortunately, I can neither remember which page that was, nor whom to credit for them. The Dicom data has been kindly provided by the University of Hannover, Germany; it is an anonymised dataset from a magnetic resonance checkup. The Mummy dataset was generated by the British Museum, London, and was part of the exhibition "Mummy: The Inside Story".