tinySceneGraph

tinyParticles


Friends of tinySG,

Particles are everywhere. Have you seen the demo trailers for Unreal Engine 4, with all the sparks dancing around the dark lord in his ice cave? Have you ever wondered how people manage to create realistic-looking smoke or fire? What is the best way to simulate the behaviour of cloth, fluids or hair?

Most often, if not always, physics is the key to realism. If a smoking fire does not behave the way you expect it to, it does not look real. The technique behind the scenes is usually based on a particle system: hundreds or thousands of little points in space, following the laws of physics.

Introduction

The image above shows 400 particles in an n-body simulation. Each particle has a mass and a velocity vector, and is bound to every other particle by gravity. Every animation frame, time advances a bit and particles move a small distance according to their velocity. Gravity is the driving force behind velocity changes, as it accelerates particles towards one another.
Thanks to Sir Isaac Newton, we can calculate the force on a given particle P1 caused by another particle P2 like this:
   F = G * mass(P1) * mass(P2) / distance(P1, P2)²
   
G is the gravitational constant, a very, very small number. The locations and masses are given as input parameters, so the formula above provides the attracting forces for each particle.
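As a quick sanity check, the formula translates directly into code. Here is a minimal sketch in Python (function and parameter names are my own, not part of tinyParticles):

```python
import math

G = 6.674e-11  # gravitational constant, in m^3 / (kg * s^2)

def gravity_force(m1, x1, y1, m2, x2, y2):
    """Magnitude of the attracting force between two point masses (2D)."""
    d = math.hypot(x2 - x1, y2 - y1)
    return G * m1 * m2 / (d * d)

# Two 1 kg masses placed 1 m apart attract each other with a force
# of exactly G newtons.
print(gravity_force(1.0, 0.0, 0.0, 1.0, 1.0, 0.0))
```

The force acts along the line connecting the two particles; the direction is recovered later when accumulating force vectors per particle.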

Now, using Newton's second law of motion, the acceleration a of a particle can be calculated from the force and the particle's mass:

      F = m*a   <=>   a = F/m
   
So, we know a particle's acceleration at a given point in time, but we need to get hold of its position for rendering. Newton helps us out again: acceleration is the change of velocity, and velocity is the change of position. Mathematically speaking, the function describing a particle's acceleration at a given point in time t is the second-order derivative of the function providing the particle's location at t.

To get the location function, we need to integrate the acceleration function twice. Unfortunately, nobody has yet discovered how to do this analytically for three or more particles. But: we have a computer and can simply solve the equations numerically, advancing one time step after the other from a given start configuration of particles. Runge-Kutta methods come to mind for solving such initial value problems. This is what particle systems are all about.
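In code, one such numerical time step can be as simple as the following sketch (a semi-implicit Euler step in Python, 2D, no softening; far simpler than a proper Runge-Kutta integrator, but it shows the idea, including the O(n²) pairwise loop):

```python
def nbody_step(particles, dt, G=6.674e-11):
    """Advance all particles by one time step.

    Each particle is a dict with 'm' (mass), 'pos' and 'vel' (2D lists).
    First accumulate the pairwise gravitational accelerations,
    then update velocities and finally positions.
    """
    n = len(particles)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = particles[j]['pos'][0] - particles[i]['pos'][0]
            dy = particles[j]['pos'][1] - particles[i]['pos'][1]
            d2 = dx * dx + dy * dy
            d = d2 ** 0.5
            # acceleration of i towards j: G * m_j / d^2, along (dx, dy)/d
            a = G * particles[j]['m'] / d2
            acc[i][0] += a * dx / d
            acc[i][1] += a * dy / d
    for p, a in zip(particles, acc):
        p['vel'][0] += a[0] * dt
        p['vel'][1] += a[1] * dt
        p['pos'][0] += p['vel'][0] * dt
        p['pos'][1] += p['vel'][1] * dt

# Two equal masses at rest start accelerating towards each other.
ps = [{'m': 1e10, 'pos': [0.0, 0.0], 'vel': [0.0, 0.0]},
      {'m': 1e10, 'pos': [10.0, 0.0], 'vel': [0.0, 0.0]}]
nbody_step(ps, dt=0.1)
```

Calling the step function once per animation frame yields the simulation loop described above.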

The chart illustrates CPU scalability of the n-body simulation on two different machines. The score is measured in "simulated particle moves per second", so higher values are better.

The smaller one is a laptop with a quad-core i7-4700MQ running at 2.4 GHz; the larger one a six-core i7-3960X running at 3.3 GHz. Both processors have hyper-threading enabled.

The optimised code variant is fully parallelised for both the integration and relaxation steps, uses a producer-consumer pattern, and keeps persistent thread pools instead of creating new threads every simulation step. In the diagram, the orange 4-core sample seems to be a glitch in the measurements. In general, it is remarkable how well hyper-threading works for this payload if the code is optimised. Even the quad-core delivers almost twice the performance when adding another four threads to run on the four virtual cores (grey line). Performance starts to drop if the available cores are over-subscribed with threads.
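The "persistent pool" idea is easy to sketch: create the workers once, then hand each frame's chunks of particles to the same pool instead of spawning threads per step. A minimal Python illustration (not the actual tinyParticles code, which is native multi-threaded C++; the toy integration step just drifts particles along their velocity):

```python
from concurrent.futures import ThreadPoolExecutor

def drift_chunk(chunk, dt):
    """Toy integration step: move each particle along its velocity."""
    for p in chunk:
        p['pos'] = [x + v * dt for x, v in zip(p['pos'], p['vel'])]

class Simulation:
    def __init__(self, particles, workers=8):
        self.particles = particles
        self.workers = workers
        # Persistent pool: threads are created once and reused
        # for every simulation step.
        self.pool = ThreadPoolExecutor(max_workers=workers)

    def step(self, dt=0.01):
        # Split the particle list into one chunk per worker.
        chunks = [self.particles[i::self.workers] for i in range(self.workers)]
        # map() blocks until all chunks of this step are processed.
        list(self.pool.map(drift_chunk, chunks, [dt] * self.workers))

sim = Simulation([{'pos': [0.0, 0.0], 'vel': [1.0, 0.0]} for _ in range(1000)])
sim.step()
```

In native code the same pattern avoids the cost of thread creation and teardown on every frame, which is exactly what the measurements above reward.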

Constraints

The n-body problem highlighted above is a pretty simple simulation (although its complexity is O(n²)). Particles move without any constraints, driven by gravity alone. However, you may want to simulate long hair instead of planets. In that case, you can think of a hair as a polyline made of n points (or particles). There are similarities and differences to the n-body simulation:
  • Forces: Wind and gravity influence the particles, like in the n-body case. They move the particles.
  • Constraints: Particles are not entirely free to move wherever they like, but need to maintain the distance to their neighbours in the polyline. Some particles are even fixed (at the hair's root) and don't move at all. Thus, particles are constrained.
The general idea for animating a constrained particle system is a two-step approach: move the particles according to the forces in the first step, then correct their positions according to all constraints in the second. For the hair case, move pairs of particles so that their initial distance is restored. The first step is called integration, the second relaxation.
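The two-step loop for a hair-like polyline can be sketched in a few lines of Python (names and parameters are illustrative, following the scheme described above; Verlet integration under gravity, then iterative distance relaxation):

```python
def animate_hair(points, prev, rest_len, gravity=(0.0, -9.81), dt=0.01,
                 fixed=(0,), relax_iters=10):
    """One animation step: integration first, then relaxation.

    points / prev: current and previous 2D positions of the polyline.
    fixed: indices of pinned particles (e.g. the hair root).
    """
    # Step 1 -- integration: move every free particle under gravity
    # using a Verlet step based on the two stored positions.
    for i in range(len(points)):
        if i in fixed:
            continue
        x, y = points[i]
        px, py = prev[i]
        prev[i] = (x, y)
        points[i] = (2 * x - px + gravity[0] * dt * dt,
                     2 * y - py + gravity[1] * dt * dt)
    # Step 2 -- relaxation: restore the rest distance between neighbours
    # by moving each pair towards / away from each other.
    for _ in range(relax_iters):
        for i in range(len(points) - 1):
            (x1, y1), (x2, y2) = points[i], points[i + 1]
            dx, dy = x2 - x1, y2 - y1
            d = (dx * dx + dy * dy) ** 0.5
            corr = (d - rest_len) / d / 2
            if i in fixed:
                # Fixed end stays put; the free end takes the full correction.
                points[i + 1] = (x2 - dx * corr * 2, y2 - dy * corr * 2)
            else:
                points[i] = (x1 + dx * corr, y1 + dy * corr)
                points[i + 1] = (x2 - dx * corr, y2 - dy * corr)

# Three particles in a row; the first one is the fixed hair root.
pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
prev = [p for p in pts]
animate_hair(pts, prev, rest_len=1.0)
```

Called once per frame, this makes the free end of the strand swing down under gravity while the segment lengths stay (approximately) constant.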

So far, we talked about points (gravity) and polylines (hair). A natural extension are meshes. Meshes are simply "2D-hairs", introducing more constraints on a given particle. Possible applications are deformations of soft bodies, the movement of cloth and the like.

tinyParticles framework

Particle systems often share a common approach to integration over time and animation, but differ in the physics that forms the foundation of the specific simulation. tinyParticles therefore provides a framework for UI components (an XML description of a simulation's parameters), time step management and multi-threading support. A particular simulation provides objects implementing the specific math for the integration and relaxation steps.

To aid UI tasks, the framework comes as a plugin for the tinySG scene editor, creating regular scene graph nodes. The particle system thus operates on regular scenes and receives a callback once per frame to apply any updates the simulation has worked out. csgEdit's rendering takes over from there to provide user interaction, frame post-processing, stereoscopic rendering, multi-GPU support, and driving large projection systems or even PC clusters.

tinyParticles provides its services as base classes. To use them, a simulation derives its objects from these classes. For example, the base class TPParticle manages a particle's mass and its two last positions. Since the time difference between two steps is known, the particle's two positions implicitly define its velocity as well. The system itself uses the last two positions to provide a Verlet integration scheme for particle movement.
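The core of such a Verlet scheme is tiny. Here is a Python sketch of how two stored positions replace an explicit velocity (the function name is mine, not the TPParticle API; shown in 1D for brevity):

```python
def verlet_step(pos, prev_pos, accel, dt):
    """Advance one particle: x(t+dt) = 2*x(t) - x(t-dt) + a*dt^2.

    The velocity never appears explicitly -- it is implied by the
    difference between the two stored positions.
    """
    new_pos = 2 * pos - prev_pos + accel * dt * dt
    return new_pos, pos  # new "current" and new "previous" position

# A particle falling from rest under constant acceleration a.
# Seeding the previous position at x(-dt) = 0.5*a*dt^2 makes the
# scheme reproduce x(t) = 0.5*a*t^2 exactly for constant a.
a, dt = 9.81, 0.01
pos, prev = 0.0, 0.5 * a * dt * dt
for _ in range(100):  # simulate one second
    pos, prev = verlet_step(pos, prev, a, dt)
print(pos)  # ~4.905, i.e. 0.5 * 9.81 * 1.0^2
```

Besides its simplicity, Verlet integration plays nicely with the relaxation step above: constraints are enforced by moving positions directly, and the implied velocity adjusts automatically.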

Objects are free to use the system's particle management services or to handle selected parts on their own. They may decide to either use the CPU worker threads provided by the framework, or implement a kernel running on the GPU.

The image to the left shows the tinyWeather plugin as an application using the framework, animating 10,000 particles at 100+ Hz on a GT 750M GPU. The plugin connects to a web weather service like OpenWeatherMap.org and retrieves current wind information and forecast data for a selected area. The wind data (speed and direction) is downloaded into an RG float texture to allow easy access from shaders and to benefit from hardware interpolation. Particle data like position or particle age is maintained in a GL buffer object to have both read and write access. Positions are updated in the vertex shader during regular rendering, based on data from the wind texture. Next, a geometry shader generates lines from particle positions and wind directions. Finally, the fragment shader encodes the wind speed as colors in HSV color space.
The plugin was inspired by the virtual Volvo Ocean Race. The map shows real trajectory data east of Madagascar for both virtual and real yachts (click image to see a larger version).

If you are interested in GPU-driven simulations, stay tuned for an upcoming tech-guide about compute shaders on tinysg.de.

Keep rendering,
Christian



Acknowledgements and links:

  • Thomas Jakobsen has written an excellent paper on implementing particle systems. Search for his article called Advanced Character Physics.
  • See the Wikipedia article on gravitation for the mathematical background of the n-body simulation.
  • For those who prefer to make simple things look complicated, I suggest taking another look at differential equations on Wikipedia and reading on from there.
  • The Volvo Ocean Race offers a free-to-play virtual regatta in which players compete with the real yachts taking part in the race. Virtual weather conditions are updated twice a day and match the real weather.
  • An even more professional 3D weather forecast service is available on meteoearth.com.



Copyright by Christian Marten, 2009-2014
Last change: 16.11.2014