Particle Data and State¶
Particles are discrete points in space with a number of associated attributes (data layers). A few attributes are found in all particle systems, like:
- Index (a uniquely identifying number for each particle)
Other attributes may or may not exist for a given particle set, depending on its uses and requirements, such as:
- Orientation and Scale, e.g. for instancing
- Velocity and Angular Velocity for physical particle simulation
- Object Reference: an identifier for instancing or rigid body shapes (actual pointers are stored separately in a lookup table)
- Color can be ascribed to each particle for rendering purposes (the Cosmos Laundromat tornado uses colored particles)
- Other physical properties may be used by specialized simulations: Size, Heat, Charge, ...
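As a rough sketch (names are illustrative, not Blender API), such a set of attributes can be modeled as a structure-of-arrays, with one flat data layer per particle attribute:

```python
# Hypothetical sketch of a particle state as a structure-of-arrays:
# each attribute (data layer) is a flat list indexed by particle.
def make_particle_state(count):
    return {
        "index": list(range(count)),              # unique identifier per particle
        "position": [(0.0, 0.0, 0.0)] * count,    # found in all particle systems
        "velocity": [(0.0, 0.0, 0.0)] * count,    # optional: physical simulation
        "color": [(1.0, 1.0, 1.0, 1.0)] * count,  # optional: rendering (RGBA)
    }

state = make_particle_state(4)
print(state["index"])  # → [0, 1, 2, 3]
```

Optional layers would simply be absent from the dictionary when a particle set does not need them.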
Most particle attributes can change over time. Any concrete set of particles can therefore only be a snapshot for a particular frame. This is called a particle “state”.
The scene in Blender stores a “current state” of particles, which relates to the “current frame” of that scene in the viewport. But other states may be stored or used temporarily:
- Renderers can request a scene state for any arbitrary frame. For motion blur this also includes subframes.
- Simulating over a frame range generates a sequence of states. In current Blender sims the scene frame is actively changed with each iteration, but particles could be calculated on their own with a local state without touching the scene state.
Caches are commonly used to store each frame’s state during simulation. The scene can then use the cache to look up the “current” state efficiently.
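A minimal sketch of such a frame cache, assuming states are plain attribute dictionaries (class and method names are hypothetical):

```python
# Hypothetical per-frame state cache: each stored frame gets its own snapshot,
# so later simulation steps cannot mutate already-cached states.
class ParticleCache:
    def __init__(self):
        self._frames = {}

    def store(self, frame, state):
        # copy each data layer so the snapshot is independent of the live state
        self._frames[frame] = {name: list(layer) for name, layer in state.items()}

    def lookup(self, frame):
        # returns None for frames that were never simulated
        return self._frames.get(frame)

cache = ParticleCache()
cache.store(1, {"position": [(0.0, 0.0, 0.0)]})
cache.store(2, {"position": [(0.0, 0.0, -0.1)]})
print(cache.lookup(2)["position"])  # the "current" state for frame 2
```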
Extra Topology Elements¶
topology (mesh as particles), edge data (SPH)
Data Context and Node Levels¶
The node workflows in this chapter deal with a single particle component only. The output node’s meaning is then unambiguous, because the particle component forms the context for all data access.
The workflow on the object level must somehow be separated from the particle level, lest it become too complicated to assign data flow to a specific component. This could be achieved by placing particle operations in a sub-tree of the nodes, such as a node group.
Workflow Examples: Emitting Particles¶
Generate Plain Particles¶
Initializing new particles¶
Combining and Splitting Particle Sets¶
Using the “Particles” input multiple times creates some ambiguity: there are then multiple sets of particles with the same indices (i.e. the “same particles”). Modifications to the particle state then depend on the order in which these are plugged into the output node. It would be nice to solve this, but it could also work acceptably this way if users are aware of it.
“Filtering” could be a general mechanism, whereby nodes first split particles, modify one of the branches, and then rejoin the two branches.
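The split/modify/rejoin idea could look like the following sketch, relying on the unique particle index to restore a single consistent set after rejoining (all names are illustrative):

```python
# Hypothetical filtering mechanism: split by predicate, modify one branch,
# rejoin. Unique indices make the rejoin order well-defined.
def filter_modify(particles, predicate, modify):
    branch_a = [modify(dict(p)) for p in particles if predicate(p)]
    branch_b = [p for p in particles if not predicate(p)]
    # rejoin: indices are unique, so sorting restores the original order
    return sorted(branch_a + branch_b, key=lambda p: p["index"])

pts = [{"index": i, "z": float(i)} for i in range(4)]
above = lambda p: p["z"] > 1.0          # split criterion
drop = lambda p: {**p, "z": 0.0}        # modification for the selected branch
print([p["z"] for p in filter_modify(pts, above, drop)])  # → [0.0, 1.0, 0.0, 0.0]
```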
Distributing particles on a mesh surface¶
Distributing particles in a volume¶
Samples inside a volume don’t come with weights like surface samples do. Tracking positions in a volume is more ambiguous than tracking a mesh surface and requires support from a physics solver system. See Workflow Examples: Simulating Particles for examples.
Tracking mesh surfaces is easy because the surface is defined by the vertices. Every point on the surface is a linear combination of vertex position vectors (or other attributes), so all we need to do to reconstruct a point on a deformed surface is to store the weights per vertex.
Most volumes don’t have a linearization equivalent to mesh surfaces, so there is no direct mapping to a “deformed” volume. Volumetric simulations use integrators to advect particles through a gradient field iteratively.
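The surface-tracking scheme above can be sketched as follows: the per-vertex weights are stored once at distribution time, and re-applying them to the deformed vertex positions reconstructs the point on the new surface (a minimal barycentric example):

```python
# Sketch of surface tracking: a sample point is a fixed linear combination of
# its triangle's vertex positions, stored as barycentric weights.
def sample_point(verts, tri, weights):
    (a, b, c), (wa, wb, wc) = tri, weights
    return tuple(wa * verts[a][i] + wb * verts[b][i] + wc * verts[c][i]
                 for i in range(3))

tri, weights = (0, 1, 2), (0.5, 0.25, 0.25)   # stored once at distribution time
rest = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
deformed = [(0, 0, 1), (1, 0, 1), (0, 1, 1)]  # surface moved up by one unit
print(sample_point(rest, tri, weights))       # → (0.25, 0.25, 0.0)
print(sample_point(deformed, tri, weights))   # same point, carried with the surface
```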
Workflow Examples: Rendering Particles¶
Particles are an incredibly flexible tool for controlling renderable entities in a scene. Particles themselves are not directly renderable due to their point-like nature; they serve as the basis for other effects that produce renderable geometry.
Rendering for particles usually involves generating external data that is not part of the particle component itself (meshes, duplis, volumes). These node workflows are therefore situated in the higher-level object nodes rather than in the particle nodes themselves (see also Data Context and Node Levels).
Billboards consist of simple quad faces generated for each particle. They typically face the camera, which provides a cheap way to render uniform “blobs” of matter. The most efficient implementation of billboards is probably through mesh faces.
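A minimal sketch of billboard quad generation, assuming the camera’s right and up vectors are already known and normalized:

```python
# Sketch: build one camera-facing quad per particle from the camera's
# right/up vectors (assumed normalized). Names are illustrative.
def billboard_quads(positions, cam_right, cam_up, size=1.0):
    quads = []
    h = size * 0.5
    for px, py, pz in positions:
        corners = []
        for sr, su in ((-1, -1), (1, -1), (1, 1), (-1, 1)):
            corners.append((
                px + h * (sr * cam_right[0] + su * cam_up[0]),
                py + h * (sr * cam_right[1] + su * cam_up[1]),
                pz + h * (sr * cam_right[2] + su * cam_up[2]),
            ))
        quads.append(corners)
    return quads

quads = billboard_quads([(0, 0, 0)], cam_right=(1, 0, 0), cam_up=(0, 0, 1))
print(quads[0][0])  # → (-0.5, 0.0, -0.5)
```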
Fluid Surface Generation¶
A more sophisticated method of creating a mesh out of particle data, especially for simulated liquids. Each particle is surrounded by a falloff function; the sum of all particle functions defines an implicit surface. Level set methods can be used to discretize this surface. Thin sheets of fluid can be handled with methods such as [MUS14].
- “Particle Surface” takes a particle input and outputs a mesh.
- “Point Density” node outputs a special volumetric component, which is renderable.
- Different point density features such as color and falloff may be defined through inputs.
Integration into the compositing workflow is unclear.
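The implicit-surface idea from the fluid surface section can be illustrated with a summed falloff field; the specific polynomial falloff used here is an assumption, not a method prescribed by the text:

```python
# Sketch: each particle contributes a radial falloff, and the summed field
# compared against a threshold defines the implicit (level set) surface.
def density(point, particles, radius=1.0):
    total = 0.0
    r2 = radius * radius
    for p in particles:
        d2 = sum((a - b) ** 2 for a, b in zip(point, p))
        if d2 < r2:
            t = 1.0 - d2 / r2
            total += t * t * t          # assumed smooth polynomial falloff
    return total

pts = [(0.0, 0.0, 0.0), (0.8, 0.0, 0.0)]
inside = density((0.4, 0.0, 0.0), pts) > 0.5   # threshold defines the surface
print(inside)  # → True
```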
Workflow Examples: Simulating Particles¶
Simulating particles as point masses is a comparatively cheap way of producing physical motion. Collision with other objects is limited to the particle center, and self-collision cannot be handled efficiently. This limits the usefulness of point mass simulation, but it can still serve a purpose in motion graphics, and is included here for its simplicity.
- “Simulate Points” node changes only the particle position (no rotational dynamics).
- Collision in this case is one-way only: Particles can collide with meshes in the scene, but will not have any effect in turn on other objects. For two-way interaction between objects a fully fledged rigid body simulation must be used.
- “Define Rigid Body” registers a rigid body object with the Bullet physics engine. After the physics step the rigid body’s location and rotation are then copied back to the particle.
- The collision shape for rigid bodies can be either a mesh or an implicit primitive shape. Primitive shapes are useful for massive simulations with thousands of colliding objects where full mesh collision would be too costly. Here we use an external object reference to define the shape.
- For rendering an instancing node is very suitable in this case. It can use the same object as the collision shape (or a more detailed version) to make physics and visuals match.
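The one-way point-mass behavior described above can be sketched with symplectic Euler integration and a single ground plane standing in for scene collision geometry (all names are illustrative):

```python
# Sketch of one-way point-mass simulation: particles collide with a ground
# plane at z = 0, but have no effect back on the scene.
def step(positions, velocities, dt, gravity=(0.0, 0.0, -9.81), restitution=0.5):
    for i, ((px, py, pz), (vx, vy, vz)) in enumerate(zip(positions, velocities)):
        vx, vy, vz = (vx + gravity[0] * dt,
                      vy + gravity[1] * dt,
                      vz + gravity[2] * dt)       # integrate velocity first
        px, py, pz = px + vx * dt, py + vy * dt, pz + vz * dt
        if pz < 0.0:                              # collision at the particle center
            pz, vz = 0.0, -vz * restitution       # reflect with energy loss
        positions[i], velocities[i] = (px, py, pz), (vx, vy, vz)

pos, vel = [(0.0, 0.0, 1.0)], [(0.0, 0.0, 0.0)]
for _ in range(100):
    step(pos, vel, dt=0.01)
print(pos[0][2] >= 0.0)  # → True (particle never penetrates the ground)
```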
Fluid Simulation with Particles¶
Modern fluid VFX in movie productions and the like is almost exclusively of a “Lagrangian” type, meaning that the rendering is based on particle/mesh data rather than directly on density grids. Particles are used as “markers” which are carried along (advected) with the fluid and thus “track” the fluid surface. This approach has the advantage of being very efficient, as well as allowing much more visual detail than would be possible with grids alone. Grids are still an indispensable part of the simulation, but they are used in conjunction with particles to get the best of both worlds.
Smoothed Particle Hydrodynamics (SPH) is not very useful for simulation purposes in CG. The computational cost is far too great compared to modern Lagrangian methods such as FLIP. In its current implementation in Blender it also tends to become unstable quickly. It should therefore be considered of only theoretical interest.
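The marker-particle advection described above can be sketched with forward Euler through an assumed analytic velocity field (production FLIP solvers sample a velocity grid and use higher-order integrators, simplified away here):

```python
# Sketch: markers are carried along (advected) with the fluid velocity field.
def advect(particles, velocity_at, dt):
    # forward Euler: move each marker along the local velocity for one step
    return [tuple(p[i] + dt * velocity_at(p)[i] for i in range(3))
            for p in particles]

swirl = lambda p: (-p[1], p[0], 0.0)   # assumed velocity field: rotation about z
markers = [(1.0, 0.0, 0.0)]
for _ in range(10):
    markers = advect(markers, swirl, dt=0.1)
print(markers[0])  # marker has been carried around the z axis
```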
Workflow Examples: Events¶
Limiting Particle Lifetime¶
Deleting Particles on Collision¶
This section could cover cases of editing a single particle state, as well as potential cache-editing features.
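As a sketch of the lifetime event above, particles whose age exceeds their lifetime are simply filtered out of the state (field names are hypothetical):

```python
# Sketch: remove particles whose age (in frames) exceeds their lifetime.
def apply_lifetime(particles, frame):
    return [p for p in particles if frame - p["birth_frame"] < p["lifetime"]]

pts = [{"index": 0, "birth_frame": 0, "lifetime": 50},
       {"index": 1, "birth_frame": 30, "lifetime": 50}]
alive = apply_lifetime(pts, frame=60)
print([p["index"] for p in alive])  # → [1]
```

Deleting on collision would work the same way, with the predicate driven by a collision test instead of particle age.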