
Camera Mapping Particles Using Dataflow


Introduction

This tutorial demonstrates how to implement Camera Projection Mapping using only Box #3 Data Operators or, alternatively, Script Operators.

Many users have asked how to achieve the sticky camera mapping effect seen in one of the most prominent Krakatoa demos - the blue Mini Cooper from The Italian Job dissolving in a turbulent wind. The truth is that an in-house Frantic Films Camera Mapping shader was used to project the rendering from a 3rd party renderer (Brazil r/s 1.x) onto Particle Flow particles rendered in Krakatoa. This map has the capability to use a Mapping Channel as source data, so we baked the initial positions of the particles into a UVW channel and used those positions throughout the animation. As a result, each particle was shaded in the color of the original pixel it would have had if it hadn't been moved by a force. In other words, the original color stuck to the particle as it moved under the Particle Flow forces.

The following tutorial implements exactly this behavior, but without using a custom Camera Projection Map!

The Theory

Here is what we would like to do:

  • We will create a simple object (like a teapot) and render it using the Default Scanline Renderer to an EXR image.
  • We will create a simple plane and map the rendered image onto it - this will be our source image.
  • We will create a Particle Flow emitting particles from the same geometry object (teapot).
  • We will bake the 3D world space position of the particles on the first frame into a UVW Map Channel using a Data Operator which will be evaluated just once when the particles are born.
  • We will create a second Data Operator which will perform the following operations:
    • Get the Camera, read the Transformation Matrix and Invert it.
    • Read the position data from the UVW Map Channel used in the previous operator.
    • Multiply the position by the inverted Camera TM - this will give us the particle position in camera space.
    • We will calculate the screen coordinates of the particle based on the FOV angle of the camera.
    • We will feed this position into a Geometry sub-operator to find the point on the surface of the Plane we created corresponding to the screen position.
    • We will sample the color from the surface of the plane at that point.
    • We will feed this color into the Vertex Color Map channel. (The whole chain is condensed into a short sketch right after this list.)
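
Before building the operators, here is the whole chain condensed into a few lines of MAXScript-style pseudo-code. This is only a sketch for orientation - origPos, u and v are illustrative names rather than actual sub-operator outputs - but it is exactly the math the two Data Operators (and the Script Operator at the end of this tutorial) implement:

camTM  = inverse $Camera01.transform  -- world-to-camera transformation
camPos = origPos * camTM              -- original (frame 0) particle position in camera space
theTan = tan ($Camera01.fov / 2.0)    -- half the image plane width at depth 1.0
u = (1.0 + camPos.x / -camPos.z / theTan) / 2.0           -- normalized screen X in the 0.0 - 1.0 range
v = (1.0 + camPos.y / -camPos.z / theTan * 1.3333) / 2.0  -- screen Y, corrected for the image aspect
-- sample the rendered image at (u,v) and write the color into the Vertex Color channel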

The Implementation

The Scene

You can download the 3ds Max 9 Data Flow scene.
This file requires the Orbaz Box #3 plug-in to be installed.

  • Create a teapot in the scene.
  • Place some objects around it.
  • Assign a Standard material to the teapot.
    • Assign a Checker map to the Diffuse map channel, set its colors to blue and yellow.
    • Assign an instance of the same map to the Specular Level map channel.
    • Assign a Raytrace map to the Reflection slot. Set Map Amount to 50.0.
  • Render with the Default Scanline Renderer at 640x480. Save as OpenEXR to disk. The rendered image should look like this:

  • Create a default Particle Flow.
    • Set the Viewport Display % to 1 to speed up view redraws.
    • Add Position Object operator and pick the Teapot as the emitter surface.
    • Set Birth Start and End to frame 0.
    • Set Birth Amount to 2 million particles.
    • Remove the Speed, Rotation and Shape operators.
    • Add a Force operator and create a Wind with a bit of turbulence, as shown in the introductory Krakatoa Tutorials.

  • Create a Plane primitive at the world origin.
    • Set its Width to 2.0 and Height to 1.5 (because we rendered at 640x480, which has an image aspect of 1.3333 and a pixel aspect of 1.0 - see the quick check right after this list).
    • Assign a Standard Material to the Plane.
      • Assign a Bitmap map to the Diffuse map channel and pick the rendered OpenEXR image of the teapot.
      • Clone the same Bitmap to the Opacity map slot to define the transparency using the Alpha channel.
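
The Plane dimensions follow directly from the render aspect. Here is a quick sanity check you can evaluate in the MAXScript Listener, using this tutorial's 640x480 output (the variable names are just for illustration):

renderWidth  = 640.0
renderHeight = 480.0
imageAspect  = renderWidth / renderHeight  --> 1.33333
planeWidth   = 2.0                         -- fixed: matches the normalized -1.0 to 1.0 screen range
planeHeight  = planeWidth / imageAspect    --> 1.5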

Baking Particle Positions Into A Mapping Channel

  • Add a Data Operator to the Particle Flow.
    • Set its Update settings to Range from 0 to 0 - this will force it to update only on the first frame.
    • Create an Input Standard operator to get the Particle Position vector.
    • Create an Output Standard operator and set it up to write to Mapping Channel 2. (The scripted equivalent of this baking step is shown right after this list.)
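
For reference, the Script Operator version at the end of this tutorial performs the same bake-once-at-birth step with a single line, storing the birth transform in the scripted Matrix channel instead of a mapping channel:

-- on the frame a particle is born, remember its full transform;
-- .row4 of the stored matrix is the original world space position:
if pCont.particleAge == 0 do pCont.particleMatrix = pCont.particleTM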

Camera Projection Using Data Operator

  • Add a second Data Operator to the Particle Flow. Here is what the Data Operator implementing the camera mapping looks like:

  • First we want to transform the particle position into Camera Space.
    • The Pick Object sub-operator selects Camera01 from the scene.
    • Using an Object sub-operator, we get the Object TM of the Camera.
    • We pass the transformation matrix to a Function sub-operator with the second input disabled. The only possible function in this case is Inverse.
    • We pass the resulting inverse matrix into the second input of another Function sub-operator.
    • The first input of the same Function sub-op is fed the content of Mapping Channel 2, which was set to the initial particle positions on frame 0 by the first Data Operator. This way, on every frame throughout the animation, the function will always use the original positions of the particles and disregard their current positions, which are affected by the wind etc. (In scripted form, this whole sub-graph boils down to the two lines shown right after this list.)
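
In scripted form (see the full Script Operator later in this tutorial), this entire sub-graph boils down to two lines, where originalPos stands for the frame 0 position read from Mapping Channel 2:

theTM  = inverse $Camera01.transform  -- world-to-camera transformation
thePos = originalPos * theTM          -- original particle position in camera space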

  • At this point, we have the particle position transformed into camera space. This means that every particle has a distance to the camera plane along the -Z axis of the camera, and X and Y coordinates parallel to the image plane. We are interested only in these X and Y coordinates, but we cannot use them directly because each particle has a different Z depth, and we have to project them all onto the SAME image plane.

    • In the drawing above, we see a camera with a FOV of 45 degrees looking at a particle (shown as a white sphere).
    • The Black line represents the view ray from the camera to the particle.
    • The Blue line (partially occluded by the Green line) represents the Z depth in Camera space.
    • The Red line shows the X coordinate of the particle in Camera space. If the particle was in the other half of the camera's cone, it would have a negative X coordinate.
    • The horizontal thin blue line represents the image plane where all particles will be projected onto.
    • The Yellow line is the projection of the X coordinate in the image plane.
    • The Green line represents the distance from the camera to the image plane.
    • As you can see, the triangle defined by the Z coordinate, the X coordinate and the projection of the view ray in the XZ plane is SIMILAR to the triangle defined by the green line, the yellow line and the view ray from the camera to the projection of the particle onto the image plane. This means that both triangles have the same angles, and thus the proportions of their sides are the same: X/Z = x'/z'. We can assume the image plane to be at ANY distance from the camera, so let's assume it is at a distance of exactly 1.0 units. Thus, X/Z = x'/1 --> x' = X/Z. In other words, to calculate the projection of the X coordinate onto the image plane, we can simply divide it by the Z depth.
    • We also want to normalize the value of x', the projection of the coordinate onto the image plane, so it falls in the range from 0.0 (when X is 0) to 1.0 (when X reaches the edge of the camera's cone at the given depth Z). Of course, since the particle can be in the other half of the cone, its projection can actually be in the range from -1.0 to 1.0. In order to normalize x', we have to know the proportions of the triangle defined by z' and half the image plane - this triangle has an angle at the camera's point equal to half the FOV, in this case 45.0/2 = 22.5 degrees. Since TAN(22.5) is exactly the ratio of those two sides of the triangle, we know that when the distance to the image plane is 1.0, the maximum possible x' with the particle at the edge of the camera's cone is 0.414214. By dividing the result of the X/Z division by tan(FOV/2), we normalize the x' projection into the range from 0.0 to 1.0 on one side of the Z axis and -1.0 to 0.0 on the other.
    • This is why we set our Plane to 2.0 units Width - it represents the image plane, and the point we project onto this plane corresponds to the pixel we want to project back onto the particle! (A quick numeric check of this math follows below.)
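
Here is a quick numeric check of this math, evaluated in the MAXScript Listener with hypothetical values - a particle at camera-space X = 1.0, Z = -4.0, and a camera FOV of 45 degrees:

theTan = tan (45.0 / 2.0)  --> 0.414214, the half-width of the image plane at distance 1.0
xProj  = 1.0 / 4.0         --> 0.25, the X coordinate projected onto the image plane (x' = X/Z)
xNorm  = xProj / theTan    --> 0.603553, about 60% of the way from the Z axis to the cone's edge
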
  • Now let's implement the above into the Data Operator.
    • We use a Vector-->Real Converter sub-operator to grab only the X coordinate of the particle in camera space.
    • We use another Vector-->Real Converter sub-op to grab only the Z coordinate of the particle in camera space.
    • We pass the Z coordinate through a Function sub-op set to one Input only, using the ABS(R1) function which returns the absolute value without the negative sign.
    • We feed the X and the abs(Z) values into a Function sub-op where we divide X by abs(Z).
    • We REPEAT the exact same construction but using the Y coordinate instead of X.
    • At this point the two Function sub-ops contain the unnormalized X and Y coordinates of the particle in the image plane.

  • Next, we want to normalize the image plane coordinates and read the actual mapping colors.
    • We feed the X and Y coordinates calculated previously into a Real-->Vector Converter. We fill in the Z component using a Scalar with value of 0.
    • In order to scale the X and Y values to match the camera, we have to take into account the Field Of View angle of the camera because it determines what these X and Y really mean. We take the Camera FOV angle of 45 degrees and feed it into a Scalar sub-operator set to Angle.
    • We pass half the angle into a Function sub-op with one Input, set to the Tangent function (we pre-multiply the angle by 0.5 because we need the angle between the -Z axis of the camera and the edge of the camera cone). The result gives us the ratio of the two sides of the triangle defined by -Z and half the image plane width - we can now scale the X and Y values by it to get normalized values.
    • Using a Function sub-op set to Vector/Scalar, we scale the image coordinates of the particle using the result of the Tangent sub-op.
    • Now we can use a Pick Object sub-operator and pick the Plane.
    • A Geometry sub-operator looks for the Closest Point By Surface. We pass the Plane as the Object Input and the Image Plane position calculated above as the Vector 4 Input (V4) defining a particle position (in other words, we PRETEND that a particle has these coordinates and ask for a Pair containing the Face and coordinates of the closest point to that position).
    • We feed the resulting Pair into a Geometry sub-op set to sample the Point Color corresponding to that point on the same Plane object.
    • The resulting color can finally be passed into the Vertex Color Channel Standard Output sub-op, which will allow Krakatoa to render the particle color directly. (For comparison, the scripted version samples the bitmap directly, as shown right after this list.)
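
For comparison, the Script Operator version later in this tutorial skips the Plane geometry lookup entirely and samples the rendered bitmap directly at the computed image coordinates (theBmp and theIPos are the variables from that script):

pCont.particleVector = ((getPixels theBmp [theIPos.x*(theBmp.width-1), \
    (1.0-theIPos.y) * (theBmp.height-1)] 1)[1] as point3)/255.0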

Krakatoa Rendering

  • We can now render the particle system emitted from the original teapot with colors taken from the Default Scanline rendering projected onto the particles by the Data Operators.
  • We will use Volumetric Density mode with a Density Per Particle of 1.0 and Exponent of -2 (that is, 0.01), and no lighting (the lighting is already baked into the projected map!).
  • Since the particles will be moving, we can enable Motion Blur, Jittered Motion Blur and set the Samples to 8.
  • To integrate the particles into the original scene, you can select the Sphere, Cylinder and Box, expand the Matte Objects rollout and hit the Create/Update Matte Selection Set button to set them as mattes.
  • The output frames take about 50 seconds to render and look like these:

  • Finally, we use the Default Scanline Renderer with just the matte objects to render a single frame - the original Teapot should be hidden and the Particle Flow should be disabled.

  • We can now compose the particle rendering on top of the Scanline rendering of the matte objects.

Further Considerations

  • If the camera is moving, you should render the complete sequence with the Default Scanline Renderer or a 3rd party renderer of your choice, then load the sequence into the Map of the Plane so that a different frame is projected on each frame of the particle animation. This way, the colors will remain locked to the camera while still following the particle motion.
  • If the Camera FOV is animated, you can also instance the FOV controller from the camera to the Scalar sub-operator in TrackView to keep the FOV data updated automatically.
  • Remember to set the Plane object's Height to match the Image Aspect of the Render Output - the width should always be 2.0, the height should be 2.0/Image Aspect.
  • If you want the particles to move through the projection space, just switch the Standard Input in the second Data Operator from Mapping Channel 2 to Particle Position - the rest of the flow will remain the same. This will pass the actual particle position to the camera mapping instead of the one stored on the first frame. (The scripted analog of this switch is shown below.)
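
In the scripted version, the same switch is a one-line change - a sketch, reading the CURRENT particle transform instead of the matrix baked at birth:

-- use the current position instead of the baked one,
-- so the particles travel through the projection space:
local thePos = (pCont.particleTM.row4 * theTM)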

MAXScript Implementation

  • The same functionality can be implemented using a Script Operator, but it executes much slower - frame 0 with 2 million particles and no Motion Blur needs 22 seconds with the Data Operator implementation and 147 seconds with the scripted version (6.68 times slower).
on ChannelsUsed pCont do
(
    pCont.useTM = true      -- particle transform (current position in .row4)
    pCont.useVector = true  -- scripted Vector channel - will hold the final color
    pCont.useMatrix = true  -- scripted Matrix channel - will hold the birth transform
    pCont.useAge = true     -- particle age - used to detect newborn particles
)

on Init pCont do ( )

on Proceed pCont do
(
    local count = pCont.NumParticles()
    local theTM = inverse $Camera01.transform  -- world-to-camera transformation
    local theTan = tan ($Camera01.fov/2)       -- half image plane width at depth 1.0
    local theBmp = $Plane01.material.diffusemap.bitmap
    for i in 1 to count do
    (
        pCont.particleIndex = i
        -- on the birth frame, bake the particle transform into the Matrix channel
        if pCont.particleAge == 0 do pCont.particleMatrix = pCont.particleTM
        -- transform the ORIGINAL position (row4 of the baked matrix) into camera space
        local thePos = (pCont.particleMatrix.row4 * theTM)
        -- project onto the image plane and normalize into the 0.0 - 1.0 range
        local theIPos = [1.0 + thePos.x / -thePos.z / theTan , \
                         1.0 + thePos.y / -thePos.z / theTan * 1.3333 , 0] / 2
        -- sample the corresponding pixel and store the 0-1 color in the Vector channel
        pCont.particleVector = ((getPixels theBmp [theIPos.x*(theBmp.width-1), \
                                (1.0-theIPos.y) * (theBmp.height-1)] 1)[1] as point3)/255.0
    )
    theBmp = undefined -- clear the variable for garbage collection
)

on Release pCont do ( )
  • Here is what the script is doing:
    • The channelsUsed handler is called by Particle Flow to determine which channels will be needed by the operator.
      • We will need the particle TM to get the position.
      • The Particle Vector channel to store the color, which Krakatoa will interpret as Vertex Color.
      • The Particle Matrix channel where we can store custom matrix data.
      • The Particle Age channel which will tell us when a particle was born.
    • The Init handler is not used.
    • The Proceed handler will be called by Particle Flow on each integration step.
      • First we get the number of particles in the current event's Particle Container.
      • We prepare the inverse of the Camera's Transformation Matrix.
      • We also prepare the TAN of half the FOV angle - no need to calculate these things inside the loop!
      • Finally, we will need the bitmap of the texture map assigned to the Diffuse Map channel of the Plane01.
    • Then we will loop from 1 to the number of particles, and inside the loop we will:
      • Set the current particle index to the loop variable.
      • Check if the particle was just born and if it was, write its Matrix into the Scripted Matrix channel. We have to do this because the Scripted Vector channel will be needed to store the final color, so we need a place to store the initial position of all particles, and this is the next best place...
      • Next we calculate the particle's position in camera space by multiplying the particle's position taken from the .ROW4 of the matrix with the inverse of the camera matrix.
      • Now we can calculate the projection of this camera position onto the image plane by dividing the X and Y by the Z, dividing both by the TAN of half the FOV, scaling the Y by the Image Aspect of 1.3333, shifting both X and Y by 1.0 so the range changes from -1.0 - 1.0 to 0.0 - 2.0 and finally dividing by 2 to scale the range to 0.0 - 1.0. Note that the Z axis is always negative when the particle is in front of the camera, so instead of using ABS(Z) we can use -Z to get a positive value.
      • Now we can read the pixel from the bitmap corresponding to the image coordinates we calculated. We use the getPixels() method and pass the X and Y multiplied by the pixel width and height where the Y is also flipped upside down (0.0 is in the bottom left corner of the camera but in the top left corner of the image). We read just one pixel, get it out of the resulting array, convert to a Point3, divide by 255 to scale the color into the range from 0.0 - 1.0 and finally assign the result to the Scripted Vector Channel.
    • At the end, we set the bitmap to undefined to clear the variable for garbage collection.
  • In addition to this Script Operator, we also have to add a Krakatoa Options operator and enable MXSVector-->Vertex Color to let Krakatoa "see" the data we generate inside the script.
  • The resulting image is identical to the one produced by the Data Operator.

 

MAXScript Implementation and Animated Bitmap Sequences

  • The method described above works great for static images, but does not work with animations (IFL, MOV or AVI) because the .bitmap value of the BitmapTexture map is not updated correctly as the evaluation time changes.
  • Below is a variation of the Script Operator code which supports animated bitmaps:
on ChannelsUsed pCont do
(
    pCont.useTM = true
    pCont.useVector = true
    pCont.useMatrix = true
    pCont.useAge = true
)

on Init pCont do ( )

on Proceed pCont do
(
    local theTime = pCont.getTimeEnd() --get the current time in PFlow
    at time theTime --set the time context so the camera transform is taken from the right frame
    (
        local count = pCont.NumParticles()
        local theTM = inverse $Camera01.transform
        local theTan = tan ($Camera01.fov/2)
        local theBmp = openBitmap $Plane01.material.diffusemap.filename --open bitmap by filename
        theBmp.frame = theTime.frame as integer --set the frame of the bitmap to the current frame
        --you can add or subtract an integer from the above time value to get an offset
        --if the first frame does not play on frame 0

        for i in 1 to count do
        (
            pCont.particleIndex = i
            if pCont.particleAge == 0 do pCont.particleMatrix = pCont.particleTM
            local thePos = (pCont.particleMatrix.row4 * theTM)
            local theIPos = [1.0 + thePos.x / -thePos.z / theTan , \
                             1.0 + thePos.y / -thePos.z / theTan * 1.3333 , 0] / 2
            pCont.particleVector = ((getPixels theBmp [theIPos.x*(theBmp.width-1), \
                                    (1.0-theIPos.y) * (theBmp.height-1)] 1)[1] as point3)/255.0
        )--end i loop
        theBmp = undefined
    )--end time context
)

on Release pCont do ( )