Thursday, February 2, 2017

Creating Out-of-this-World Clouds with Krakatoa 

The opening sequence of Paramount’s “Star Trek Beyond” takes viewers on an awe-inspiring journey through a colorful interstellar cloud of dust and gas. Though scientists have captured incredible real-life visuals through space exploration, imagery of this nature is still largely accomplished through CG artistry. Alaa Al Nahlawi, head of VFX Arabia, recently shared a tutorial on how to create a CG nebula using Autodesk 3ds Max, PFlow, Sitni Sati’s FumeFX and Krakatoa.

Nahlawi’s main technique for creating the cloud body starts with generating a FumeFX cloud from a deformed mesh. To achieve a more defined look, he takes the FumeFX simulation with its density channel and loads it into Krakatoa using the PRT FumeFX loader, delivering a smokier effect.
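
To make the idea concrete, here is a minimal Python sketch of what a FumeFX-to-particles step conceptually does: sample a voxel density grid into render points that carry density as a per-particle channel. The grid below is a synthetic stand-in rather than data from an actual FumeFX cache, and none of this reflects Krakatoa’s actual API.

```python
# Conceptual sketch only: turn a voxel density field into render points that
# carry density as a per-particle channel, roughly what a FumeFX-to-Krakatoa
# particle loader does. The grid is a synthetic stand-in, not a real cache.
import numpy as np

rng = np.random.default_rng(42)

# Fake 64^3 density field: a soft spherical "puff".
res = 64
axis = np.linspace(-1.0, 1.0, res)
x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
density = np.clip(1.0 - np.sqrt(x**2 + y**2 + z**2), 0.0, None)

# Keep only voxels that actually contain smoke, seed one jittered point per
# voxel, and store its density so shading can use it later.
voxel = 2.0 / res
occupied = np.argwhere(density > 0.05)
positions = -1.0 + (occupied + rng.random(occupied.shape)) * voxel
per_particle_density = density[tuple(occupied.T)]

print(f"{len(positions):,} particles, density channel "
      f"range {per_particle_density.min():.2f}..{per_particle_density.max():.2f}")
```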

“Krakatoa renders millions or even billions of particles with lighting and shading so fast since particles don’t need a polygon face for rendering. I’ve added Krakatoa to my arsenal whenever particle rendering is needed; it enhances my existing production pipeline. I don’t know any plugin out there that can handle such a massive amount of particles in terms of rendering time and shading control,” said Nahlawi.


Depending on their origin and age, nebulas come in many shapes and colors, which means their appearance can vary greatly. Once an artist has determined the desired look of a nebula, they can move to 3ds Max and block out underlying geometry that FumeFX will use as an emission source. It’s important that the simulation contain a significant amount of detail and voxels for a realistic result. Since this particular shot is a camera fly-through, only one frame of the simulation is actually needed.

“The beauty of working with FumeFX in Krakatoa is that we can pump up the particle count with the Subdivision Region option, and also have the ability to utilize the density channel to drive color across an image sequence,” Nahlawi explained.
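
The “more particles from the same simulation” idea behind a subdivision option can be sketched in a few lines: split each voxel into sub-cells and seed a particle per occupied sub-cell, so the count scales with the cube of the subdivision factor. The toy grid and function below are illustrative assumptions, not Krakatoa’s Subdivision Region implementation.

```python
# Minimal sketch of subdividing voxels to multiply the particle count while
# the underlying simulation data stays unchanged. Illustrative only.
import numpy as np

rng = np.random.default_rng(7)

res = 32
axis = np.linspace(-1.0, 1.0, res)
x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
density = np.clip(1.0 - np.sqrt(x**2 + y**2 + z**2), 0.0, None)

def seed_particles(density, subdivisions=1):
    """Seed one jittered particle per (sub-)voxel whose density is non-zero."""
    res = density.shape[0]
    cell = 2.0 / (res * subdivisions)
    # Upsample the grid by simple repetition (a real loader would interpolate
    # the field instead of repeating voxel values).
    fine = (density.repeat(subdivisions, 0)
                   .repeat(subdivisions, 1)
                   .repeat(subdivisions, 2))
    occupied = np.argwhere(fine > 0.0)
    jitter = rng.random(occupied.shape)
    positions = -1.0 + (occupied + jitter) * cell
    return positions, fine[tuple(occupied.T)]

for s in (1, 2, 4):
    pts, dens = seed_particles(density, subdivisions=s)
    print(f"subdivision {s}: {len(pts):,} particles")
```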


Delivering the appropriate color gradient is also a key part of the process: the nebula should be bright and hot at the center and cold and dusty towards the edges. Once the simulation is in place, the artist can apply a Magma modifier, map the density channel, feed it into a texture coordinate and apply a Krakatoa material with a gradient map. In some cases, the artist might prefer to feed a large 3D noise map into the density channel to break up the continuous shape of the fume puff.
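
Conceptually, that Magma setup boils down to using density as the lookup coordinate into a gradient and, optionally, multiplying density by a noise field. The sketch below illustrates the idea with made-up gradient colors and a cheap hash-based noise stand-in; it is not the actual Magma flow from the tutorial.

```python
# Hedged sketch: per-particle density drives a colour gradient (cold and
# dusty at low density, hot and bright at high density), and a noise field
# breaks up the puff's continuous shape. All values here are illustrative.
import numpy as np

rng = np.random.default_rng(3)
density = rng.random(100_000)          # stand-in for the per-particle density channel
positions = rng.random((100_000, 3))   # stand-in particle positions in [0, 1)^3

# Two-stop gradient: cold dusty blue at density 0, hot orange-white at density 1.
cold = np.array([0.15, 0.20, 0.45])
hot = np.array([1.00, 0.75, 0.40])
t = np.clip(density, 0.0, 1.0)[:, None]
color = (1.0 - t) * cold + t * hot     # per-particle RGB driven by density

# Cheap value-noise stand-in for a large 3D noise map: hash the particle's
# coarse grid cell into a pseudo-random multiplier.
cell = np.floor(positions * 8).astype(np.int64)
h = (cell[:, 0] * 73856093) ^ (cell[:, 1] * 19349663) ^ (cell[:, 2] * 83492791)
noise = (h % 1024) / 1023.0
broken_density = density * (0.5 + 0.5 * noise)

print("mean density before/after noise:", density.mean(), broken_density.mean())
```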

“When I first set out to make the nebula, working with the Krakatoa Magma modifier was new to me so I dug out every learning material from the Thinkbox website and YouTube; I have a strong background dealing with nodes flow and CG math, which helped. Having worked with Cebas Thinking Particles, it took me only two days to master everything I needed; it’s an amazing feature,” said Nahlawi.  

For the stars pass, the artist uses the same FumeFX cloud but applies a different Magma flow. First, the whole cloud is filtered with a 3D noise map to select a star-like pattern, and everything outside that pattern is deleted. Color is then added and fed to the emission channel so the stars are self-illuminated.

For lighting, a light positioned at the top and center of the cloud provides the optimal look. The shadow map should be turned on, and Krakatoa will read its resolution; higher resolutions result in longer render times, but details will be more pronounced. Though rendering in Krakatoa is highly customizable, an artist may only want to adjust absorption, emission, final pass density and lighting density to keep things simple. Absorption should be set to blue so that light shifts towards warmer colors as it passes through the cloud, and density can be tweaked until the desired effect is realized. Once the emission channel is activated, the stars will glow.
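
The stars pass and the blue-absorption trick can both be illustrated with a small numerical sketch: threshold a per-particle noise value to keep a sparse, star-like subset, give it a warm emission color, and watch how a blue-biased extinction (a Beer-Lambert style falloff) leaves warmer light the deeper it travels. The thresholds, colors and absorption values below are invented for demonstration and are not taken from the tutorial.

```python
# Illustrative sketch only, not Krakatoa's shader.
import numpy as np

rng = np.random.default_rng(11)
n = 1_000_000
positions = rng.random((n, 3))

# Star selection: hash-based value noise per particle, keep the rare high values.
cell = np.floor(positions * 64).astype(np.int64)
h = (cell[:, 0] * 73856093) ^ (cell[:, 1] * 19349663) ^ (cell[:, 2] * 83492791)
noise = (h % 4096) / 4095.0
stars = noise > 0.995                       # only a tiny fraction survives
emission = np.zeros((n, 3))
emission[stars] = [1.0, 0.9, 0.7]           # warm self-illuminated glow

print(f"{stars.sum():,} of {n:,} particles kept as stars")

# Blue-biased absorption: blue is absorbed fastest, so the deeper light
# travels through the cloud, the warmer the transmitted colour becomes.
absorption = np.array([0.2, 0.35, 0.8])     # per-channel extinction (R, G, B)
for depth in (0.5, 2.0, 5.0):
    transmitted = np.exp(-absorption * depth)
    print(f"depth {depth}: transmitted RGB = {np.round(transmitted, 3)}")
```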

“Krakatoa has a cache functionality so when you render a cached animation, all the color, emission, absorption, lighting and shadow information can be calculated only once, stored to RAM and applied across the whole animation, which will save your life. In my case, it took around 20 min and 45GB of RAM to calculate everything I need for the first frame and the rest about two to four minutes, depending on how many particles are in the view to draw; we are talking about 1.1 billion particles,” said Nahlawi.
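
The payoff of that cache comes from the fact that, for a static cloud, per-particle lighting depends only on the particles and the lights, never on the camera, so it can be computed once and reused for every frame of the fly-through. Below is a toy illustration of that idea, not of Krakatoa’s actual cache.

```python
# Toy illustration: pay for the expensive, camera-independent lighting pass
# once, then re-use the cached result on every frame of a camera move.
import numpy as np

rng = np.random.default_rng(5)
positions = rng.random((500_000, 3))
density = rng.random(500_000)
light_pos = np.array([0.5, 2.0, 0.5])

def expensive_lighting(positions, density, light_pos):
    """Stand-in for the costly pass: a simple inverse-square falloff."""
    d2 = ((positions - light_pos) ** 2).sum(axis=1)
    return density / (1.0 + d2)

# Frame 1: pay the full cost once and keep the result in memory.
lit_cache = expensive_lighting(positions, density, light_pos)

# Remaining frames: only the (cheap, camera-dependent) draw changes.
for frame, camera_z in enumerate((1.0, 0.8, 0.6), start=1):
    visible = positions[:, 2] < camera_z        # crude stand-in for culling
    frame_brightness = lit_cache[visible].sum() # re-uses the cached lighting
    print(f"frame {frame}: {visible.sum():,} particles drawn, "
          f"total brightness {frame_brightness:.1f}")
```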

Once rendered, the image sequence is imported to The Foundry’s NUKE to add additional color, motion blur, retiming, grain and star glare.

Check out Nahlawi’s full tutorial here.