Recently I’ve been collaborating with N.E.S.T. Immersion to create an exciting tool for VJ-ing. And it’s launched as of today!
NestDrop is a tool for performing with high-resolution, high-FPS generative visuals that react in real time to the music and are broadcast as video via Spout. Since the Milkdrop engine is at its core, you can easily bring in your own presets. Use any audio source to drive the visuals, even live audio.
Presets Collection – Cream of the Crop
I’ve spent a ton of time curating a best-of collection of presets for VJs to perform with. Over the years the Milkdrop community has released about 52,000 presets, so I selected only the best ones and ended up with 9,795 presets. I also organized the presets into categories and subcategories.
– Collection of 9,795 Milkdrop presets
– Includes preview images of each preset
– Readme with install instructions
All too often I need to precisely convert between timecode and frames. Or I’ll want a rough time estimate of a render job on the farm. I needed a utility that allows intuitive interaction and GUI creation, and that is instantly editable. So I’m using a simple open-source solution: Pure Data.
Pure Data (aka Pd) is a real-time graphical programming environment which is mainly used in live music performances. But its real-time nature makes it ideal for my needs. So with some careful planning I wired up its math nodes to create tools that automate the conversion process. Basically I’ve created a calculator tailored for animators and filmmakers.
collection of Pd patches
— Requires Pure Data to be installed. (supports: Windows, Mac, Linux)
— It’s easy to use. Just click and drag on the number boxes to input your timecode or frames.
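For reference, the conversion math the patches perform is simple. Here is a minimal sketch of the same logic in Python (the function names are my own illustration, not part of the Pd patches), assuming a non-drop-frame rate:

```python
def timecode_to_frames(hh, mm, ss, ff, fps=24):
    """Convert HH:MM:SS:FF timecode to an absolute frame count."""
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=24):
    """Convert an absolute frame count back to HH:MM:SS:FF."""
    ff = frames % fps                # leftover frames within the second
    total_seconds = frames // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return hh, mm, ss, ff
```

For example, 00:01:30:12 at 24 fps is (90 × 24) + 12 = 2,172 frames, and converting 2,172 frames back yields 00:01:30:12 again. Drop-frame rates like 29.97 need extra bookkeeping that this sketch deliberately skips.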
Just a few days ago, I found out about an MIT competition called The Art of Astrophysics. Naturally my interest was piqued. The only problem was the deadline… 24 hours!
As a back-burner project I’ve been experimenting with creating Jupiter cloud bands that are truly fluidic. It’s very difficult to keep the multiple bands separate, so I’ve been testing the interaction between just two bands. The Kelvin–Helmholtz instability is a fascinating topic to focus on, but it’s computationally heavy to simulate and difficult to predict.
(This post is an update to Background Stars v1. It will provide some context.)
A while back I shared a ‘star globe’ which has the night sky mapped onto a sphere. This can be used to completely surround a Maya scene with stars. Andrew Hazelden and I often collaborate on various fulldome projects, and he had an idea for re-engineering the star globe to require only one surface and one file texture. This allows for a vast improvement in render time. For instance:
— 4-poly & 4 texture star globe – 1m 40s
— 1-poly & 1 texture star globe – 30s
A bunch of other improvements are included:
— Fixes the grey blurry line glitch since it uses a Mental Ray texture network.
— A 2K texture is used for previewing in the Maya viewport, then an 8K texture is used for renders.
— Other lights in the scene will not affect the star globe.
— Star globe never casts shadows.
— Star globe will automatically show up in reflections & refractions.
— Renders faster, since the single texture needs much less initialization and the poly count is reduced.
— Here is a detailed explanation of how these things are achieved.
Maya scene (using Mental Ray) – star globe
(also available in the Domemaster3D toolset)
Ambient occlusion is a helpful effect for easily adding some realism. But the default render-layer occlusion setup in Maya (using Mental Ray) is somewhat render intensive. And when you’re doing lots of render tests, you just want to see the result quickly.
So we can speed up the render process by instead using Final Gather to compute the occlusion and then using the mib_fg_occlusion utility shader to make it grayscale. Then, by adjusting the Final Gather render settings, we can control the render times. This is really helpful when you’re doing a test render where render quality is not paramount; later you can increase the settings for the final render.
To the right is a tutorial on how to set up the mib_fg_occlusion node within a surface shader. You must use Mental Ray, since that is what Final Gather requires.
You can also learn more in depth about the mib_fg_occlusion utility shader and how to get a texture with transparency to be recognized.
One of the most difficult types of space imagery to create is a volumetric nebula. There are three main styles of nebulae to imitate: diffuse nebulae, planetary nebulae, and supernova remnants. The fluid framework within Maya is extremely flexible but it can be very tricky to just get a fluid/emitter set up with settings that are repeatable.
So to ease the cumbersome setup, below is a Maya scene for the interactive creation of a fluid nebula. While experimenting with the attributes it’s important to have a real nebula reference image in mind; otherwise you’ll just continue tweaking attributes without any real measure of when it’s done. With fluids it’s quite easy to run down endless rabbit holes… there are just so many attributes that are interlocked. But it’s not impossible to tame fluids; you just need to have a goal in mind.
The nebula maker template is initially set up to have the look of a supernova shell because it’s actually more useful to experiment with. But I’ll share how to adjust the opacity ramp to create a diffuse nebula.
Creating the sense of a huge expansive space is a challenge in 3D animation. But if we use mirrors and some trickery then we can create a vast, beautiful illusion. The trick is to use structural beams to hide the mirrors’ edges. This creates a seemingly unending structural lattice. An infinite jungle gym!
Maya scenes (using Mental Ray) – cube, sphere, dodecahedra (flat & curved)
Have you ever wanted to fly up close to a sun? To see those dynamic boiling details, shimmering corona, and mesmerizing surface texture patterns… But creating an animated volumetric sun with all these attributes is quite difficult.
It turns out that you can constrain a Maya fluid into a sphere. But the fluid system is tricky to work with, since there are so many different attributes to explore and their relationships are intertwined. So it’s important to already have in mind the type of star you want to create. Otherwise you’ll end up tweaking details endlessly without any basis for when it’s finished.
In some cases, you can’t use a fisheye camera to render a domemaster. So the solution is to create a hemicube camera rig.
But what is a hemicube camera rig? By setting up 5 different cameras in a specific arrangement you can seamlessly capture 180° of footage. Then with some stitching software you can later merge the hemicube renders into a domemaster.
The tutorial to the right was created for Maya users, but the concepts could easily be replicated in any CGI toolset.
After you have completed the tutorial and rendered out footage from the hemicube cameras, you’re ready to stitch them together.
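To make the five-camera arrangement concrete, here is a hedged sketch of the orientations as pan/tilt angles, with a helper that converts them to view vectors. The names and axis conventions are my own illustration; the exact rotations depend on your toolset’s axes:

```python
import math

# Five 90°-FOV cameras: four around the horizon plus one at the zenith.
# (pan, tilt) in degrees; these names and axes are illustrative assumptions.
HEMICUBE_CAMERAS = {
    "front": (0.0, 0.0),
    "left":  (90.0, 0.0),
    "right": (-90.0, 0.0),
    "back":  (180.0, 0.0),
    "top":   (0.0, 90.0),   # aimed straight up at the dome zenith
}

def view_vector(pan_deg, tilt_deg):
    """Unit view direction for a camera (y-up, front along +z)."""
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    return (math.cos(tilt) * math.sin(pan),
            math.sin(tilt),
            math.cos(tilt) * math.cos(pan))
```

Because each camera covers a 90° square, the four horizon cameras tile the band around the dome’s springline and the top camera caps the zenith; the lower half of each side render simply goes unused when stitching a 180° dome.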
hemicube camera rig for Maya
— Check out my custom Maya camera for fulldome production which has the hemicube rig included.
— Here is a hemicube camera for Blender
Source of hemicube render – Source of idea for explanation
Stitching hemicube renders into fisheye
So you’ve rendered a scene in a 3D animation package using a hemicube camera rig. Now you’re left with 2K footage from each of the 5 cameras: top, left, right, front, and back. But now what? Well, you need to stitch the hemicube renders into a domemaster (fisheye).
But let’s be clear from the start. Typically when someone says stitching, they mean there is some automatic algorithmic analysis of how best to combine the footage. But in this instance we simply mean that we are deliberately placing footage on the dome and compensating for the dome distortion. Just like the image above.
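To illustrate what that deliberate placement boils down to, here is a minimal Python sketch of the underlying math: for each pixel of the fisheye output, compute the direction it looks in and pick which hemicube render it should sample. This assumes an angular (equidistant) fisheye with the zenith at the image center; the function name is my own, not any particular stitching tool’s API:

```python
import math

def fisheye_to_face(u, v):
    """Map normalized domemaster coords (u, v in [-1, 1], center = zenith)
    to the hemicube face that pixel samples, or None outside the circle."""
    r = math.hypot(u, v)
    if r > 1.0:
        return None  # outside the fisheye circle (black border)
    # Angular fisheye: radius is proportional to the angle from the zenith.
    theta = r * math.pi / 2          # 0 at zenith, 90° at the horizon
    phi = math.atan2(v, u)           # azimuth around the dome
    # Direction vector (y-up).
    x = math.sin(theta) * math.cos(phi)
    z = math.sin(theta) * math.sin(phi)
    y = math.cos(theta)
    # The dominant axis of the direction picks the cube face.
    ax, ay, az = abs(x), abs(y), abs(z)
    if ay >= ax and ay >= az:
        return "top"
    if ax >= az:
        return "right" if x > 0 else "left"
    return "front" if z > 0 else "back"
```

A real stitcher would go one step further and compute the exact texture coordinate within the chosen render, but the face selection above is the heart of the dome-distortion compensation.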