Utilities: Converting Timecode & Frames – Estimate Job Render Time on a Farm

All too often I need to precisely convert between timecode and frames. Or I want a rough time estimate for a render job on the farm. I needed a utility that allows for intuitive interactivity, easy GUI creation, and instant editing. So I'm using a simple open-source solution: Pure Data.

Pure Data (aka Pd) is a realtime graphical programming environment which is mainly used in live music performances. But its realtime aspect makes it ideal for my needs. So with some careful planning I applied the math nodes to create tools which automate the conversion process. Basically I’ve created a calculator tailored for animators and filmmakers.

collection of Pd patches

— Requires Pure Data to be installed. (supports: Windows, Mac, Linux)
— It’s easy to use. Just click and drag on the number boxes to input your timecode or frames.

Convert Timecode into Frames /// Convert Frames into Timecode
— Assumes you’re working at 30fps.
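The 30fps math behind these conversion patches can be sketched in a few lines of Python. This is an illustrative sketch, not the Pd patch itself: function names are my own, and non-drop-frame "HH:MM:SS:FF" timecode is assumed.

```python
FPS = 30  # the patches assume 30fps

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert 'HH:MM:SS:FF' to a total frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int = FPS) -> str:
    """Convert a total frame count back to 'HH:MM:SS:FF'."""
    ff = frames % fps
    seconds = frames // fps
    return f"{seconds // 3600:02d}:{(seconds // 60) % 60:02d}:{seconds % 60:02d}:{ff:02d}"

print(timecode_to_frames("00:01:30:15"))  # → 2715
print(frames_to_timecode(2715))           # → 00:01:30:15
```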

Calculate Shot Duration: Input as Timecode or Frames
— Especially useful when you're adjusting timings between two versions of an animatic and need to figure out the exact amount of time added or removed.
— Assumes you’re working at 30fps.
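Under the hood, the shot-duration calculation is just a difference of frame counts. A minimal Python sketch (illustrative names, hypothetical timecodes, 30fps assumed):

```python
FPS = 30

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """'HH:MM:SS:FF' to a total frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Timing change between two versions of an animatic cut point:
old_cut = tc_to_frames("00:02:10:00")
new_cut = tc_to_frames("00:02:14:12")
print(new_cut - old_cut)  # → 132 (frames added)
```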


Estimate Job Render Time on a Farm
— Most useful for estimating the min/max time until the render is complete, which is especially important if you're worried about bottlenecking your farm and need to prioritize for deadlines.
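The estimate boils down to simple arithmetic: frame count times per-frame render time, spread across the machines working the job. A hedged sketch of that math (the actual patch may weigh things differently; all numbers below are made up):

```python
def estimate_render_time(num_frames, min_sec_per_frame, max_sec_per_frame, num_machines):
    """Return (min_hours, max_hours) for the whole job."""
    min_hours = num_frames * min_sec_per_frame / num_machines / 3600
    max_hours = num_frames * max_sec_per_frame / num_machines / 3600
    return min_hours, max_hours

# Example: 4500 frames, 2-6 minutes per frame, 25 render nodes
lo, hi = estimate_render_time(4500, 120, 360, 25)
print(f"{lo:.1f} to {hi:.1f} hours")  # → 6.0 to 18.0 hours
```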

How Does it Work? Let's See the Breakdown
— All the source code is included (within subpatches) and can be edited. So if you’re instead working at 60fps, then you can alter it to your needs.

Jupiter Bands Simulation

Just a few days ago, I found out about an MIT competition called The Art of Astrophysics. Naturally my interest was piqued. The only problem was the deadline… 24 hours!

As a back-burner project I've been experimenting with creating Jupiter cloud bands that are truly fluidic. It's very difficult to keep the multiple bands separate, so I've been testing the interaction between just two bands. The Kelvin–Helmholtz instability is a fascinating topic to focus on, but it's heavy to simulate and difficult to predict.

Often when I begin these types of experiments, I'm on a grand impossible mission. Those make for the most interesting challenges, right? And this time I was on a mission to create Jupiter, or a hot Jupiter, as a fluid dynamic system. In the past I've created a hot Jupiter animated texture in After Effects, using the turbulence displacement and twist effects. But I wanted a system that was self-reliant. And so, as always, I got stuck on the quest for that holy grail and needed a jolt to remind me that often the steps along the way are quite interesting.

So I selected one particularly beautiful simulation in Maya, cached/rendered it, and then spent the rest of the evening tweaking in After Effects. It was fun trying to burn out and degrade the footage to make it appear as if it were raw data from a satellite or space telescope. A bit far-fetched, I know, to see an extreme zoom of a hot Jupiter, but hey, I had a vision and had to run with it. So in homage to that fact, I appropriately named it ‘The (Fictitious) Formation of a Hot Jupiter Cloud Band’.

You might be wondering why I cropped the video in this strange boxed style with burnt edges and film grain… Well, I was trying to mimic the style of raw satellite data. I actually used this Jupiter image as the source for the border and overall inspiration. I'm not sure if it's simply a stitch of several images, the sensor blocking out specific light, or such. But I figured it would be a realistic addition instead of the all-too-perfect Maya render.


I'm not going to explain all the details of this Maya simulation, because honestly it's just a lot of trial and error. Lots of playblasts that take overnight to compute. But see below for the notes, which can serve as an introduction for further experimentation.

maya scene – jupiter bands simulation

— The fluid box is wrapping left/right, with the top/bottom closed off.
— There are two volume emitters, for the top and bottom halves, that are keyframed to emit density and velocity at the start frame and then turn off quickly.
— There are also two “wind” emitters on the sides to keep the incoming velocity at a fixed rate. They use the replace method and have speed emission with a low rate. The target speed is determined by the directional speed attribute. The rate is the amount that pulls the fluid's velocity toward that target speed. So the rate is low in order to keep things going without influencing the simulation too much. But you can confine the shear region more by increasing the rate.
— There is a very small amount of negative density buoyancy to keep things from mixing too much over time. Increasing the strength of the buoyancy will tend to force the boundary closer to the center, provided the solver quality is high enough; otherwise it will collapse to the bottom. Again, tricky business.
— A high detail solve and a high grid resolution are used to get more fluid detail. The higher the grid resolution of the fluid, the more substeps and solver quality are generally needed. Although substeps also determine how fast the fluid flows, so it's a tricky balance. If there is still too much diffusion, try a yet higher grid resolution, more substeps, and higher solver quality. A higher resolution, higher solver quality, and more substeps should also create more small-scale eddies.
— For a little extra detail you can play around with the Density Noise, Density Tension, and Gradient Force. Try tiny values between 0.01 and 0.05.

I couldn't have gotten this far without the help of Duncan Brinsmead, Principal Scientist at Autodesk. I contacted him because I was curious about his approach to this type of simulation, and his response was insightful.

The song in the video is In the Hall of the Mountain King by Edvard Grieg. I am amazed at the beautiful diversity of classical music recordings that are public domain licensed on the MusOpen website. What a serendipitous boon finding this!

More about The Art of Astrophysics Competition:
Astrophysicists try to share the mysteries of the Universe around us in a clear and understandable fashion, but we don’t always succeed. It’s a hard challenge – the wonders of the Solar System, the Galaxy, and the ever expanding Cosmos demand more of our imaginations than can be captured by numbers in a table or terms in an equation. However, a work of art can uniquely inspire us to look closely, to dream freely, to understand openly – anything from the smallest curiosity to the biggest discovery.

So, we’re asking members of the MIT community to create works of art that help us visualize our Universe and how we observe it. Whether you’re a photographer or a poet, a crafter or a coder, a musician or a moviemaker, we want you to use your talents and creativity to illuminate the beauty of astrophysical results. Please consider participating in this year’s Art of Astrophysics competition during MIT’s 2014 Independent Activities Period (IAP), sponsored by the MIT Kavli Institute for Astrophysics and Space Research.

View All of the Art Submissions — View the Winners

Charles River Gallery: LED Bridge

Update: January 18, 2017
Prior to installing the LED screens onto the Charles River Bridge, we wanted to get a feel for the experience it would create in the lobby. So we used a 3D model of the Museum of Science lobby to create a virtual tour that could be watched in the planetarium dome. It was a great way to be immersed in the project prior to it actually being built. The Jupiter band simulation was chosen to be included in the official launch of the LED bridge.

Background Stars v2

(This post is an update to Background Stars v1. It will provide some context.)

A while back I shared a ‘star globe’ which has the night sky mapped onto a sphere. This can be used to completely surround a Maya scene with stars. Andrew Hazelden and I often collaborate on various fulldome projects, and he had an idea for re-engineering the star globe to require only 1 surface and 1 file texture. This allows for a vast improvement in rendertime. For instance:
— 4-poly & 4 texture star globe – 1m 40s
— 1-poly & 1 texture star globe – 30s

A bunch of other improvements are included:
— Fixes the grey blurry line glitch since it uses a Mental Ray texture network.
— A 2k texture is used for previewing in the Maya viewport, then an 8k texture is used for renders.
— Other lights in the scene will not affect the star globe.
— Star globe never casts shadows.
— Star globe will automatically show up in reflections & refractions.
— Renders faster since the 1 texture needs much less initialization and the poly count is reduced.
— Here is a detailed explanation of how these things are achieved.

maya scene – star globe
(also available in the Domemaster3D toolset)


Fake Ambient Occlusion


Ambient occlusion is a helpful effect for easily adding some realism. But the default render-layer occlusion setup in Maya is somewhat render intensive. And when you're doing lots of render tests, well, you just want to see it quickly.

So we can speed up the render process by instead using Final Gather to compute the occlusion and then using the mib_fg_occlusion utility shader to make it grayscale. Then by adjusting the Final Gather render settings we can affect the render times. This is really helpful when you're doing a test render where render quality is not paramount; later you can increase the settings for the final render.

To the right is a tutorial on how to get the mib_fg_occlusion node set up within a surface shader. You must use mental ray, since that is what Final Gather requires.

You can also learn more in depth about the mib_fg_occlusion utility shader and how to get a texture with transparency to be recognized.


The Nebula Challenge

One of the most difficult types of space imagery to create is a volumetric nebula. There are three main styles of nebulae to imitate: diffuse nebulae, planetary nebulae, and supernova remnants. The fluid framework within Maya is extremely flexible but it can be very tricky to just get a fluid/emitter set up with settings that are repeatable.

So to ease the cumbersome setup, below is a Maya scene for the interactive creation of a fluid nebula. While experimenting with the attributes it's important to have a real nebula reference image in mind; or else you'll just continue tweaking attributes without any real measure of when it's done. With fluids it's quite easy to run down endless rabbit holes… There are just so many attributes that are interlocked. But it's not impossible to tame fluids, you just need to have a goal in mind.

The nebula maker template is initially set up to have the look of a supernova shell because it’s actually more useful to experiment with. But I’ll share how to adjust the opacity ramp to create a diffuse nebula.

Wade Sylvester, a Science Visualizer here in the planetarium, is a master of fluids, particles, and environment creation. All of this work has been pioneered by his endless experiments and dreams. So with this fluid container/emitter you can create a smörgåsbord of nebulae. Below I’ll outline some of the specific attributes to experiment with.

If you find yourself wondering what a specific attribute does then you should check out the fluidShape article. It is very detailed and thorough.

maya scene – nebula maker template

Progressive experiments with various attributes.

What’s Going On Here?
Basically, the ‘FluidEmitter’ is emitting a very simple fluid into the ‘NebulaFluid’ container. The ‘NebulaFluid’ container then further refines that fluid with tons of available attributes. So this tutorial is a deep explanation of how to best control the ‘NebulaFluid’ container attributes.

After the fluid is emitted, it can then be shaded by defining its color, incandescence, and opacity. Then you can further refine the look of the fluid by adding a texture. This is a volumetric texture that is cutting out the fluid to make it look irregular and organic. The texture is not created by the emitted fluid, but is instead a separate fractal system overlaid into the fluid.

First Steps
1) Let's make sure you can preview the fluid in the viewport as best as possible. In the viewport menubar: Shading / click ‘Smooth Shade All’ AND check ‘Hardware Texturing’. This really helps you directly see some of the attributes being changed in the viewport.
2) Let's also make sure to let the fluid computation control how fast the timeline moves: Right-click on the timeline / Playback Speed / select: ‘Play Every Frame, Free’
3) Rewind the timeline to 1. Then let it play until around frame 18. You're welcome to explore anywhere in the timeline. But something to remember with fluids is that they have the most intricate details right when they're emitted. So frame 18 is a place where the fluid has nice density and detail.
4) Experiment with the attributes outlined in the tutorial below.


Done? Finalize It
So you’re done experimenting and want to freeze the nebula into place. Now we are going to give the fluid container an initial state and stop emitting fluid.
1) It’s very important to make sure that you have a project set. The initial state is NOT stored within the maya scene file. So if you don’t have your project set, then you will likely lose your initial state when next opening Maya and your fluid will instead look blank.
(example location: project/cache/SceneName.mb_fluidShape1.mcfi)
2) If you want to change your fluid Resolution then this is the last chance. But be sure to re-emit via the timeline!
3) Select the NebulaFluid container. Then in the menubar, go to Fluid Effects and click Set Initial State.
(If this doesn’t work then try: Solvers / Initial State / Set For Selected)
4) Then in the NebulaFluidShape tab of the attribute editor, check the box for Disable Evaluation.
5) DONE! Now you can animate a camera and the nebula is locked in place. If you want to animate the nebula, I would suggest experimenting with Texture Time. But any of the shading & texture attributes are still editable and keyable.
(Note: to see any keyed textures in the viewport, you must have the NebulaFluid selected and the attribute editor open!)

Examples of the different types of nebulae that can be created.

House of Mirrors

Creating the sense of a huge expansive space is a challenge in 3D animation. But if we use mirrors and some trickery then we can create a vast, beautiful illusion. The trick is to use structural beams to hide the mirrors' edges. This creates a seemingly unending structural lattice. An infinite jungle gym!

maya scenes – cube, sphere, dodecahedra (flat & curved)



Tips & Tricks
— If you want to fly through the lattice gridwork like in the video above, line up several of the mirror boxes and delete faces to form a tunnel.
— Let's say that you want to look at the dodecahedron and see its infinite space from the outside looking in. Well, you can easily do this by making a one-sided surface. (polyShape/Render Stats/Double Sided (and use Opposite to flip it))
— You can limit the amount of recursive reflections by changing the allowed raytracing reflections. (Render Settings/Quality/Raytracing/Reflections)
— A fluid is being used to make the space appear foggy with each successive reflection. To see deeper into the recursion, adjust the opacity ramp. (fog_fluidShape/Shading/Opacity)
— If you want to add your own fluid, just make sure to check ‘Visible in Reflections’. (fluidShape/Render Stats/Visible in Reflections)

This technique was originally engineered by Duncan Brinsmead, a principal scientist at Autodesk’s R&D. I’ve simply added on top of his amazing work. If you’re curious to learn more in-depth about this technique then check out this screencast.

Customizing a Close-Up Sun

Have you ever wanted to fly up close to a sun? To see those dynamic boiling details, shimmering corona, and mesmerizing surface texture patterns… But creating an animated volumetric sun with all these attributes is quite difficult.

It turns out that you can constrain a Maya fluid into a sphere. But the fluid system is tricky to work with, since there are so many different attributes to explore and intertwined relationships to figure out. So it's important that you already have in mind the type of star you want to create. Otherwise you'll end up tweaking details endlessly without any basis for when it's finished.

The foundation of this fluid sun is actually an example found within the Maya Visor. But it has since been highly customized to reach the current look. We wanted the sun to have more detailed surface textures, wispy corona, and controllable animation.

Just to say it upfront, understand that this approach is meant to be more stylized than realistic. We lean in this direction because it's believable, dynamic, and beautiful. Audiences undoubtedly know that it's a star. The trade-off is that solar prominences are ignored, though I'm sure you could create separate fluid systems to address this (Home Run Pictures has done some beautiful work in this vein). Obviously it can't solve all of our problems, but it offers us many options to work with.

The most difficult and time-consuming part is just setting up the fluid. But you can skip over this step since I've shared the Maya scene below. And you can watch this video tutorial if you're curious how to make a fluid sun from scratch.


maya scene – fluid sun

Examples of the different types of stars that can be created.

How to Customize the Sun
This sun can be easily customized once you understand some specific attributes outlined below. All of these attributes can be found within the sunFluidShape tab of the attribute editor. As you will undoubtedly see, there are TONS of other attributes that I’m not covering here simply because narrowing your options makes things manageable and you can focus on being creative. Check out the fluidShape article for in-depth attribute explanations.

Speed Up Render Times to Experiment Quicker
While experimenting with the attributes listed below, I would suggest lowering the Shading Quality to greatly reduce render times. (sunFluidShape/Shading Quality/Quality)

Adjusting the Resolution can change the overall look of how much detail is in the sun, but not in a way you expect. Resolution determines how many voxels (volumetric pixels) are allowed. So a higher resolution does not mean better quality, it’s just a different look. I suggest keeping the Resolution XYZ values equal. Higher values mean longer render times. The default values are a good place to begin. (sunFluidShape/Resolution)

Need the sun to be a different color and brightness?
(sunFluidShape/Shading/Color tab & Incandescence tab)
— First play with the color gradient and then the incandescence gradient. There is a relationship here that can only be understood by playing with it and doing test renders. It's best to try to keep these two gradients looking similar; this reduces confusion.
— Glow plays an important role in making it seem blown out and bright. (sunFluidShape/Shading/Glow Intensity) Be aware that if you don’t let a test render fully finish, then the post-render glow won’t be applied. I mention this because the post-render glow plays an important part in making it appear bright but still controlling the amount of blown out details.

Need the sun surface texture to have bigger/smaller elements?
(sunFluidShape/Textures tab)
— Play with the Ratio attribute first to get a foundation. Then adjust the Frequency Ratio attribute to adjust the detail added on top. The Implode attribute can do some interesting things too. Threshold and Amplitude can be weird but fun to experiment with.
— The textures play a particularly important role in a fluid sun. For instance, if you are creating a blue supergiant, making the surface elements smaller will help to make the star feel truly huge.

Need the sun to look like it’s in the process of forming?
(sunFluidShape/Textures tab)
— Reducing the Frequency attribute to very low numbers (between .001 and .5) can produce some interesting effects that make the sun appear as if it's missing chunks. Almost like swiss cheese. This is an experimental technique that could be animated to create the moment of a sun igniting.

Need the corona to be more opaque or reach farther out?
(sunFluidShape/Shading/Opacity tab)
— This gradient is what restricts your sun to a tight sphere while still allowing some rays to leak through. Think of the vertical peaks in the gradient as where you want the fluid to show up, and the horizontal axis as how close to the center of the fluid container you are. By adjusting the gradient and watching the viewport you can get a rough preview of what's being affected.

Need the surface textures and corona to boil (animate)?
(sunFluidShape/Textures tab)
— Key the Texture Time attribute. For a subtle but effective simmering look, I generally have an increase of .25 per 30 frames. This is already keyed in the downloadable example.

Need the sun to spin? Or move around?
(sunFluid = Rotate XYZ of the channel box)
— Simple. Just key the Rotate X, Y, or Z of the sunFluid. In other words, just animate the fluid container itself. To move the sun just key the Translate XYZ. The same for Scale XYZ.

Possible Problems and Solutions
— This fluid setup renders pretty fast. Know that the closer you get to the sun, the higher your render times will be. I'm not talking drastic, just something to be aware of. Rendering at production quality is definitely overkill, and you won't see a difference from preview quality. You might even be able to get away with draft quality, but beware of flickering or aliasing. A simple batch render test of 90 frames will inform you of what's best.
— If you want to better understand how fluids work, then here are two tutorials that will really lay the groundwork for you. Maya fluid testing and Fire effect.
— To avoid any problems with different fluids overlapping or any flickering: go to the Render Globals/Features/Extra Features/Volume Samples: 4
— If a moiré pattern is visible during animation, you can slowly increase the shading quality. (sunFluidShape/Shading Quality tab) Be aware that this will increase render times. But there is a delicate relationship between Quality and Contrast Tolerance to balance this.
— We have never needed to cache our fluids, even with a renderfarm.
— If you want to light-link a fluid, make sure not to have it grouped. Instead point-constrain it. This is due to how the fluidShape node shows up within the light linking editor.
— If for some reason you have a shadow casting from the sun, then select the sun, go to the sunFluid within the attribute editor, expand the mental ray tab and then the Flags tab, uncheck ‘Derive From Maya’, and set Shadows to No.


The Hemicube Camera Rig

In some cases, you can't use a fisheye camera to render a domemaster. So the solution is to create a hemicube camera rig.

But what is a hemicube camera rig? By setting up 5 different cameras in a specific arrangement you can seamlessly capture 180° of footage. Then, with some stitching software, you can later merge the hemicube renders into a domemaster.

The tutorial to the right was created for Maya users, but the concepts could easily be replicated within any CGI toolset.

After you have completed the tutorial and rendered out footage from the hemicube cameras, you're ready to stitch them together.

Maya Download
hemicube camera rig for Maya

— Check out my custom Maya camera for fulldome production which has the hemicube rig included.
— Here is a hemicube camera for Blender

CubeMap Camera Rig & Lens Shader
A cubemap is a typical hemicube camera setup where all the cameras output their renders into one combined image (instead of outputting one image per camera). Since fulldome producers are often dealing with large resolutions, a cubemap can be a little too cumbersome to work with. But they can still be helpful for environment maps and such.
CubeMap camera rig – Maya
CubeMap lens shader – Maya, 3DS Max, or Softimage. (Mac build)

Stitching Hemicube Renders into Fisheye or Spherical

Source of hemicube render – Source of idea for explanation

Stitching hemicube renders into fisheye

So you’ve rendered a scene from a 3D animation software and used a hemicube camera. Now you’re left with 2k footage from each of the 5 cameras: top, left, right, front, and back. But now what? Well you need to stitch the hemicube renders into a domemaster (fisheye).

But let's be clear from the start. Typically when someone says stitching, they mean there is some automatic algorithmic analysis of how best to combine the footage. But in this instance we simply mean that we are specifically placing footage on the dome and having the dome distortion compensated for. Just like the image above.

There are several fulldome plugins for After Effects that make this process quite simple: Navegar Fulldome, Sky-Skan DomeXF, or Digistar Virtual Projector. I'll be using the Navegar Fulldome Plugin in this tutorial.

Dump all your footage into After Effects and give each hemicube render its own layer in the same comp (example image to the right). Then apply the fulldome plugin to all the layers and change the settings below for each hemicube layer (leave everything else at default).

Left (East)
— Altitude: 0
— Azimuth: 90
— Angular Width/Height: 90

Right (West)
— Altitude: 0
— Azimuth: -90
— Angular Width/Height: 90

Front (South)
— Altitude: 0
— Azimuth: 180
— Angular Width/Height: 90

Back (North)
— Altitude: 0
— Azimuth: 0
— Angular Width/Height: 90

Top (Zenith)
— Altitude: 90
— Azimuth: 0
— Rotation: 180 (may not be needed)
— Angular Width/Height: 90
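For scripting or sanity-checking the setup, the five layer settings above can be collected as plain data. A small sketch; the dictionary keys are my own shorthand, not the plugin's actual parameter names:

```python
# Hemicube layer settings for the Navegar Fulldome plugin, as described above.
# "angular_size" stands in for the plugin's Angular Width/Height parameter.
HEMICUBE_LAYERS = {
    "left":  {"altitude": 0,  "azimuth": 90,  "angular_size": 90},
    "right": {"altitude": 0,  "azimuth": -90, "angular_size": 90},
    "front": {"altitude": 0,  "azimuth": 180, "angular_size": 90},
    "back":  {"altitude": 0,  "azimuth": 0,   "angular_size": 90},
    "top":   {"altitude": 90, "azimuth": 0,   "angular_size": 90, "rotation": 180},
}

# Quick check: four side views plus the zenith, each spanning 90 degrees.
assert len(HEMICUBE_LAYERS) == 5
assert all(layer["angular_size"] == 90 for layer in HEMICUBE_LAYERS.values())
```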

Bug: Scale Needs Reset
Sometimes the scale says it's showing 100%, but it actually isn't functioning correctly. You can reset it by setting the angular width to 0, clicking away, and then setting it back to 90. This happens often enough that it's a part of my workflow. You'll know it's happening if all the hemicube renders look too small.

Seeing More than 180°
Once you have everything assembled, you can actually change the spherical angle to see more of the scene. It's typically at 180° to give you a perfect fisheye, but often it's helpful to compress more, like 200°, into the dome. This allows you to actually see the nearby ground. It's a powerful technique that doesn't bring any noticeable distortion and actually enhances immersion. As you can see below, this extra 20° made all the difference for seeing the lakes of Titan. It doesn't look like much here, but on the dome it's dramatic.


Other Stitching Software
Don’t have After Effects? Here are some alternative options for stitching.
— Glom (Spitz)
— PineappleWare Stitcher
— Domemaster Photoshop Actions Pack
— Domemaster (Linux)
— Hugin
— Nuke (The Foundry) – Spherical Transform Node
— Fusion (BlackMagic Design) – Cubemap to Equirectangular
— Autopano Video (Kolor)

Stitching hemicube renders into spherical


With the same technique as discussed above, stitching together hemicube renders into the equirectangular projection (spherical) is quite simple.

Here are the only differences. They need to be set for each hemicube layer:
— Spherical Angle: 360
— Master Projection: Cylindrical Equidistant

Feel free to examine the After Effects scene from which the screenshot above was taken. Just be aware that you must have the Navegar Fulldome Plugin for it to function properly.

after effects scene / required: navegar fulldome plugin

Flying through a Galaxy Field

When I first saw the Hubble Ultra Deep Field photos, I was in awe. Our brief glimpses out into the universe bring a certain deep peace to my mind. A mysterious and majestic connection to the cosmos. So I undoubtedly needed to create a galaxy field in Maya…

The galaxy distribution currently has a randomized layout, meaning I used a MEL script to randomize the translateXYZ and rotateXYZ of each galaxy (more info below). So it does not currently mimic the cosmic web. In this first iteration of creating a galaxy field, the goal was simply to fly among the galaxies in 3D. But in postproduction you could easily put a real Hubble Ultra Deep Field photo as the backdrop of everything, just to help the illusion.
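The randomized layout can be sketched in plain Python. The real work is done in Maya by the SMO_RandomizeXForm MEL script mentioned below; this sketch only illustrates the idea, and the function name, extent, and ranges are my own:

```python
import random

def random_galaxy_transforms(count, extent=1000.0, seed=42):
    """Return (translateXYZ, rotateXYZ) tuples: random positions inside a
    cube of +/- extent units, and random orientations in degrees."""
    rng = random.Random(seed)  # seeded so the layout is reproducible
    transforms = []
    for _ in range(count):
        translate = tuple(rng.uniform(-extent, extent) for _ in range(3))
        rotate = tuple(rng.uniform(0.0, 360.0) for _ in range(3))
        transforms.append((translate, rotate))
    return transforms

galaxies = random_galaxy_transforms(100)
print(len(galaxies))  # → 100
```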

There are 68 different galaxy images that have been photoshopped to paint out the background stars/galaxies. It is important to have the image edges fade to true black, or else the image edges might be visible and ruin the effect.

I didn't know if 68 images would be enough for a galaxy field to still feel dynamic in its number of unique galaxies, but it works surprisingly well. Every so often two duplicate galaxies will appear close to each other, and you can simply delete or move one of them.

maya scenes – galaxy field experiments

About the Different Options
Every project has its own demands. Sometimes you need to fly through a bunch of galaxies and other times just a few. There are four options for amounts of galaxies:
100 galaxies
1,000 galaxies
10,000 galaxies — Only navigate in wireframe or smooth-shade mode.
100,000 galaxies — Be wary of opening this scene. It will bog down your computer and use ~6GB RAM. Only navigate in wireframe mode.

There are also three types of galaxies/layouts:
FlatGalaxyField_GridLayout — Flat galaxies in a grid. Fun for art experiments.
FlatGalaxyField_VolumeLayout — Flat galaxies that have a randomized layout.
ParabolaGalaxyField_VolumeLayout — Parabola galaxies that have a randomized layout. Using a parabola as the galaxy poly helps the galaxy not feel paper-thin. Of course, this adds a hundred times the amount of total polygons; so the 100,000 galaxy amount isn't possible for this option.

Other Details
— If you want to re-distribute the galaxies to be more or less densely laid out then check out SMO_RandomizeXForm. If you want to randomize the shaders then check out RandomShader. These are the tools I used. FYI, both of these MEL scripts are included in the download above.
— Every galaxy has its own poly (not an instance). This allows us a few advantages, such as being able to use animation deformers or a lattice to play with the galaxies themselves. You can also edit the poly of an individual galaxy and it won’t affect all the others.
— Each galaxy image is mapped into the color, transparency, and incandescence of its own lambert shader. It could have been done procedurally, but now we have the freedom to tweak a specific shader without affecting every galaxy.
— The galaxy photos themselves can be found in this folder: galaxy_project/sourceimages

Adding Your Own Galaxy Images
— Notice how all the images are square? Well this makes working with them in Maya much easier since all the planes are square. Otherwise your galaxy images will get stretched to fit the Maya plane. Also, I scale down any images to a max of 4096px to save on memory in Maya. Even this is a little large.
— Know that incandescence and transparency have a particular relationship where even if an area is completely transparent, the incandescence will still be visible and be see-through. So the image edges MUST fade to true black or else you’ll see the image edges in your Maya renders.

Next Steps
In the future, I'll experiment with how to give each galaxy an actual thickness so it doesn't look paper-thin. This is actually a pretty tricky thing to accomplish using the image as a driver for a volume/fluid and still render quickly. I also want to create custom luminosity maps for each galaxy shader so that galaxies appear more dense at their centers. And most importantly, instead of a randomized layout of galaxies, I would love to mimic the cosmic web and have the look of clusters, filaments, and voids. This is a work in progress, and anyway, I can't give away the secret sauce!