House of Mirrors

Creating the sense of a huge, expansive space is a challenge in 3D animation. But with mirrors and a little trickery, we can create a vast, beautiful illusion. The trick is to use structural beams to hide the mirrors' edges, creating a seemingly unending structural lattice. An infinite jungle gym!

maya scenes – cube, sphere, dodecahedra (flat & curved)



Tips & Tricks
— If you want to fly through the lattice gridwork like in the video above, line up several of the mirror boxes and delete faces to form a tunnel.
— Let's say that you want to look at the dodecahedron from the outside and see its infinite space looking in. You can easily do this by making a one-sided surface. (polyShape/Render Stats/Double Sided (and use Opposite to flip it))
— You can limit the number of recursive reflections by changing the allowed raytracing reflections. (Render Settings/Quality/Raytracing/Reflections)
— A fluid is used to make the space appear foggy with each successive reflection. To see deeper into the recursion, adjust the opacity ramp. (fog_fluidShape/Shading/Opacity)
— If you want to add your own fluid, just make sure to check ‘Visible in Reflections’. (fluidShape/Render Stats/Visible in Reflections)

This technique was originally engineered by Duncan Brinsmead, a principal scientist at Autodesk R&D. I've simply added on top of his amazing work. If you're curious to learn about this technique in more depth, check out this screencast.

Customizing a Close-Up Sun

Have you ever wanted to fly up close to a sun? To see those dynamic boiling details, shimmering corona, and mesmerizing surface texture patterns… But creating an animated volumetric sun with all these attributes is quite difficult.

It turns out that you can constrain a Maya fluid into a sphere. But the fluid system is tricky to work with since there are so many different attributes to explore, each with intertwined relationships to figure out. So it's important that you already have in mind the type of star you want to create. Otherwise you'll end up tweaking details endlessly without any basis for deciding when it's finished.

The foundation of this fluid sun is actually an example found within the Maya Visor. But it has since been highly customized to reach the current look. We wanted the sun to have more detailed surface textures, wispy corona, and controllable animation.

Just to say it upfront: understand that this approach is meant to be more stylized than realistic. We lean in this direction because it's believable, dynamic, and beautiful. Audiences undoubtedly know that it's a star. The trade-off is that solar prominences are ignored, though I'm sure you could create separate fluid systems to address this (Home Run Pictures has done some beautiful work in this vein). Obviously it can't solve all of our problems, but it offers us many options to work with.

The most difficult and time-consuming part is just setting up the fluid. But you can skip over this step since I've shared the Maya scene below. If you're curious how to make a fluid sun from scratch, you can watch this video tutorial.


maya scene – fluid sun

Examples of the different types of stars that can be created.

How to Customize the Sun
This sun can be easily customized once you understand some specific attributes outlined below. All of these attributes can be found within the sunFluidShape tab of the attribute editor. As you will undoubtedly see, there are TONS of other attributes that I’m not covering here simply because narrowing your options makes things manageable and you can focus on being creative. Check out the fluidShape article for in-depth attribute explanations.

Speed Up Render Times to Experiment Quicker
While experimenting with the attributes listed below, I would suggest lowering the Shading Quality to greatly reduce render times. (sunFluidShape/Shading Quality/Quality)

Adjusting the Resolution can change how much detail is in the sun overall, but not in the way you might expect. Resolution determines how many voxels (volumetric pixels) are allowed. So a higher resolution does not mean better quality, it's just a different look. I suggest keeping the Resolution XYZ values equal. Higher values mean longer render times. The default values are a good place to begin. (sunFluidShape/Resolution)
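One reason higher values cost so much is that the voxel count grows with the product of the three resolution values. A quick back-of-the-envelope sketch (the example numbers are my own, not from the scene file):

```python
def voxel_count(res_x, res_y, res_z):
    """Total voxels in a fluid container at a given resolution."""
    return res_x * res_y * res_z

# Doubling an equal XYZ resolution multiplies the voxel count by 8.
low = voxel_count(60, 60, 60)      # 216,000 voxels
high = voxel_count(120, 120, 120)  # 1,728,000 voxels
print(high / low)  # 8.0
```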

Need the sun to be a different color and brightness?
(sunFluidShape/Shading/Color tab & Incandescence tab)
— First play with the color gradient and then the incandescence gradient. There is a relationship here that can only be understood by playing with it and doing test renders. It's best to try and keep these two gradients looking similar; this reduces confusion.
— Glow plays an important role in making it seem blown out and bright. (sunFluidShape/Shading/Glow Intensity) Be aware that if you don’t let a test render fully finish, then the post-render glow won’t be applied. I mention this because the post-render glow plays an important part in making it appear bright but still controlling the amount of blown out details.

Need the sun surface texture to have bigger/smaller elements?
(sunFluidShape/Textures tab)
— Play with the Ratio attribute first to get a foundation. Then adjust the Frequency Ratio attribute to tune the detail added on top. The Implode attribute can do some interesting things too. Threshold and Amplitude can be weird but fun to experiment with.
— The textures play a particularly important role in a fluid sun. For instance, if you are creating a blue supergiant, making the surface elements smaller will help the star feel truly huge.

Need the sun to look like it’s in the process of forming?
(sunFluidShape/Textures tab)
— Reducing the Frequency attribute to very low numbers (between .001 and .5) can produce some interesting effects that make the sun appear as if it's missing chunks. Almost like swiss cheese. This is an experimental technique that could be animated to create the moment of a sun igniting.

Need the corona to be more opaque or reach farther out?
(sunFluidShape/Shading/Opacity tab)
— This gradient is what restricts your sun to a tight sphere while still allowing some rays to leak through. Think of the vertical peaks in the gradient as where you want the fluid to show up, and the horizontal axis as how close you are to the center of the fluid container. By adjusting the gradient and watching the viewport you can get a rough preview of what's being affected.
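As a mental model (not Maya's actual ramp code), you can think of the opacity gradient as a piecewise-linear function from normalized distance-from-center (horizontal) to opacity (vertical). A minimal sketch with hypothetical ramp points:

```python
def ramp(points, x):
    """Linearly interpolate a ramp defined by sorted (position, value) points."""
    pts = sorted(points)
    if x <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[-1][1]

# Hypothetical sun ramp: fully opaque out to 40% of the container,
# then falling off to a transparent corona at the edge.
sun_opacity = [(0.0, 1.0), (0.4, 1.0), (1.0, 0.0)]
print(ramp(sun_opacity, 0.2))  # 1.0 (inside the dense sphere)
print(ramp(sun_opacity, 0.7))  # ≈ 0.5 (halfway through the falloff)
```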

Need the surface textures and corona to boil (animate)?
(sunFluidShape/Textures tab)
— Key the Texture Time attribute. For a subtle but effective simmering look, I generally have an increase of .25 per 30 frames. This is already keyed in the downloadable example.
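The keying described above amounts to a linear ramp on Texture Time. A sketch of the value at any frame (assuming the increments start from 0 at frame 0):

```python
def texture_time(frame, rate=0.25, per_frames=30):
    """Texture Time value at a given frame for a steady, linear boil."""
    return frame * rate / per_frames

print(texture_time(30))   # 0.25
print(texture_time(300))  # 2.5
```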

Need the sun to spin? Or move around?
(sunFluid/Rotate XYZ in the channel box)
— Simple. Just key the Rotate X, Y, or Z of the sunFluid. In other words, just animate the fluid container itself. To move the sun just key the Translate XYZ. The same for Scale XYZ.

Possible Problems and Solutions
— This fluid setup renders pretty fast. Know that the closer you get to the sun, the higher your render times will be. I'm not talking drastic, just something to be aware of. Rendering at production quality is definitely overkill and you won't see a difference from preview quality. You might even be able to get away with draft quality, but beware of flickering or aliasing. A simple batch render test of 90 frames will inform you of what's best.
— If you want to better understand how fluids work, then here are two tutorials that will really lay the groundwork for you. Maya fluid testing and Fire effect.
— To avoid any problems with different fluids overlapping, or any flickering: go to Render Globals/Features/Extra Features and set Volume Samples to 4.
— If a moiré pattern is visible during animation, you can slowly increase the shading quality. (sunFluidShape/Shading Quality tab) Be aware that this will increase render times. But there is a delicate relationship between Quality and Contrast Tolerance to balance this.
— We have never needed to cache our fluids, even with a renderfarm.
— If you want to light-link a fluid, make sure not to have it grouped. Instead point-constrain it. This is due to how the fluidShape node shows up within the light linking editor.
— If for some reason you have a shadow being cast from the sun, then select the sun, go to the sunFluid within the attribute editor, spin down the mental ray tab, then the flags tab, uncheck ‘Derive From Maya’, and set shadows to No.


The Hemicube Camera Rig

In some cases, you can't use a fisheye camera to render a domemaster. The solution is to create a hemicube camera rig.

But what is a hemicube camera rig? By setting up 5 different cameras in a specific arrangement, you can seamlessly capture 180° of footage. Then, with some stitching software, you can later merge the hemicube renders into a domemaster.
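To make the arrangement concrete, here is a sketch of the five cameras as pitch/yaw pairs, with a check that their view directions point down the five faces of a hemicube. The conventions are my own (Y-up world, camera looking down -Z as in Maya), not pulled from the tutorial:

```python
import math

# (pitch, yaw) in degrees for each hemicube camera
CAMERAS = {
    "front": (0, 0),
    "left":  (0, 90),
    "right": (0, -90),
    "back":  (0, 180),
    "top":   (90, 0),
}

def forward(pitch, yaw):
    """View direction of a -Z-facing camera after Rx(pitch) then Ry(yaw)."""
    p, y = math.radians(pitch), math.radians(yaw)
    # Rx(p) applied to (0, 0, -1):
    vx, vy, vz = 0.0, math.sin(p), -math.cos(p)
    # Ry(y) applied to the result:
    return (vx * math.cos(y) + vz * math.sin(y), vy,
            -vx * math.sin(y) + vz * math.cos(y))

for name, (p, y) in CAMERAS.items():
    print(name, [round(c, 3) for c in forward(p, y)])
# front faces -Z, left -X, right +X, back +Z, top +Y: five hemicube faces
```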

The tutorial to the right was created for Maya users, but the concepts could easily be replicated within any CGI toolset.

After you have completed the tutorial and rendered out footage from the hemicube cameras, you're ready to stitch it together.

Maya Download
hemicube camera rig for Maya

— Check out my custom Maya camera for fulldome production which has the hemicube rig included.
— Here is a hemicube camera for Blender

CubeMap Camera Rig & Lens Shader
A cubemap is a typical hemicube camera setup where all the cameras output their renders into one combined image (instead of outputting one image per camera). Since fulldome producers are often dealing with large resolutions, a cubemap can be a little too cumbersome to work with. But they can still be helpful for environment maps and such.
CubeMap camera rig – Maya
CubeMap lens shader – Maya, 3DS Max, or Softimage. (Mac build)

Stitching Hemicube Renders into Fisheye or Spherical

Source of hemicube render – Source of idea for explanation

Stitching hemicube renders into fisheye

So you’ve rendered a scene from a 3D animation software and used a hemicube camera. Now you’re left with 2k footage from each of the 5 cameras: top, left, right, front, and back. But now what? Well you need to stitch the hemicube renders into a domemaster (fisheye).

But let's be clear from the start. Typically when someone says stitching, they mean that there is some automatic algorithmic analysis of how to best combine the footage. But in this instance we simply mean that we are specifically placing footage on the dome and compensating for the dome distortion. Just like the image above.
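Conceptually, what the placement is doing is an azimuthal-equidistant (fisheye) mapping: a view direction's angle from the zenith becomes a radius in the image. A sketch of that mapping in my own formulation (z-up, 180° dome; not the plugin's actual code):

```python
import math

def fisheye_uv(x, y, z, fov_deg=180.0):
    """Map a unit view direction (z = up) to fisheye image coords in [0, 1].

    The angle from the zenith scales linearly to the radius from the image
    center (azimuthal equidistant projection).
    """
    theta = math.acos(max(-1.0, min(1.0, z)))  # angle from zenith
    r = theta / math.radians(fov_deg / 2.0)    # normalized radius
    phi = math.atan2(y, x)                     # azimuth around the dome
    return (0.5 + 0.5 * r * math.cos(phi),
            0.5 + 0.5 * r * math.sin(phi))

print(fisheye_uv(0, 0, 1))  # (0.5, 0.5) -- zenith maps to the image center
print(fisheye_uv(1, 0, 0))  # (1.0, 0.5) -- the horizon lands on the image edge
```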

There are several fulldome plugins for After Effects that make this process quite simple: Navegar Fulldome, Sky-Skan DomeXF, or Digistar Virtual Projector. I'll be using the Navegar Fulldome Plugin in this tutorial.

Dump all your footage into After Effects and give each hemicube render its own layer in the same comp (example image to the right). Then apply the fulldome plugin to all the layers and change the settings below for each hemicube layer (leave everything else at default).

Left (East)
— Altitude: 0
— Azimuth: 90
— Angular Width/Height: 90

Right (West)
— Altitude: 0
— Azimuth: -90
— Angular Width/Height: 90

Front (South)
— Altitude: 0
— Azimuth: 180
— Angular Width/Height: 90

Back (North)
— Altitude: 0
— Azimuth: 0
— Angular Width/Height: 90

Top (Zenith)
— Altitude: 90
— Azimuth: 0
— Rotation: 180 (may not be needed)
— Angular Width/Height: 90

Bug: Scale Needs Reset
Sometimes the scale says it's showing 100%, but it actually isn't functioning correctly. You can reset it by setting the angular width to 0, clicking away, and then setting it back to 90. This happens often enough that it's a part of my workflow. You'll know it's happening if all the hemicube renders look too small.

Seeing More than 180°
Once you have everything assembled, you can actually change the spherical angle to see more of the scene. It's typically at 180° to give you a perfect fisheye, but often it's helpful to compress more, like 200°, into the dome. This allows you to actually see the nearby ground. It's a powerful technique that doesn't bring any noticeable distortion and actually enhances immersion. As you can see below, this extra 20° made all the difference for seeing the lakes of Titan. It doesn't look like much here, but on the dome it's dramatic.
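The payoff is easy to quantify with the same azimuthal-equidistant idea: the normalized image radius of a direction is its zenith angle divided by half the spherical angle. The arithmetic below is my own illustration, not from the plugin docs:

```python
def image_radius(zenith_angle_deg, spherical_angle_deg):
    """Normalized fisheye radius; values > 1 fall outside the frame."""
    return zenith_angle_deg / (spherical_angle_deg / 2.0)

# At 180°, the horizon (90° from the zenith) sits exactly on the frame edge...
print(image_radius(90, 180))   # 1.0
# ...while at 200° it pulls inward, so 10° below the horizon becomes visible.
print(image_radius(90, 200))   # 0.9
print(image_radius(100, 200))  # 1.0
```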


Other Stitching Software
Don’t have After Effects? Here are some alternative options for stitching.
— Glom (Spitz)
— PineappleWare Stitcher
— Domemaster Photoshop Actions Pack
— Domemaster (Linux)
— Hugin
— Nuke (The Foundry) – Spherical Transform Node
— Fusion (BlackMagic Design) – Cubemap to Equirectangular
— Autopano Video (Kolor)

Stitching hemicube renders into spherical


With the same technique as discussed above, stitching together hemicube renders into the equirectangular projection (spherical) is quite simple.

Here are the only differences. They need to be set for each hemicube layer:
— Spherical Angle: 360
— Master Projection: Cylindrical Equidistant
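The underlying projection swap is simple: instead of zenith-angle-to-radius, an equirectangular (spherical) image maps azimuth straight to the horizontal axis and altitude to the vertical axis. A sketch of that mapping, under my own angle conventions:

```python
def equirect_uv(azimuth_deg, altitude_deg):
    """Map azimuth [-180, 180] and altitude [-90, 90] to image coords in [0, 1]."""
    u = azimuth_deg / 360.0 + 0.5
    v = altitude_deg / 180.0 + 0.5
    return (u, v)

print(equirect_uv(0, 0))       # (0.5, 0.5) -- straight ahead, image center
print(equirect_uv(90, 45))     # (0.75, 0.75)
print(equirect_uv(-180, -90))  # (0.0, 0.0) -- a corner of the 360 x 180 frame
```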

Feel free to examine the After Effects scene from which the screenshot above was taken. Just be aware that you must have the Navegar Fulldome Plugin for it to function properly.

after effects scene / required: navegar fulldome plugin

Flying through a Galaxy Field

When I first saw the Hubble ultra deep field photos, I was in awe. Our brief glimpses out into the universe bring a certain deep peace to my mind. A mysterious and majestic connection to the cosmos. So I undoubtedly needed to create a galaxy field in Maya…

The galaxy distribution currently has a randomized layout. Meaning I used a MEL script to randomize the translateXYZ and rotateXYZ of each galaxy (more info below). So it does not currently mimic the cosmic web. In this first iteration of creating a galaxy field, the goal was simply to fly among the galaxies in 3D. But in postproduction you could easily put a real Hubble ultra deep field photo as the backdrop of everything, just to help the illusion.
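In spirit, the randomization looks like the following Python sketch (the actual SMO_RandomizeXForm MEL script surely differs in its details): every galaxy gets a uniformly random position inside a bounding volume and a random orientation.

```python
import random

def random_layout(n, extent=1000.0, seed=None):
    """Scatter n galaxies: random translateXYZ inside a cube, random rotateXYZ."""
    rng = random.Random(seed)
    galaxies = []
    for _ in range(n):
        galaxies.append({
            "translate": tuple(rng.uniform(-extent, extent) for _ in range(3)),
            "rotate": tuple(rng.uniform(0.0, 360.0) for _ in range(3)),
        })
    return galaxies

field = random_layout(1000, seed=42)
print(len(field))  # 1000
```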

There are 68 different galaxy images that have been photoshopped to paint out the background stars/galaxies. It is important to have the image edges fade to true black, or else the image edges might be visible and ruin the effect.

I didn't know if 68 images would be enough for a galaxy field to still feel dynamic in its variety of unique galaxies, but it works surprisingly well. Every so often two duplicate galaxies will appear close to each other, and you can simply delete one or move it away.

maya scenes – galaxy field experiments

About the Different Options
Every project has its own demands. Sometimes you need to fly through a bunch of galaxies and other times just a few. There are four options for amounts of galaxies:
100 galaxies
1,000 galaxies
10,000 galaxies – Only navigate in wireframe or smooth-shade mode.
100,000 galaxies – Be wary of opening this scene. It will bog down your computer and use ~6GB of RAM. Only navigate in wireframe mode.

There are also three types of galaxies/layouts:
FlatGalaxyField_GridLayout – Flat galaxies in a grid. Fun for art experiments.
FlatGalaxyField_VolumeLayout – Flat galaxies that have a randomized layout.
ParabolaGalaxyField_VolumeLayout – Parabola galaxies that have a randomized layout. Using a parabola as the galaxy poly helps the galaxy to not feel paper-thin. Of course this adds a hundred times the total polygon count, so the 100,000 galaxy amount isn't possible for this option.

Other Details
— If you want to re-distribute the galaxies to be more or less densely laid out then check out SMO_RandomizeXForm. If you want to randomize the shaders then check out RandomShader. These are the tools I used. FYI, both of these MEL scripts are included in the download above.
— Every galaxy has its own poly (not an instance). This allows us a few advantages, such as being able to use animation deformers or a lattice to play with the galaxies themselves. You can also edit the poly of an individual galaxy and it won’t affect all the others.
— Each galaxy image is mapped into the color, transparency, and incandescence of its own lambert shader. It could have been done procedurally, but now we have the freedom to tweak a specific shader without affecting every galaxy.
— The galaxy photos themselves can be found in this folder: galaxy_project/sourceimages

Adding Your Own Galaxy Images
— Notice how all the images are square? Well this makes working with them in Maya much easier since all the planes are square. Otherwise your galaxy images will get stretched to fit the Maya plane. Also, I scale down any images to a max of 4096px to save on memory in Maya. Even this is a little large.
— Know that incandescence and transparency have a particular relationship where even if an area is completely transparent, the incandescence will still be visible and be see-through. So the image edges MUST fade to true black or else you’ll see the image edges in your Maya renders.
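A tiny compositing sketch shows why the edges must fade to true black. In a simplified shading model (my own approximation, not Maya's exact lambert math), incandescence is added on top regardless of transparency, so any non-black edge pixel brightens the background behind it:

```python
def shade(incandescence, transparency, background):
    """Simplified: the surface emits its incandescence, then lets the
    background through in proportion to its transparency (values 0-1)."""
    return incandescence + transparency * background

# A fully transparent edge pixel over a 0.2-grey background:
print(shade(0.0, 1.0, 0.2))   # 0.2  -- true-black edge: invisible
print(shade(0.05, 1.0, 0.2))  # 0.25 -- faint grey edge: visibly brighter
```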

Next Steps
In the future, I'll experiment with how to give each galaxy an actual thickness so it doesn't look paper-thin. This is actually a pretty tricky thing to accomplish using the image as a driver for a volume/fluid and still render quickly. I also want to create custom luminosity maps for each galaxy shader so that they appear more dense at the center of each galaxy. And most importantly, instead of a randomized layout of galaxies, I would love to mimic the cosmic web and have the look of clusters, filaments, and voids. This is a work in progress, and anyway I can't give away the secret sauce!


Creating Earth’s Atmosphere


There are many aspects to think about when creating earth: terrain color/bump map, cloud layer casting shadows, ocean specular/color map, night city-lights map, and of course an atmosphere. But the trickiest part of creating a good looking earth is keeping the clouds white without the atmosphere tinting them blue.

I learned this technique from the two video tutorials below. What I particularly enjoy about this technique is that the planet's shadow cast on the atmosphere isn't created by a light; instead it's controlled by a color ramp in the atmosphere shader. So you can directly control where the atmosphere falls off and imitate light scatter. In other words, it doesn't matter where you place your light because the atmosphere is a system of its own. This means faster render times since there are no volume shadows to compute.

By scaling the volume shape you can easily adjust how far out the atmosphere extends. Need to change the thickness of the atmosphere? Just adjust the edge dropoff of the shader.

When you need to have a planet revolving around a sun, well things get a little tricky since the atmosphere shadow isn’t created by a light. But the solution is to simply key the rotation to match the movement. It’s not ideal but hey, kiss it!

I've been quite pleased with the simplicity of using a volume primitive for an atmosphere. I've not experimented too much with part 2, where they replace the sphere fog shader with the volumetric fluid shader. The added wispy detail is beautiful, yet it's simply not realistic for creating a planet. (but maybe a sun?!) I'm curious to experiment with it merely for its shadowcasting ability, but I've learned a lot from working with fluids and textured surfaces together… flickering, odd falloff streaking, crepuscular rays, and such bugs that are so stupidly simple and yet difficult to fix.

maya scene – earth with atmo, color, bump, spec, and trans

Source of Tutorials

Creating a Star Field

One of my favorite things is to fly through a star field in the dome. It's those particular moments when the dome seemingly disappears and your imagination takes over. It's truly a majestic and thought-provoking experience. But how can you make a star field that's easy to manipulate and renders efficiently?

In the spirit of creating a star field that is reusable but still realistic, we chose to mimic the star distribution of a main sequence star field. Of course it’s all editable if you need. But for all practical purposes this template works beautifully for a flight between star systems. You just need to choose the density of stars.

We decided to go with four different star colors that are the foundation for all the star sprites. They are designed to simulate the look of Sky-Skan’s DigitalSky stars since we use DigitalSky for our star globe backgrounds. You would think that only 4 different star color images being repeated thousands of times on the star sprites wouldn’t be enough variability. But the truth is that within the dome it’s all about the immersion of flying among the stars. Everyone is focused on the grand sense of scale.

Each of the four star colors gets its own particle emitter.
There are two reasons for this:

1) We can easily control the amount and scale of the orange, yellow, white, and blue stars. Meaning that for each star color we can keep the emitter rate equal to the distribution of star types of a main sequence star field. We can also force an allowable range for the scale to be randomized. This means that the colors of stars can correlate to the average size of the star.
2) Mental ray doesn't play nicely when the particle shape node has more than one texture. There are some hacky solutions for this, but this multiple-emitter solution actually grants us more control of the sprites since we can manipulate the star color groups individually. Sometimes you just gotta kiss it.

Star Field Stats
— Emitter1: Orange stars – 5000/sec – scale .8 to 1.2
— Emitter2: Yellow stars – 4500/sec – scale 1 to 1.5
— Emitter3: White stars – 600/sec – scale 1.5 to 2
— Emitter4: Blue stars – 300/sec – scale 2 to 4
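Those per-emitter rates imply a color distribution and a size range per color. As a sketch (pure Python standing in for the particle expressions), the emitter table can be queried directly:

```python
import random

# (rate per second, min scale, max scale) per star color, from the stats above
EMITTERS = {
    "orange": (5000, 0.8, 1.2),
    "yellow": (4500, 1.0, 1.5),
    "white":  (600, 1.5, 2.0),
    "blue":   (300, 2.0, 4.0),
}

total = sum(rate for rate, _, _ in EMITTERS.values())  # 10400 stars/sec
for color, (rate, lo, hi) in EMITTERS.items():
    print(f"{color}: {rate / total:.1%} of stars, scale {lo} to {hi}")

# Per-particle scale, as in the rand(lo, hi) creation expression:
rng = random.Random(7)
scales = [rng.uniform(0.8, 1.2) for _ in range(1000)]
```

The common orange stars stay small while the rare blue stars run large, mirroring a main sequence distribution.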

Whats happening here?
Maya has an excellent particle system that is fisheye compatible. So we use a spherical volume emitter and set the ‘away from center’ speed to 0. This ensures that particles are emitted randomly throughout the volume and stay stationary. We use sprites as the particle render type. Sprites are simply a special type of image plane that will always face the camera. We then map a texture onto the image plane. This texture is a stylized image of a star; I created it from scratch in Photoshop. All of this is already set up in the star field template Maya scene. I'm just sharing a bit of framework thoughts for those interested.

maya scene – star field template

(Either read below or see the tutorial to the right)
1) Open the StarFieldTemplate.mb Maya scene file.
2) Make sure you have your scene project set up. Otherwise your initial state will get lost when you open up this scene again.
3) Choose density by ticking the timeline frame-by-frame. I always tend to need much more than I initially imagine. Try placing a test camera in the star field and doing some renders.
4) When satisfied, set an initial state for each of the 4 particle shapes.
5) Zero out the rate for each of the emitters. (but do not ever delete them!)
6) Delete the ppScale creation expression for each of the emitters.
7) Done! Now you can scrub the timeline and the sprites are locked in place. Try a test render!

Don’t Add Lights to the Stars
These stars don’t need to be lit with a light. They have incandescence that makes them self-lit. Any added lights should be light linked away or else it will contribute to lighting the sprites and make the stars brighter than expected.

If you have any issues rendering the sprites…
Go to Render Globals / Quality / Raytracing / Acceleration / Acceleration Method and change it to Regular BSP. Then try BSP2.

Editing the Randomized Scale Range
1) Go to the sprite shape node in the attribute editor / Per Particle (Array) Attributes / right click on ‘Expression…’ next to Sprite Scale Y PP / click Creation Expression.
2) Edit the numbers in the last part of the expression text area: rand(.8,1.2)
3) When finished, click ‘Edit’ in the Expression Editor and then close.

Hide Certain Sprites within the Star Field
1) Set your particle lifespan to ‘lifespanPP only’.
2) Right click on the particles and enter component mode.
3) Go to Window > General Editors > Component Editor
4) Find the ‘Particles’ tab, then find the ‘lifespanPP’ field.
5) Now you can select the sprites you want hidden in the viewport. Then type in 0 for their lifespanPP in the Component Editor.
6) After you're finished, you MUST set the initial state (again).


Fisheye & Spherical Collection: Supported Render Engines

Maya Scene Source – fisheye image / rendered with Domemaster3D

So you want to dive into creating fulldome content? Get ready for a wild and weird experience. You’re going to need a camera with a fisheye lens. Call it what you like: fisheye, fulldome, or domemaster… They all refer to the same thing.

A Note for VR Animators
3D Animation: Fisheye Lens Shaders (Plugins)
3D Animation: Render Engines with Built-in Fisheye
Compositing Software
Automatic Projection Conversion
Dome Mapping Software
Realtime Performance Options
VJ App Plugins
Virtual Dome Playback
Video Game Engine: Fisheye Options
Academic Research
Astronomy Software for Planetariums
360 Video & Panoramic Photo Stitching
Presenting in the Dome
Pseudo Fisheye Lens Shaders (Plugins)
Hacks for Faking Fisheye (After Effects)
Photography: Fisheye Lens Research
When All Else Fails… Hemicube Camera
Hacky Solution… Mirror Ball Camera

Updated with the latest fulldome & VR software on January 5, 2018

A Note for VR Animators

While my focus here is to collect software which renders fisheye, I have also kept my eye on software which supports spherical rendering. I'm not as thorough in this respect, but you will see a fair amount of mentions of spherical, and I hope this helps out the VR crowd. Know that the terms Spherical, Equirectangular, and LatLong refer to the EXACT same thing.

Maya Scene Source – spherical monoscopic image / rendered with Domemaster3D

Maya Scene Source – spherical stereoscopic image (over-under) / rendered with Domemaster3D

3D Animation:
Fisheye Lens Shaders (Plugins)

A render engine that offers a built-in fisheye is a recent expectation and it’s quickly becoming the standard. But there are plugins which can add a fisheye lens shader into a render engine that doesn’t have it built-in yet. Within the fulldome industry, Domemaster3D is a popular choice.

— Requires use of Mental Ray (also supports V-Ray and Arnold)
— Supports: Maya, 3DS Max, Softimage
— Installer includes both the monoscopic and stereoscopic fisheye lens shaders. Also includes monoscopic and stereoscopic spherical lens shaders.
— The wiki page provides an excellent explanation of the many included tools. Also this fulldome 3d introduction article is very helpful in understanding how to create stereoscopic content for a dome.
— Be aware of the grey blurry line fix and the Maya physical sky fix.
— Original source: DomeAFL & Roberto Ziche & Andrew Hazelden

Cinema 4D – (plugins supporting versions prior to R19)
— Blendy 360 Cam – monoscopic and stereoscopic – supports: spherical (monoscopic and stereoscopic), fisheye, cubemap, cylindrical, little planet
— WFCam4D – supports fisheye and spherical
— DomeCam – supports fisheye
— GreyScaleGorilla Camera – supports fisheye
CV-VR Cam – supports spherical (monoscopic and stereoscopic)

— DomeMaster – supports fisheye
— Houdini Fisheye Camera – supports fisheye
Houdini Stereo Spherical Panorama Camera – supports spherical

— Liberty3D UberCam / more info – supports fisheye and spherical

— Oblique FX Obq_Shaders (fisheye support using Obq_AngularCamera / very unique shader of interest: Obq_KettleUVStereoLens)
— Oblique FX Obq_Shaders ports available for Maya, Softimage, Cinema4D, Houdini

The Good Old Fisheye Camera – supports fisheye for Blender Internal
DomeCam – supports fisheye for Blender Internal (but requires Hugin for stitching the hemicube renders)

Mental Ray
LatLong lens shader – supports: Maya, 3DS Max, Softimage, Mental Ray Standalone – supports spherical
CubeMap – supports spherical

3D Animation:
Render Engines with Built-in Fisheye

These render engines have fisheye support built into their framework. Most offer it within the camera attributes.

— Create immersive previz renderings of scenes in fisheye, spherical, cylindrical, and cubic formats.
— Supports: Maya, Softimage
— Documentation

— Built-in fisheye and spherical for Cycles Render Engine
— Check out the Blendertarium forum
— Blender branch: Spherical Stereoscopic

— Supports: 3DS Max, Cinema 4D, Maya, Rhino, SketchUp, Softimage
Built-in fisheye, spherical, and stereoscopic spherical
Guide to Virtual Reality

Cinema 4D
Built-in fisheye, spherical, cubic-cross, cubic-3×2, cubic string
— Support added in R19

— Supports: 3DS Max, Maya
— Built-in fisheye and spherical

— Supports: Maya, Softimage, Houdini, Katana
Built-in fisheye and spherical
Creating a VR Camera Node (Maya)
VR Camera / source code

Maxwell Render
— Supports: 3DS Max, Cinema 4D, Houdini, LightWave, Maya, Modo, Rhino, SketchUp, Softimage
— CAD toolsets: ArchiCAD, Bonzai3d, Form·Z, MicroStation, SolidWorks, Revit
Built-in stereoscopic fisheye and stereoscopic spherical
Webinar: tips for the new stereo VR lens
PanoView Tool

FurryBall RT
— Supports: Maya, Cinema 4D, 3DS Max
Built-in spherical

Built-in fisheye and spherical

Built-in fisheye and spherical

Clarisse iFX
— Built-in fisheye and spherical

— Fisheye supported but needs a custom lens shader (none publicly available)
Built-in spherical
— Experimental support for stereoscopic spherical
— Any RenderMan compliant renderer supported (PRMan, 3Delight, Gorilla Render)

— Fisheye supported but needs a custom lens shader (none publicly available)
Built-in spherical

Redshift 3D
Built-in fisheye, spherical, and stereoscopic spherical

Built-in fisheye and spherical

Vue xStream Plugin for Maya
— Fisheye partially supported for Maya. But there are a few requirements: 1) Mental Ray is used as the renderer. 2) Install Domemaster3D and apply its fisheye lens shader to the camera. 3) Camera motion must be animated within Maya.
— Unknown if this technique works for 3DS Max, Cinema 4D, LightWave, or Softimage.

3DS Max
Built-in spherical: Panorama Exporter

Mandelbulb: 3D fractals
Mandelbulb 3D – built-in fisheye and spherical
Mandelbulber – built-in fisheye and spherical

Unknown Support of Fisheye in the following Render Engines:
— Vue – built-in spherical
— Octane – built-in spherical
— Modo – built-in spherical
— Corona Renderer – built-in spherical
— Indigo Renderer – built-in spherical
— Iray
— Fabric Engine
— FluidRay
— KeyShot
— Thea Render

Compositing Software

Being able to composite your render layers, with fisheye in mind, can be quite tricky. And there isn’t much built-in support for the fulldome production workflow. But there are a wide variety of plugins which add fisheye functionality.

Navegar Fulldome
— After Effects plugin
— Also includes an immersive camera for After Effects and Cinema 4D.
— Supports fisheye and spherical
— Supports conversion of: Azimuthal Equidistant (Fisheye), Aitoff, Hammer-Aitoff, Gnomonic (Standard), Stereographic, Orthographic, Vertical Perspective, Cylindrical Equidistant, Panoramic Equidistant, Spherical Mirror, Sinusoidal, Hemicube, TOAST
Documentation / workflow tutorial 1 & workflow tutorial 2

Nuke / Cara VR Plugin
— Supports fisheye and spherical (monoscopic and stereoscopic)

Skybox Studio
— After Effects plugin
— Supports spherical, mirror ball, fisheye, and cubic (Facebook, Pano2VR, NVIDIA)
Tutorial: Motion Tracking and Object Replacement with 360 Footage
Tutorial: Mocha Pro 5 compatibility

Skybox 360/VR Tools
SkyBox Post FX
SkyBox Post FX 2
SkyBox 360/VR Transitions
SkyBox 360/VR Transitions 2
— Plugins for After Effects and Premiere Pro
— 360 seamless effects: blur, denoise, glow, sharpen, fractal noise, digital glitch, chromatic aberrations, color gradient
— 360 transition effects: mobius zoom, random blocks, gradient wipe, iris wipe, light rays, chroma leaks, light leaks, spherical blur.
— Supports spherical (monoscopic and stereoscopic)
Tutorial: SkyBox Post FX
Tutorial: Skybox 360/VR Tools / Documentation

SkyBox VR Player
— Includes plugins for After Effects, Premiere Pro, and SpeedGrade which automatically create a bridge for easily viewing media in a VR headset
— Two modes available. 360 Mode: view your creation in 360° (spherical) space in your Rift. Workspace Mode: work in After Effects or Premiere Pro; great for tweaking things while wearing your Rift.
— Supports spherical (monoscopic and stereoscopic)

— After Effects plugin
— Supports spherical, fisheye, super fisheye, rectilinear, super wide-angle zoom, mirror ball, and parabolic lens-mirror projection formats
Overview video

— After Effects 3D template
— Supports fisheye and spherical
Documentation / video tutorial / workshop

Adobe Premiere Pro
Spherical support announced

Adobe After Effects
— If you need to convert 360 video to fisheye so it can be watched in a dome, then After Effects can do the trick (using built-in effects). Yet this technique has no ability to adjust the camera angle and you cannot change the FOV accurately. It’s just a very basic conversion.

Domemaster Fusion Macros
— Collection of custom macros for Fusion
— Supports fisheye and spherical
— Includes a camera rig that works with Fusion’s native 3D animation system
— Supports conversion of: Angular2CubicFaces, Angular2Equirectangular, Cubemap3x22CubicFaces, CubicFaces2Cubemap3x2, CubicFaces2GearVRMono, CubicFaces2GearVRStereo, CubicFaces2HorizontalCross, CubicFaces2HorizontalStrip, CubicFaces2HorizontalTee, CubicFaces2MrCube1Map, CubicFaces2VerticalCross, CubicFaces2VerticalStrip, CubicFaces2VerticalTee, CubicRenderer3D, Domemaster2Equirectangular, Equirectangular2Angular, Equirectangular2CubicFaces, Equirectangular2Domemaster180, Equirectangular2InverseAngular, Equirectangular2InverseDomemaster180, EquirectangularStereo2GearVRStereo, FisheyeMask, GearVRMono2CubicFaces, GearVRStereo2CubicFaces, HorizontalCross2CubicFaces, HorizontalStrip2CubicFaces, HorizontalTee2CubicFaces, MayaBackgroundGradient, MayaBackgroundGradientCubicFaces, MayaBackgroundGradientEquirectangular, MRCube1HorizontalStrip2CubicFaces, Offset, StereoAnaglyphHalfColorMerge, StereoAnaglyphMerge, StereoOverUnderExtract, StereoOverUnderMerge, StereoSideBySideExtract, StereoSideBySideMerge, UVAngular2EquirectangularGradientMap, UVDomemaster2EquirectangularGradientMap, UVEquirectangular2AngularGradientMap, UVEquirectangular2DomemasterGradientMap, UVGradientMap, UVPassFromRGBImage, VerticalCross2CubicFaces, VerticalStrip2CubicFaces, VerticalTee2CubicFaces

Domemaster Photoshop Actions Pack
— Collection of custom Photoshop actions
— Supports fisheye and spherical
— Supports conversion of: Inverse Angular Fisheye, Angular Fisheye to Equirectangular, Angular Fisheye to 2:1 Equirectangular, 180° Domemaster to 2:1 Equirectangular, Equirectangular to Angular Fisheye, 2:1 Equirectangular to Angular Fisheye, 2:1 Equirectangular to 180° Domemaster, 3×2 Cube Map to Cube Map, Vertical Cross to Cube Map, Horizontal Cross to Cube Map, Vertical Tee to Cube Map, Horizontal Tee to Cube Map, Vertical Strip to Cube Map, Horizontal Strip to Cube Map, Gear VR Mono to Cube Map, Cube Map to 3×2 Cube Map, Cube Map to Vertical Cross, Cube Map to Horizontal Cross, Cube Map to Vertical Tee, Cube Map to Horizontal Tee, Cube Map to Vertical Strip, Cube Map to Horizontal Strip, Cube Map to Gear VR Mono, Cube Map to New Cube Map, Cube Map Rotate X:+90 Degrees, Cube Map Rotate Y:+90 Degrees, Cube Map Rotate Z:+90 Degrees, Stereo Side by Side Extract, Stereo Over Under Extract

— Supports fisheye and spherical

Mocha VR
— Supports spherical

Super Cubic
— Collection of Photoshop filters
— Supports spherical

GoPro VR Plugins: VR Horizon & VR Reframe
— Premiere Pro Plugin
— Supports spherical

SUGARfx Revolve 360
— Plugin for Final Cut Pro X, Motion, After Effects, Premiere Pro
— Supports spherical

360VR Toolbox /// 360VR Express
— Plugin for Final Cut Pro, Motion, Premiere Pro, After Effects
— Can convert between fisheye and spherical

Canvas 360
— Plugin for After Effects
— Supports spherical

Liquid Cinema
— Supports spherical

Wonda VR
— Supports spherical
Software demo

— After Effects plugin
— Supports fisheye

Digistar Virtual Projector
— After Effects plugin
— Supports fisheye

Automatic Projection Conversion

Below are some options which convert between various projections automatically. The options for keyframeable conversion are listed above in the Compositing Software section.
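None of these tools publish their internals, but at heart most of these conversions are a per-pixel remap between projections. As a hedged sketch (my own code, not taken from any tool listed here), this is the classic mapping from an equirectangular (spherical) source to a 180° domemaster fisheye, assuming the equidistant fisheye model:

```python
import math

def fisheye_to_equirect_uv(u, v, aperture_deg=180.0):
    """For a destination domemaster-fisheye pixel (u, v in [0, 1]),
    return the source (u, v) in an equirectangular image, or None
    if the pixel falls outside the fisheye image circle."""
    # Normalized coordinates centred on the fisheye circle.
    x = 2.0 * u - 1.0
    y = 2.0 * v - 1.0
    r = math.hypot(x, y)
    if r > 1.0:
        return None  # outside the image circle: usually rendered black
    # Radius maps linearly to the angle from the zenith
    # (the equidistant / "angular" fisheye model).
    theta = math.atan2(y, x)                     # azimuth around the dome
    phi = r * math.radians(aperture_deg) / 2.0   # angle from dome centre
    # 3D view direction (y-up, dome zenith along +y).
    dx = math.sin(phi) * math.cos(theta)
    dy = math.cos(phi)
    dz = math.sin(phi) * math.sin(theta)
    # Direction -> longitude/latitude -> equirectangular UV.
    lon = math.atan2(dz, dx)                     # [-pi, pi]
    lat = math.asin(dy)                          # [-pi/2, pi/2]
    return ((lon + math.pi) / (2.0 * math.pi),
            (lat + math.pi / 2.0) / math.pi)
```

A real converter loops this over every destination pixel and bilinearly samples the source image; the keyframeable tools above additionally let you rotate the view direction before the lookup.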

Flexify 2
— Photoshop plugin
— Supports fisheye, spherical, & wide array of projections / preview output modes

— A command-line script that uses the open-source Panotools library & MPRemap application to automate converting image sequences between multiple panoramic formats.
— Supports conversion of: angular 2 cylindrical, angular 2 LatLong, dome 2 rectangular, LatLong 2 cubemap 3×2, LatLong 2 cubic, LatLong 2 cylindrical, LatLong 2 dome, LatLong 2 horizontal cross, LatLong 2 rectangular, LatLong 2 Mental Ray horizontal strip cube1, LatLong Stereo 2 GearVR stereo, LatLong 2 vertical cross

Realtime Conversion of Spherical to Fisheye
VR to Dome Converter – more info
Pano2Dome – openFrameworks App
Pano2Fishy – Quartz Composer patch

Realtime Warping of Movie Playback for Domes
Quartz Composer – warp mesh patch / example of use – supports fisheye
VLC Warp Movie Player – supports fisheye
Quicktime Warp Movie Player – supports fisheye

NVIDIA Texture Tools
— Photoshop plugin
— Convert between different cubemap layout formats (GearVR, Octane Render, Vray Cubic 6:1 horizontal strip format)

libcube2cyl is a panoramic library for cubic-to-cylindrical projection conversion, available in both C and C++.

Hemicube Stitching Software
PineappleWare Stitcher
Domemaster Stitcher

Environment Mapping Tools
— Supports conversion of: rectangular, mirror ball, fisheye, hemi fisheye, cubemap

Dome Mapping Software

Building a physical dome is one challenge, but mapping the dome to work with multiple video projectors is a whole different challenge. Using one computer for the dome projection enables all sorts of amazing opportunities. All of the options below support the open source Syphon technology, which means that many VJ apps can easily be used (such as Resolume, VDMX, Modul8, Millumin, CoGe, etc.)

Blendy Dome VJ
— A realtime domemaster slicer designed for Fulldome VJing.
— Slice/warp up to 8 projectors in realtime
— Inputs accepted: fisheye and spherical
— Supports Syphon

— A tool developed for artists, VJs, set designers & VR creators to bring immersive experiences back into the real world. At its core it’s an arbitrary surface projection mapping tool.
Documentation / Software demo
— Slice/warp 6 projectors in realtime (unlimited with commercial license)
— Projection mapping modes: whole sphere, fulldome, box (cave), planar
— Inputs accepted: fisheye, spherical, cubic, planar
— Supports Syphon

— Multi-channel projection software designed for domes. Provides realtime warping and slicing for domemaster input.
Documentation / Presentation from IMERSA 2015: Intro to vDome
— Slice/warp unlimited projectors in realtime
— Inputs accepted: fisheye
— Supports Syphon

— A multi-projector modular mapping software primarily targeted toward fulldome mapping.
— Provided that the user creates a UV-mapped 3D model of the projection surface, Splash will take care of calibrating the video projectors (intrinsic and extrinsic parameters, blending and color) and feeding them the input video sources. Splash can handle multiple inputs mapped onto multiple 3D models, and has been tested with up to eight outputs on two graphics cards.
Documentation / codebase activity log / SAT info / author info
— Slice/warp up to 8 projectors in realtime (may support more)
— Inputs accepted: fisheye
— Supports Syphon

— Automatically calibrates your projectors with a compatible webcam and performs real-time stitching and blending of a Spout shared video stream
— Supports Spout

— A platform for realtime interactive animation on a dome. It manages both 2D and 3D content for projection onto a fulldome installation with multiple projectors.
— Slice/warp 4 projectors in realtime
— Inputs accepted: fisheye
— Supports Syphon

Realtime Performance Options

With the rise of fulldome, VJing, and creative coding as an art form… it was only a matter of time before the artists were knocking down the doors.

— Built-in fisheye and spherical
— Supports Spout and NDI

— Built-in fisheye (includes support for MirrorDome projection warping)
— Supports Syphon

Domemaster addon – fisheye addon
ofxEquiMap – spherical addon
Syphon plugin

Touch Designer
Dome camera rig / example of use – fisheye addon
Projection TOP – built-in operator that converts a cubemap to fisheye or spherical

Dome Pack – fisheye addon
Dome camera rig – fisheye addon
LatLong camera rig – tutorial for spherical support

Domemaster patch – fisheye addon
Syphon plugin

Domemaster template – fisheye addon
Planetarium library – fisheye addon (supports Syphon) – more info
Syphon plugin

— Built-in fisheye and spherical

VR-Plugin – explore Maya scenes in realtime stereo 3D (with a HMD)

— Media playback software – built-in fisheye and spherical

— Fulldome realtime music visualizer – built-in fisheye

— Fulldome realtime music visualizer – built-in fisheye
— Supports Syphon & Spout

— Javascript framework to build immersive experience for the web – supports spherical

VJ App Plugins

Plugins for VJ software that help you convert footage for use in the dome

Fulldome VJ
— FreeFrame GL (FFGL) plugins
— Collection of effects and mixers which can be used in any software which utilizes FFGL plugins

360 VJ Pack
— Resolume plugins
— Realtime conversion of spherical to fisheye, rectilinear to fisheye, and fisheye adjustment

Virtual Dome Playback

If you don’t have access to a physical dome, then you can simulate your content on a virtual dome.

Amateras Dome Player
— Built-in fisheye and spherical

— Built-in fisheye
— Supports Syphon

Auto-Parasitic Light Calculation Tool
— Tool which simulates how projected light bounces off the dome and washes out the other side of the dome (also known as cross-bounce).
— Built-in fisheye

Navegar DomeView
— Built-in fisheye, cylindrical, spherical
— Able to import a custom 3D model of the planetarium

Domemaster3D DomeViewer
— Built-in fisheye, cylindrical, spherical, cubic (along with 12 other panoramic viewing modes)
— Available for Maya and Softimage

Inside Dome
— Built-in fisheye

DomeMod & DomeTester
— Built-in fisheye
DomeMod (web browser version) / source code

VR Headset Video Players
Whirligig – built-in fisheye (140°, 180°, 240°, 360°), spherical, cubic
GoPro VR Player (previously known as Kolor Eyes) – built-in spherical
Scratch Play / Scratch VR – built-in spherical, ASC CDL color-correction tools
Live View Rift – built-in fisheye, cylindrical, spherical
RV Player – built-in spherical (how-to enable it)
Total Cinema 360 – built-in spherical
VR Player – built-in fisheye, cylindrical, spherical, cubic, planar
RiftMax – built-in spherical
Virtual Desktop – built-in spherical
360VR Viewer – built-in spherical
VorpX – built-in spherical

VLC Media Player
VLC 360 – built-in spherical
VLCSyphon – includes a Syphon server
— Built-in fisheye and spherical (monoscopic and stereoscopic)

Video Game Engine: Fisheye Options

The bleeding edge of fulldome work is in adding an element of interaction. Here are a few pieces of software which support fisheye.

Blender: Game Engine
— Built-in fisheye / example of use
Built-in spherical
— Add-on: BlenderVR

Fulldome Camera For Unity – Supports Syphon, BlackSyphon, Spout, NDI
— Domemaster package
Data Dome Unity Toolkit – more info / documentation
Custom fisheye camera rig / more info
— Spherical rendering (not realtime): VR Panorama 360 Renderer, Spherical Image Cam, 360 Panorama Capture
Syphon plugin / Spout plugin

— Unity3D camera – supports fisheye
tutorial 1 & tutorial 2

Omnimap API
— Supports fisheye
— OpenGL, DirectX9, and DirectX10

— Video game capture
— Built-in spherical support (mono and stereo)

Unreal Engine
— No fisheye solution yet, but spherical rendering is supported (not realtime)
UE4 Stereo 360 Movie Export Plugin
DIY spherical rendering – requires the engine to be built from source code
Stereo Pano Exporter thread

SGCT – Simple Graphics Cluster Toolkit
— Cross-platform C++ library for developing OpenGL applications that are synchronized across a cluster of image generating computers.

Paul Bourke
— Interactive fulldome research

Fisheye Video Games
Fisheye Quake & PanQuake
Dome Breaker
Fulldome Space Shooter
Fulldome Bootcamp

The Witness
— Documentation on how to create a fisheye lens shader

Unknown Support of Fisheye in the following Game Engines:
— CryEngine
— Stingray Engine
— Source Engine
— Leadwerks Engine
— id Tech 4
— NeoAxis 3D Engine

Academic Research

A random assortment of interesting research papers and experimental research that I’ve stumbled across.

Projection from Spheres
— Various transformations and projections as they apply to computer graphics

Rendering Omni-directional Stereo Content (Google Cardboard Dev)
— Pseudocode examples of how to create a virtual ODS camera
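The core idea from the Google doc is simple to sketch: every output column’s ray origin sits on a circle whose diameter is the interpupillary distance, with the ray tangent to that circle. A hedged Python version (function and parameter names are mine, not Google’s, and the axis conventions are an assumption):

```python
import math

def ods_ray(theta, phi, eye, ipd=0.064):
    """Hedged sketch of an omni-directional stereo (ODS) camera ray.
    theta: yaw of the output column (radians); phi: pitch of the row.
    eye: -1 for left, +1 for right. Returns (origin, direction)."""
    # Ray origin lies on the viewing circle (radius = half the
    # interpupillary distance), offset sideways from the view axis.
    r = eye * ipd / 2.0
    origin = (r * math.cos(theta), 0.0, r * math.sin(theta))
    # Ray direction from yaw/pitch on the unit sphere; the horizontal
    # component is tangent to the viewing circle, which is what makes
    # the stereo disparity consistent in every viewing direction.
    direction = (math.sin(theta) * math.cos(phi),
                 math.sin(phi),
                 -math.cos(theta) * math.cos(phi))
    return origin, direction
```

A renderer builds the over/under stereo pair by tracing one such ray per output pixel, with `eye` flipped between the two halves.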

Perceptually-Based Compensation of Light Pollution in Display Systems
— “When projecting onto a dome, the projected image is corrupted by indirect scattering (cross-bounce). Our techniques retains more of these dark details while maintaining a final observed image that is perceptually closer to the unpolluted original image.”

Panoramic Video from Unstructured Camera Arrays
— “We describe an algorithm for generating panoramic video from unstructured camera arrays. In this paper we extend the basic concept of local warping for parallax removal.”

The AlloSphere
— Research facility with a unique spherical theater
— Check out their open-source software, specifically Cosm

Fast Non-Linear Projections using Graphics Hardware
— Research paper with full source code available

Real-Time Spherical Panorama Image Stitching Using OpenCL
— Research paper about webcam stitching in realtime

Astronomy Software for Planetariums

DigitalSky, Uniview, and Digistar are the main software packages used by large planetariums. WorldWide Telescope, Stellarium, Nightshade, and Starry Night Dome are often used by portable domes or those on a tight budget.

World Wide Telescope
Shira Universe
Starry Night Dome
Sky Explorer
Event Horizon

360 Video & Panoramic Photo Stitching

Multi-camera 360 video rigs require stitching all of the footage together. Panoramic photos too.

Autopano Video / Autopano
VideoStitch Vahana VR / PTgui
Mistika VR
Cara VR Plugin (Nuke)
Facebook Surround 360
Radeon Loom 360
Insta360 Pro Stitcher
Kodak Pixpro 360 Stitch
Mill Stitch

Presenting in the Dome

Giving a talk at a conference… in a dome? This will help.

Presenter360 – supports fisheye
Powerpoint Dome Template – supports fisheye

Pseudo Fisheye Lens Shaders (Plugins)

These Maya plugins distort the camera view into a circular shape, instead of capturing 180°. Not recommended if you want true scene immersion.

— JS_Fisheye / JS_Panoramic – Linux & Mac
— Shaders_P
— MentalCore

Hacks for Faking Fisheye (After Effects)

Do you need to fake the distortion of a fisheye lens? There are a few built-in effects within After Effects which can do the job. But be aware that you cannot simply take any flat image and perfectly convert it to fisheye. That is impossible. But you can sometimes use these hacks to enhance the illusion and better warp the footage onto the dome. Most of these are variants of the ‘Spherize’ effect.


Photography: Fisheye Lens Research

This is by no means comprehensive, but instead meant to be a collection of interesting gems that I’ve stumbled across. This is NOT a list of suggested fisheye lenses to use for modern photography or video. I’m merely collecting interesting research and documentation of how fisheye lenses work.

The Fish List – fisheye lens catalogue
Fisheye optics explanation
Fisheye lens designs and their relative performance
Fisheye lens options for digital 4k projectors

Michel Thoby: Panorama Research
About the various fisheye projections of the photographic objective lenses
Creation and use of synthetic circular fisheye images
Introduction to the Least-Parallax Point (LPP) theory
Review of the Canon EF 8-15mm f4 Fisheye Zoom

More than 180 – by Warik Lawrance
Many fulldome producers adhere to the strong belief that only one type of lens should be used in a planetarium. That lens is a 180° fisheye, as it perfectly fits the shape of the dome and causes no distortion of the image. However, this is akin to a film producer or cinematographer only ever using a 50mm standard lens on the reasoning that it perfectly matches what the human eye naturally sees. In reality, cinematographers use a wide range of lens sizes to tell their stories because they are incredibly useful and powerful tools which assist the narrative. So what is the range of lens sizes that can work in the dome? What are the advantages and disadvantages of using different lens sizes? And how can they be used to the best effect?

History of fisheye lenses for photography
Nikon Fisheye – Nikkor: the full story from the origins to the present, with 9 prototypes including a 5.5 mm FoV 270°
Perimetar, Sphaerogon, Pleon: the definitive compendium about these super-wide fisheye lenses of the ’30s conceived by the firm Carl Zeiss Jena

Salvador Dali at a book signing, by Philippe Halsman, 1963

When All Else Fails… Hemicube Camera

Sometimes fisheye lens shaders don’t work in certain weird situations, such as rendering certain types of particles, post-render actions, experimental shaders, and whatnot. In these cases you can fall back on a hemicube multi-camera rig and then stitch the renders together into a fisheye.
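The stitching step boils down to this: for each fisheye output pixel, figure out which of the 90° cube-face cameras saw that view direction, and where. A hedged sketch of that face lookup (the function name and face labels are mine, and UV orientation conventions vary between renderers):

```python
def cube_face_lookup(d):
    """Given a view direction d = (x, y, z), return the cube face it
    hits and the (u, v) in [0, 1] on that face's render. Face names
    are arbitrary labels: '+x', '-x', '+y', '-y', '+z', '-z'."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    # The dominant axis decides which 90-degree camera saw this direction.
    if ax >= ay and ax >= az:
        face, major, u, v = ('+x' if x > 0 else '-x'), ax, (-z if x > 0 else z), y
    elif ay >= az:
        face, major, u, v = ('+y' if y > 0 else '-y'), ay, x, (-z if y > 0 else z)
    else:
        face, major, u, v = ('+z' if z > 0 else '-z'), az, (x if z > 0 else -x), y
    # Project onto the face plane, then shift from [-1, 1] to [0, 1].
    return face, (u / major + 1.0) / 2.0, (v / major + 1.0) / 2.0
```

Combined with a fisheye-pixel-to-direction mapping, this gives a complete hemicube-to-fisheye stitcher: only the faces covering the front hemisphere ever get sampled for a 180° dome.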

Hacky Solution… Mirror Ball Camera

This is my least favorite option for rendering fisheye images. The idea is that you point the camera at a perfectly reflective sphere. So it requires that you have ray tracing turned on, and probably the number of reflection bounces raised. But the huge limitation is that your renders will not contain an alpha channel! And for me, that is simply unacceptable.
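The reason this trick captures such a wide view is plain reflection math. A hedged sketch, assuming an orthographic camera looking along -z at a unit mirror ball (my own simplification, not any renderer’s actual setup):

```python
import math

def mirror_ball_direction(x, y):
    """For a point (x, y) on the image of a unit mirror ball
    (x*x + y*y <= 1), return the world direction the camera sees
    via reflection. Assumes an orthographic camera looking along -z."""
    r2 = x * x + y * y
    if r2 > 1.0:
        return None  # off the ball
    # Surface normal at the visible point of the sphere.
    nz = math.sqrt(1.0 - r2)
    # Incoming view ray v = (0, 0, -1); reflect: d = v - 2 (v.n) n
    vdotn = -nz
    return (-2.0 * vdotn * x,
            -2.0 * vdotn * y,
            -1.0 - 2.0 * vdotn * nz)
```

Note how the centre of the ball reflects straight back at the camera, while the rim reflects almost directly behind the ball, which is why one render of a sphere can cover nearly the full environment.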

Background Stars v1

(For an improved version of this star globe, please see Background Stars v2.)

Do you need night sky stars with accurate magnitudes and a good-looking milky way for the background of your whole Maya scene? Do you want 360 degrees of coverage without any stitch marks, seams, or pole pinching? Well, here is what I call the ‘star globe’. (Named for ease of communication with my team.)

Just import it, point-constrain it to your camera, and scale the star globe to surround your entire scene. This ensures that the star globe will follow the camera’s position but not its rotation. For a final render, we typically use preview quality settings and don’t see any aliasing/blinking of the stars. We usually set the fisheye camera to a 500 focal length to sidestep the grey blurry line problem, but it depends on how large you’ve scaled the star globe (you’ll know easily if the stars look blurry and weird).

Also, make sure to check the light linking of the star globe so that there are no lights attached. This is because the star globe material has its texture set to incandescence. This ensures that it will always look the same without having to worry about other light sources accidentally brightening the stars.

maya scene – star globe


These stars are optimized to look like pinpoints of light specifically on a dome. But if your final screen is flat instead, the stars might appear more like spheres of light. In this case I would suggest lowering the camera focal length: more stars will be in the shot and they will appear smaller. And if you don’t want to change the focal length for your shot, then create a render layer override instead.
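The focal-length trick is just the standard pinhole field-of-view relationship. A quick hedged sketch (the 36mm horizontal aperture here is my assumption of a full-frame-style camera back, not necessarily your Maya camera’s film back):

```python
import math

def horizontal_fov_deg(focal_length_mm, aperture_mm=36.0):
    """Horizontal field of view for a simple pinhole camera model."""
    return math.degrees(2.0 * math.atan(aperture_mm / (2.0 * focal_length_mm)))
```

Halving the focal length roughly doubles the angular coverage, so each star occupies fewer pixels and reads as a pinpoint again.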

The stars were originally rendered in Sky-Skan’s DigitalSky software. PTgui was then used to re-project the night sky images from circular fisheye to equirectangular (what I call ‘defishing’); this was needed for easy UV mapping of a sphere in Maya. We spent a lot of time in the dome optimizing the look of the star pinpoints, color saturation, and halo size, so we now use these stars for all of our fulldome productions. Please remember that everyone has different standards for what a star background should look like; this one matches our criteria after much discussion about what we liked and didn’t like from other fulldome shows. What looks great on a computer screen does not often transfer to the dome in the same way; this was created to look crisp in the dome.

Note: If your renders are starting up slowly, then use Photoshop to convert each of the star globe source images into IFFs (type: maya, not amiga). Then point the Maya texture nodes to the IFFs instead. The reason is that IFFs are memory mapped, meaning Maya will only load into memory the parts of the image that it actually needs to render. If you want even faster renders, convert the textures into Mental Ray MAP files.
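Memory mapping isn’t Maya-specific; the same mechanism exists in every language. A hedged Python illustration of why it helps (this only demonstrates the OS-level behaviour of mapping a large file and touching one small region, not Maya’s actual IFF reader):

```python
import mmap
import os
import tempfile

# Write a "texture" far too big to want fully in memory.
path = os.path.join(tempfile.mkdtemp(), 'fake_texture.bin')
with open(path, 'wb') as f:
    f.truncate(64 * 1024 * 1024)   # 64 MB file, mostly zeroes
    f.seek(32 * 1024 * 1024)
    f.write(b'TILE')               # one "tile" we actually care about

# Map the file: no bulk read happens here.
with open(path, 'rb') as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Reading a slice faults in only the pages backing it; the rest
    # of the 64 MB stays on disk, like a renderer sampling one small
    # region of a memory-mapped texture.
    tile = mm[32 * 1024 * 1024 : 32 * 1024 * 1024 + 4]
    mm.close()

print(tile)  # b'TILE'
```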

Custom Maya Camera for Fulldome Production

Here is a custom Maya camera I’ve made for our fulldome productions. Just import and go! (Requires Domemaster3D)

maya scene – custom fulldome camera rig

— Uses an aim/up to point the camera. It is applied in a way that allows you to aim the camera and not worry about the camera Z rolling within 180° field of view.
— A hemicube camera rig is parented within the fisheye camera (and hidden), so it automatically uses the aim. If you ever need to switch over to the hemicube cam for whatever reason, it’s there and waiting for ya!
— Includes a dome visualization that isn’t selectable. It just lets you know where the camera is facing. It’s also helpful as a guide to see where the springline hits in a scene. I called it the “FYI”.
— Click the camera to see the custom attributes: FYI scale, FYI visibility, FYI Uni Helper, Cam Locator Scale, and Custom Roll. Includes the ability to scale the camera locator without affecting the cam scaleXYZ. (Since changing cam scaleXYZ changes how textures are placed onto surfaces.)
— Preset Domemaster3D settings. (This assumes you already have Domemaster3D installed). Check out this simple tutorial to apply the DomeAFL lens shader to a Maya camera.
— It’s by no means perfect and definitely has a gimbal-lock-camera-flip if you go past 180°, but it works well if you know its limits. Play around with it and you will quickly understand.