Interviews at IMERSA 2015 – Recent Challenges

The IMERSA Summit 2015 was intense, fascinating, and gave me a fresh sense of where fulldome is headed. Each of the presentations/panels was well prepared, and technical problems were solved quickly. And the number of attendees is just at that equilibrium where you can still meet a fair amount of people.

I ran into many people who are clearly passionate about fulldome, and it was inspiring to hear their unique perspectives and experiences. So I realized that I should document a slice of these conversations with one interview question:
What challenges have you recently faced?

Please chime in and share your own recent challenges in the comments below.

IMERSA Summit 2015: Panels


We have been asked to participate in several panels at the upcoming IMERSA Summit 2015. So I’ll be a part of the Future Immersion Panel and share some of the techniques we used to shoot 360 video while on location at NASA’s Goddard Space Flight Center. Planetarium director David Rabkin will be involved in two panels: Immersive Media Strategies and Challenges and New Directions in Alternative Content.

I’m also excited to announce that From Dream to Discovery has been selected to be screened during IMERSA. It will be shown directly after the Future Immersion Panel, which is perfect since you’ll have just seen behind the scenes.

IMERSA Summit 2015 – Agenda

Thursday February 26, 1:30-2:30 pm
Producers’ Panel: Immersive Media Strategies and Challenges (prerendered fulldome production)
We’ve seen creative and excellent fulldome work in recent years. But we’re still at the beginning of our understanding of experiences made possible via the fulldome medium. With each new concept, we experience new challenges. So we’re early on a steep learning curve. To encourage us to keep moving along it, let’s take on several overlapping-yet-different viewpoints – educator, producer, artist, and visualizer – and explore what’s possible in the fulldome medium and ponder some of the creative and technical challenges that together we face.

Thursday February 26, 8:30-9:30 pm
Future Immersion Panel
From high-resolution camera rigs for live-action capture to innovative immersive spaces; from branded content to breakthroughs in science education programming, this session will give you a front row seat for where immersive media can go next.

Thursday February 26, 10:00 pm
Fulldome film screening – From Dream to Discovery: Inside NASA
Experience the excitement of today’s space missions as you journey from NASA’s test facilities all the way to Pluto. Immerse yourself in the adventure and extremes of spacecraft engineering – from the design of missions like the James Webb Space Telescope and New Horizons, to the rigors of testing, launch and space operations. When humans dare to dream we create truly amazing things.

Saturday February 28, 3:30 pm
New Directions in Alternative Content
Since fulldome is capable of displaying essentially any content, what are some alternate uses of the medium that theaters are using now, and what are other potential uses to delight and engage future audiences?

David Rabkin (Planetarium Director) & Jason Fletcher (Science Visualizer)

Interactive 360 Video – NASA Goddard Space Flight Center


A while back we visited NASA’s Goddard Space Flight Center to shoot 360 video for our recently released planetarium show From Dream to Discovery. So when 360Heros approached us about writing an article to look behind the scenes, we realized that we could share an interactive video where you can control the perspective.

Watch the Interactive 360 Video

Experience it in a web browser, or on an iPhone, Android phone, Google Cardboard, or Oculus Rift.

If you’re curious to learn more about my experience in shooting 360 video while at Goddard, then be sure to check out the interview too. I share some details about why shooting 360 video for a planetarium dome requires a unique approach. You can also view the whole photo album from our trip to Goddard.

From Dream to Discovery: Inside NASA Engineering – Watch the Trailer

Experience the challenges of the next generation of space exploration in this brand-new Planetarium show. By using exciting real-life projects like NASA’s James Webb Space Telescope and the New Horizons mission to Pluto, the show highlights the extreme nature of spacecraft engineering and the life cycle of a space mission – from design and construction to the rigors of testing, launch, and operations. Blast off and take the voyage with us!


It’s been an intense few months. Here are a few things we’ve accomplished:

  • Presented a planetarium workshop and teacher workshop.
  • Traveled to NASA’s Goddard Space Flight Center to shoot 360 video.
  • Held check-ins with NASA engineers to approve the science.
  • Created extra educational modules that dive deeper into select topics.
  • Fulfilled the formative/summative NASA review.
  • And this week we opened the show to the public!

We’ve finished the concentric version of the show and we’re now working on a unidirectional version which will be finalized for distribution in February 2015.

NASA Show – Sneak Peek

The blog has been quiet recently because we’ve been knee deep in a show production. So here is a quick look at what’s been cookin’.

We received a NASA grant to create a planetarium show about spacecraft engineering and the challenges of designing, testing, and launching! Also we wanted to feature real imagery, so we traveled to NASA Goddard and shot 360° video at their expansive test facilities.

From Dream to Discovery: Inside NASA Engineering
Available for distribution in early 2015

Glimpse Behind the Scenes

Below are some in-progress scenes from the actual fulldome show. We are currently in the polishing stage and we’re working to add the final details. I can’t wait to share more…

IPS-Macao Fulldome Festival: SELECTED!

Our show Moons: Worlds of Mystery has been selected for the competition screening during the IPS-Macao International Fulldome Festival 2014! So if you’re attending the IPS Conference, then be sure to mark your schedule.

After the festival, we have the added honor of being one of only six productions to continue screening for the public in Macao (6/21 – 7/31) and Beijing (6/28 – 7/31).

Stills From Moons: Worlds of Mystery

News: IX Symposium, Cinefex on Fulldome, 8k Research, Illustris Simulation


IX: Symposium on Immersive Creativity

In just a few days this fascinating first-time conference will begin in Montreal and last for 5 days. Last year I saw the SAT team perform at IMERSA 2013, and they were already doing some cutting-edge work. So I’m curious to see how this event kicks off.

The IX Symposium will provide an open exchange platform through a series of keynotes, panels, demonstrations, performances and immersive productions. An objective of the symposium is to help democratize access to immersive spaces and to the tools and processes used for the creation of original contents. Another objective is to foster a network facilitating the creation of a community and the circulation of people, ideas, and works. This will help rethink the models of production, the formats, the delivery systems, and the creative processes needed to maintain and nurture an international platform for strong artistic expressions and open innovation.

Shortcuts: An Experimental Dance Film Competition

We are looking for videos that look at dance through the new medium of 3-dimensional/spherical cinematography. Dance and an experimental approach should be at the center of the work. The focal point of interest should be real filmed scenes, not computer-generated and/or animated film sequences.


Cinefex: Fulldome – Films in the Round

For Cinefex to write an article exclusively about fulldome, well, that’s an exciting sign of the times and simply cannot be ignored. It’s refreshing to read this thorough overview, which speaks to the particular nuances of fulldome production: the need to be a CG generalist, slow camera movements, cross-bounce worries, and using the Oculus Rift as a dome preview tool. And I can definitely agree when I read “Everything we do ends up on the dome”. It’s true; to work under a tight timescale you must use basically everything you render. By the time you render at 4k, you already know the edit and feeling of the show, so you’re just finalizing the visuals at that point. There is no post-production edit stage; that instead happens during the animatic process, since working at 4k is not forgiving. Truly an exciting article, and perhaps now the CG community will have its eyes wide in awe.

Fulldome 8k Research – Is There Any Point?

An in-depth and well-researched thought experiment into 8k production concerns. And then to push the idea further, they theorize the maximum possible retinal resolution in the dome. What a fascinating write-up of the many aspects I have been curious about. Anyone in fulldome production must read this. Although I’m not quite sure how to hybridize the display resolution research of Paul Bourke into this article.

An Initial Look at the Oculus Rift HD for Fulldome Previewing

The Oculus Rift is starting to become an interesting way to preview fulldome dailies and watch shows when you don’t have immediate access to a dome. Aaron had early access to an HD prototype and has figured out the best domemaster input for the optimal experience.


Illustris: New Universe Simulation
Just one week ago, the Illustris simulation was announced to the world and the dataset has been opened up to outsiders. The simulation is simply stunning, as it is able to simulate the galaxies themselves moving within the cosmic web. Around 1 minute into the video is where the gas temperatures begin to cascade, and I’ve never seen anything like it. A cosmic-scale timelapse. And I expect we will see it on a dome sometime soon… Now freely available as 4k and 8k domemasters through the ESO. An Oculus Rift stereo render is also available (bottom of the page).

Planetary – Documentary Trailer
While not created for fulldome, how can I ignore such a beautiful film which speaks directly to something so close to my heart? It is this exact caliber of work that I wish to see in the dome. An expression of the shared struggle to explore the science collectively and combine it with the important but often ignored emotions of our daily struggles in these wild times.

“Planetary is a feature length documentary that expands on the mind-blowing perspective shared by astronauts and indigenous elders alike — that each and every one of us is inseparable from each other, the planet and the universe as a whole, a realisation that is both humbling and comforting. If more of us start to experience the simple truth of our interdependence and connection, we believe there is hope for the next stage in our human journey. Planetary is meant to inspire awe and wonder by seeing things as they are, placing our individual stories in the context of the larger planetary / cosmic story.”

Crossing Worlds – Fulldome Short Film
A visual tone poem designed for the emerging fulldome planetarium format, Crossing Worlds utilizes spherical photography from the American desert west to immerse the viewer in a transcendent spectrum of austere landscapes. “Crossing Worlds served to remind audiences of how extraordinary our home planet is. It was the perfect complement to the data-driven visualizations of global changes, deepening appreciation for the critical importance of the natural world for sustaining us emotionally as well as physically” said David McConville, co-founder of The Elumenati.

Fiske Fulldome Basics
It’s difficult to explain to the public the resolutions which modern fulldome can achieve. For instance, when most people hear of 4k they think of the TV standard. So it can be tricky to explain that 4k refers to something different in the fulldome standard. This video is a wonderful and simple explanation of the differences between 1080 broadcast television and 8k fulldome.
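To make the resolution gap concrete, here’s a quick back-of-the-envelope comparison. It assumes the common square-frame domemaster convention (4096×4096 for 4k, 8192×8192 for 8k); actual production resolutions vary by theater.

```python
# Compare total pixel counts: broadcast HD vs. square domemaster frames.
# Assumes 4k = 4096x4096 and 8k = 8192x8192 (conventions vary by theater).
formats = {
    "1080 broadcast": (1920, 1080),
    "4k domemaster": (4096, 4096),
    "8k domemaster": (8192, 8192),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / hd_pixels:.1f}x HD)")
```

So a “4k” domemaster is not 4x a broadcast HD frame but roughly 8x, and 8k is roughly 32x, which is exactly the kind of distinction that trips people up.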

Waiting Far Away – Fulldome Short

“An explorer of the cosmos has traveled too far… And can’t find home.”

In the creative process of producing planetarium shows, we often come across imagery that is stunning but doesn’t work in the context of a science show. And so our collection of fulldome astronomy art animations has matured into a hybrid form of storytelling where we mix imagination with real data.

What particularly excites us about backburner short projects is that we produce them quickly. It’s refreshing to just have a clear vision and then immediately crank it out. In the future we plan to continue creating fulldome shorts. Our next short will focus on a bleeding-edge science topic. But we have other ideas for experiences in outer space that don’t fit your typical science show. More to come!

Domemasters Freely Available

  • Available for planetarium use. Contact me and I’ll give you an address to ship an external hard drive. Please include a SASE or IRC.
  • 4k domemaster frames, 30fps, 5.1 & stereo audio (200GB).
  • 1k Quicktime MOV available for download by request (2.6GB).

Terms: permission to freely screen to the public in planetariums as you see fit. You must screen the short in full and unedited. Not to be used in other shows without permission.


Wish to translate and record the narration into a language that better suits your audience? Let me know and I can share the music-only 5.1 surround mix.

Languages Available
— Spanish
— German
— French
— Italian
— Polish
— British English
— Portuguese
— Russian
— Korean
— Telugu


Conferences & Festivals
— Jena Fulldome Festival 2014 (Jena, Germany)
— 2014 European Symposium of Planetariums (Lucerne, Switzerland)

International Planetariums
— Macao Science Center Planetarium (Macao, China)
— Planetarium Wolfsburg (Wolfsburg, Germany)
— Sri Sathya Sai Space Theatre (Puttaparthi, India)
— Baikonur Planetarium (Poland)
— Il Planetario (Bologna, Italy)
— Orbit Night Sky Planetarium (Kolkata, India)
— Roi-Et Planetarium, Science and Cultural Center for Education (Phitsanulok, Thailand)
— Discovery Dome Europe (Birmingham, UK)
— Planetarium Toruń (Torun, Poland)
— Astropokaz Planetarium (Poland)
— Fireball Planetarium (Newfoundland and Labrador, Canada)
— Planetario Digital Carl Sagan (Parana, Argentina)
— Tusi-Bohm Planetarium (Baku, Azerbaijan)
— Quasar Planetarium (Olkusz, Poland)
— Planetarium and Observatory of Cà del Monte (Cecima, Italy)
— Grupo Astronomico Silos (Zaragoza, Spain)
— The Heavens of Copernicus Planetarium, Copernicus Science Centre (Warsaw, Poland)
— Portable Planetarium (India)
— Portable Planetarium (France)
— Cosmos Planetarium (Scotland)
— Planetarium RCRE (Opole, Poland)
— Museum of Paleontology Egidio Feruglio, Portable Planetarium (Trelew, Argentina)
— Planetario de Bogotá (Bogotá, Colombia)
— Eugenides Planetarium (Athens, Greece)
— Portable Planetarium (Natal, Brazil)
— Portable Planetarium (Brazil)
— Portable Planetarium (Bolivia, South America)
— Sternwarte und Planetarium der Stadt Radebeul (Radebeul, Germany)
— Portable Planetarium (India)
— Portable Planetarium (Brazil)
— Portable Planetarium (Sussex, England)
— Portable Planetarium (Russia)
— Planetarium de Bretagne (Pleumeur-Bodou, France)
— Illusion Planetarium (Tyumen, Russia)
— Strasbourg Planetarium, Université de Strasbourg (Strasbourg, France)
— Astronomy Club of Feira de Santana, Antares Observatory (Bahia, Brazil)
— Portable Planetarium (Poland, Katowice)
— Planetarium of Nantes (Nantes, France)
— Planetarium Hamburg (Hamburg, Germany)
— Portable Planetarium (Matera, Italy)
— Immersive Vision Theatre, Plymouth University (Plymouth, UK)
— Portable Planetarium (Valencia, Spain)
— Digital Mobile Planetarium Wenu Mapu (Rio Negro, Argentina)
— Planetarium Sultan Iskandar (Sarawak, Malaysia)
— ESO Supernova planetarium (Garching, Germany)
— Planetarium Minikosmos (Lichtenstein, Germany)
— Planetarium Man (Northampton, UK)
— Guru Graha Planetarium (Andhra Pradesh, India)
— Zeiss-Planetarium Jena (Jena, Germany)
— Techmania Science Center, Planetarium (Pilsen, Czech Republic)
— Planetarium Photon (Nizhny Tagil, Russia)
— Hvězdárna a Planetárium Brno (Brno, Czech Republic)
— Metaspace Planetarium (Seoul, Korea)
— Portable Planetarium (Cape Town, South Africa)
— Astronomisches Rechen-Institut, Portable Planetarium (Heidelberg, Germany)
— Portable Planetarium (Vic-la-Gardiole, France)
— Planetarium of Occhiobello (Santa Maria Maddalena, Italy)
— Portable Planetarium (Saskatoon, Canada)
— Planetarium Espacio 0.42 (Huesca, Spain)
— Banff National Park, Portable Planetarium (Banff, Canada)
— Portable Planetarium (Vietnam)
— Herne Observatory (Herne, Germany)

USA Planetariums
— East Village Planetarium, The Girls Club (New York, NY)
— Hokulani Imaginarium, Windward Community College (Kaneohe, Hawaii)
— Daniel M. Soref Planetarium, Milwaukee Public Museum (Milwaukee, WI)
— Dr. Sandra F. Pritchard Mather Planetarium, West Chester University (West Chester, PA)
— University of Alaska Anchorage Planetarium (Anchorage, Alaska)
— Ward Beecher Planetarium, Youngstown State University (Youngstown, OH)
— Boonshoft Museum of Discovery Planetarium (Dayton, OH)
— Acheson Planetarium, Cranbrook Institute of Science (Bloomfield Hills, MI)
— Neag Planetarium, Reading Public Museum (Reading, PA)
— Longway Planetarium, Sloan Museum (Flint, MI)
— University of Texas at Arlington Planetarium (Arlington, TX)
— Rollins Planetarium, Young Harris College (Young Harris, GA)
— Wausau School District Planetarium (Wausau, WI)
— Sudekum Planetarium, Adventure Science Center (Nashville, TN)
— Raritan Valley Community College Planetarium (Somerville, NJ)
— Delta College Planetarium (Bay City, MI)
— Buhl Planetarium, Carnegie Science Center (Pittsburgh, PA)
— Maynard F. Jordan Planetarium, The University of Maine (Orono, ME)
— Obscura Digital (San Francisco, CA)
— Peoria Riverfront Museum, Planetarium (Peoria, IL)
— Glastonbury Planetarium (Glastonbury, CT)
— Edelman Planetarium, Rowan University (Glassboro, NJ)
— Harlan Rowe Middle School Planetarium (Athens, PA)
— Hallstrom Planetarium, Indian River State College (Fort Pierce, FL)
— Starlab Portable Planetarium (Massillon, OH)
— Charles W. Brown Planetarium, Ball State University (Muncie, IN)
— Flandrau Science Center & Planetarium, The University of Arizona (Tucson, AZ)
— Dreyfuss Planetarium, Newark Museum (Newark, NJ)
— Clark Planetarium (Salt Lake City, UT)
— Downing Planetarium, Fresno State (Fresno, CA)
— Gates Planetarium, Denver Museum of Nature and Science (Denver, CO)
— Ask Jeeves Planetarium, Chabot Space & Science Center (Oakland, CA)
— Truman State University Planetarium (Kirksville, MO)
— SUNY Oneonta Planetarium (Oneonta, NY)
— Portable Planetarium (Antioch, CA)
— Arthur Storer Planetarium (Prince Frederick, MD)
— WVU Planetarium, West Virginia University (Morgantown, WV)
— Adler Planetarium (Chicago, IL)
— York Learning Center Planetarium, York County Astronomical Society (North York, PA)
— COSI Planetarium (Columbus, OH)
— FLHS Planetarium (Fair Lawn, NJ)
— Austin Mobile Planetarium (Austin, TX)
— Argus IMRA Planetarium (Ann Arbor, MI)
— Anchorage Museum, Planetarium (Anchorage, AK)
— Calusa Nature Center & Planetarium (Fort Myers, FL)
— Northside ISD Planetarium (San Antonio, TX)
— SMSU Planetarium (Marshall, MN)
— Roberson Museum and Science Center (Binghamton, NY)
— West Virginia University Planetarium (Morgantown, WV)
— University of Michigan, Museum of Natural History, Planetarium (Ann Arbor, MI)
— The Children’s Museum (West Hartford, CT)

— Sky-Skan, Europe
— Sky-Skan, USA
— Spitz
— ZEISS Powerdomes
— Digitalis Education Solutions
— Fulldome Film Society
— Discovery Dome, Europe
— EnterIdeas
— Emerald Digital Planetariums
— Lochness Productions
— LSS Planetariums Open Project
— Metaspace

Visuals by Wade Sylvester & Jason Fletcher – Music & Narration by Wade Sylvester
Charles Hayden Planetarium, Museum of Science. Copyright 2014. All Rights Reserved.

Fulldome Interviews: Shane Mecklenburger – Art/Science Course at OSU


Teaching 3D animation is no simple task, especially if the center of interest is astronomy and fulldome! So in September 2013, I was asked to teach my astronomy visualization techniques as a visiting artist at OSU.

Shane Mecklenburger is the 3D animation professor of a class called “The End & the Beginning of Everything”, which received a Battelle Endowment. Participating OSU astronomers and astrophysicists were paired with the animation students to create a dialogue and help inspire their artwork.

It’s interesting to note that Shane wanted to be careful to not focus purely on data visualization. Instead he wanted the students to use the techniques to make their own artwork inspired by real astronomy and astrophysics research. This allowed the students the creative freedom to go in any direction, with the foundation of astronomy to work from. Since I’ve spent every working day for years creating outer space imagery, it was a natural fit to give a few technical workshops.

The work of the students will be exhibited at the Adler Planetarium in Chicago sometime in 2014 or 2015. They have also been experimenting with fisheye rendering and viewing the results in the OSU Smith Laboratory Planetarium, which was recently refurbished with a 2.6k projection system.

In the first workshop we covered star fields, galaxy fields, and close-up fluid suns. Then in the second workshop we covered 3D nebulae, hall of mirrors, and DomeAFL installation. It was a wonderful experience to share my techniques with students and see them learn so quickly.

Class Description

The End & the Beginning of Everything is a collaborative art-science initiative between the OSU Departments of Art and Astronomy, the University of Chicago Department of Astrophysics, Chicago’s Adler Planetarium and the Advanced Computing Center for the Arts and Design (ACCAD). Accelerating technologies are amplifying astronomers’ ability to model and observe, expanding our understanding of life and the universe. This initiative guides young artists in creatively interpreting astronomical research for a public contemporary art exhibition.


Interview with Shane Mecklenburger:
3D Animation Professor at OSU

Over the years, I’ve noticed a gradual increase in the offerings of college classes that focus on astronomy art or fulldome. So I’d like to interview Shane and hear his thoughts about this experience.

Can you share a little about yourself as an artist and teacher?
Lately my projects are exploring exchange and simulation. I’m also interested in the origins and effects of science and technology, which are often tangled up with real or imagined conflict. I teach courses on cross-disciplinary collaboration and new media art in the Art & Technology area in the Department of Art at The Ohio State University. One of my courses is Experimental 3D Animation, and another is called ArtGames, which is about using games as a format for making artwork. Also, this semester I’m teaching a new graduate level seminar I developed called APPROACHING SYSTEMS.

What inspired you to create a class combining 3D animation and astronomy?
I was working on a project called The End & The Beginning of Everything, where I’m collaborating with astronomers who make simulations of supernova explosions. It occurred to me this would be fun to do with my Experimental 3D Animation class, so I wrote a grant to connect them with astronomers at OSU to make artworks based on their research.

You mentioned to me that visualizing orbiting planets was a default starting point for the students. Once you were able to break the students of that, how have their perspectives changed in terms of what to visualize? Was there a tipping point?
I guess the tipping point was when I banned spheres from their animations! I did it after the first project, as a challenge to get them thinking in more abstract ways. The point of the course is to make video artworks that aren’t simply science visualizations. So the course is meant to develop the non-literal, symbolic dimension of students’ art making abilities while learning about astronomy.

Do you think this class has changed how the students perceive astronomy?
I suspect it might have, to differing degrees depending on the student. It’s hard not to come away with a new perspective after spending time at an actual astronomy research facility, talking to astronomers, and coming to understand the phenomena and theories they’re researching. I know it happened for me.

What has been challenging in teaching this class?
Just about everything! Many of the students had no experience with 3D animation and we were working with very advanced animation techniques like dynamics and fluid simulations. We were also making soundtracks, and using a render farm to produce fulldome masters, so the technical hurdles were extreme, and I was learning most of the fulldome techniques myself on the fly. Dealing with these kinds of problems is what makes teaching fun for me, and I hope they also demonstrate for students how complicated and unpredictable this kind of work can be.

What do you hope the students will take from the course?
Three things: First, I hope they come away with artwork they’re proud of. I also hope students emerge with a better sense of their own unique creative approach, style and voice. Finally, I hope they come away with a sense of the special challenges and importance of cross-disciplinary collaboration.

Do you think a fulldome specific class at OSU has merit?
Absolutely. As a pilot/test, I feel this course has demonstrated that, while there are still technical challenges to sort out, the interest is there from both students and faculty.

Student Artworks

360° Video Fundamentals

Check out the Interactive 360 Video

I have always been excited at the possibility of 4k video in a planetarium dome. And so I was captivated with the recent introduction of a 360° video camera rig with 8x4k resolution. (Which translates to 4k domemaster resolution.) It also meant that I could increase the fisheye FOV from 180° to 220° and see the immediate ground surrounding the camera. In my opinion this makes for a heightened immersion experience. So I have spent the last two months experimenting and learning directly about the intricacies of shooting 360° video.

The 360Heros H3Pro10HD is a 3D printed object, meaning it’s one solid piece of plastic that is precisely engineered to fit 10 GoPro cameras into the smallest possible space. It’s printed using aircraft-grade plastic, so it’s durable and has been through a strenuous bend test to prove its strength over time.

Currently the 360° video community is tiny and little documentation is available. So I was on my own to figure out the potential problems, shooting subtleties, and overall workflow. This can be a tedious and nerve-wracking process. After all, with 10 GoPro cameras shooting in unison, something is bound to go wrong at some point. So alas: make plans within plans within plans, theorize contingencies, and take notes on your experience. And now for you brave souls remaining, below are my own findings, tips, and thoughts.

Hardware Rundown

Camera & Memory
— H3Pro10HD Rig
— 360Heros Aluminium Mount
— GoPro HERO3+ Camera – Black Edition (x10)
— Lexar 32GB microSDHC 600x [LSDMI32GBSBNA600R] (x10)
— Manfrotto 496RC2 Ball Head with Quick Release
— Headless Tripod
— Manfrotto 200PL-38 RC2 Plate

Extra Batteries & Charging
— Wasabi Power Battery [2-Pack] and Charger (x10)
— PowerSquid D080B-008K (x2)
— 7-Port USB Charger (x2)

— Pelican 1550 Case with Foam
— Fotodiox GT-H3Lens-Cap GoTough (x10)

Stitching & Rendering
— Autopano Giga & Autopano Video Pro [known as AVP in this blog post]
— GeForce GTX 770 4GB
— 360Heros File Data Manager

Big Picture Workflow

There are different mindsets and considerations depending on what action you are performing. So I’ve split up this information into 4 main sections.

Setup the Rig

  1. Put the cameras into the rig (matching camera # to rig #)
  2. Press wifi-button on each camera
  3. Check that each camera’s recording settings are correct: Example Display

It’s a bit of a trick to get all the GoPro cameras installed without touching the lenses. But having the plastic rig locked onto a tripod helps. I chose to permanently install the aluminium mount so that 7 of the GoPros are looking at the horizon. Since I shoot for a dome, this can ease stitching pains later on, since often the objects of interest are in these camera zones. This leaves 1 camera pointing straight up and 2 pointing down at the tripod. Interestingly, if a camera fails then I can trade it out for camera 9 or 10 and still not lose any essential zones. So there is some redundancy there. I could also remove cameras 9 & 10 altogether and still capture 280° of footage.


Also, removing the cameras from the rig is difficult due to the tightness of the plastic camera clamps. But this isn’t a complaint; it’s actually a good sign that everything is snug and won’t wobble during shooting. To remove the cameras you must use a small flathead screwdriver (included with the 360Heros rig) to gently wedge the plastic tab up from its locked position.

It’s important to number each of the cameras and the memory cards with a Sharpie. If a camera begins to act strangely, then I can easily trace it back to the source. It also allows me to understand exactly where a camera is within the rig, which can be helpful when the automatic stitching doesn’t work. This is rare, but can happen if the video frame is of a pure blue sky with no overlapping content to match up.

The first time you set up the cameras, you will need to configure the default video recording settings for each camera by hand. From that point forward the cameras will remember those settings. It is absolutely vital that each camera has the exact same settings.

Camera Settings for Best Resolution
— Protune: On
— Resolution: 2704×1524
— Aspect Ratio: 16:9
— FPS: 30
— Low Light: Off
— White Balance: Cam RAW


Shooting Workflow

  1. Power ON the cameras with the wifi remote
  2. Check each camera for corrupt symbol (rare)
  3. Press Record on wifi remote
  4. Quickly check each camera for red blinking recording lights
  5. Motion-sync the cameras: twist the tripod stem quickly. VERY IMPORTANT!

What’s the need for motion sync? Well, the wifi remote doesn’t trigger all the cameras at the same moment. So instead AVP must sync the video clips itself; otherwise your footage could be offset on the timeline and look terrible. There are two options, audio sync or motion sync, but motion sync is more precise. All you have to do is quickly spin the whole rig back and forth to provide the necessary calibration data for AVP to use later. Audio sync needs a sharp fast snap to sync to, like a dog training clicker.
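The idea behind both sync methods is the same: search for the time offset at which two cameras’ recorded motion (or audio) signals line up best. Here is a toy sketch of that offset search using brute-force cross-correlation; it illustrates the concept only and is not AVP’s actual algorithm.

```python
def best_offset(ref, other, max_shift=30):
    """Find the shift (in frames) that best aligns `other` to `ref`
    by maximizing their cross-correlation over a shift window."""
    def score(shift):
        pairs = [(ref[i], other[i + shift])
                 for i in range(len(ref))
                 if 0 <= i + shift < len(other)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_shift, max_shift + 1), key=score)

# A sharp "twist" spike as seen by two cameras, one starting 3 frames late:
spike = [0, 0, 0, 5, -5, 4, -4, 0, 0, 0]
delayed = [0, 0, 0] + spike  # same motion, shifted 3 frames later
print(best_offset(spike, delayed))  # prints 3
```

This is why the twist matters: a sharp, distinctive spike gives the correlation one unambiguous peak to lock onto, whereas a static rig gives the software nothing to match.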

A wonderful aspect of using GoPro cameras is that one wifi remote can trigger all 10 cameras. So they can be remotely powered ON/OFF and record start/stop. The remote indicates how many cameras are connected.

Because the camera lenses are not at the same nodal point, you’re undoubtedly going to see parallax errors when stitched. There are some techniques to address this in AVP, but it’ll take time and hand polishing. To combat this upfront, it’s wise to constantly be thinking about parallax when planning a shot. My suggestion is to keep the camera rig at least 10 feet away from everything. That can be tricky and not always an option, but it’s the guideline I try my best to follow. Know that the larger the environment you shoot, the easier your stitch job will be.
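The 10-foot rule of thumb can be sanity-checked with basic geometry: the angular disagreement between two adjacent lenses scales roughly as baseline/distance. The numbers below assume a hypothetical ~6 cm spacing between adjacent lens nodal points, not a measured value for this rig.

```python
import math

BASELINE_M = 0.06  # assumed adjacent-lens nodal point spacing (hypothetical)

for feet in (2, 5, 10, 20):
    dist_m = feet * 0.3048
    # small-angle parallax between the two viewpoints, in degrees
    parallax_deg = math.degrees(math.atan(BASELINE_M / dist_m))
    print(f"{feet:>2} ft: ~{parallax_deg:.2f} degrees of parallax")
```

Under that assumption the mismatch at 2 feet is several degrees, but drops to roughly a degree at 10 feet, which helps explain why distant subjects stitch far more cleanly than near ones.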

The most difficult aspect of shooting for a dome is keeping the camera movement buttery smooth. Any subtle bumps in the footage will be greatly amplified on the dome. I tested a wheelchair, and the motion was smooth but not perfectly parallel when moving forward. I also tested a motorized wheelchair, which produced satisfying results at the slowest setting. But the real perfection comes with a tracked dolly. Yet then you run into the issue of seeing the track within the 360° shot. So I used the Singleman Indie-Dolly with 12 feet of track, and that was just short enough to not show within 240°-ish of the footage. It’s incredibly smooth, and the smallest movement forward looks stunning in the dome.


Perhaps the most frustrating aspect is the GoPro cameras themselves. GoPros are not exactly known for their dependability. Some things to watch out for:

  • Camera battery life leaves me wishing for more. 1h 30m of shooting time is pretty good, but I haven't timed it when continually turning the whole rig ON/OFF. So I keep a total of 30 batteries charged and ready. I definitely turn off the cameras when not recording, as they eat battery life in stand-by mode. I've heard that changing the settings from the default 3 red blinking recording lights to just one can help save some battery life too. Leaving the wifi on overnight (blue blinking lights) will drain a battery within 24 hours.
  • Be sure to use microSD cards that are recommended by GoPro. Some other brands don't write as fast as they advertise, and this can lead to the camera randomly stopping a recording due to a full buffer. I've had good results with the Lexar 32GB microSDHC 600x: the footage files have a constant data rate of 45 Mbit/s with no skipped frames. But in the future I'd suggest the 64GB version. Recording at the 2.7k settings allows 1h 30m of footage per 32GB card, so a 64GB card would double that. Dumping footage takes about 1h 30m of diligent work, and if you're in the middle of a big shoot, the last thing you want is to be interrupted. I've had absolutely no problems with the microSD cards. But if you get an "SD error" on the GoPro, accidentally delete footage, or find a microSD card unreadable, then check out TestDisk.
  • Occasionally a video file will corrupt. You'll know because the camera display shows the corrupt-file symbol and you must push a button to continue. So checking for this is something I've added to the shooting workflow. It's not entirely clear what causes it, other than the fact that these high resolutions have a high data rate and sometimes the header doesn't get written to the file. Luckily the video data can be repaired once transferred to a computer. I've yet to have a case where the video data was lost.
  • The cameras are not up to par when it comes to long-duration shots. A continuous shot of 30 minutes would make me nervous. I had one occasion where a camera froze, meaning the firmware locked up and the battery had to be removed to reset it. You'll know it's frozen because no buttons will function: either it won't power on, won't begin recording, or stops recording mid-shot. So as a precaution, I always check for the red blinking recording lights at the beginning and end of a shot. Upon freezing, the red recording lights stop blinking and the camera is no longer recording video. But any footage prior to the freeze is retained (though it might be corrupted).
  • If the footage count on the front camera display isn't identical between all of the cameras, then that camera wasn't recording during the shot and that channel will be missing for stitching. Either the camera didn't trigger via the wifi remote or its firmware has frozen. So again, always check for the red blinking recording lights at the beginning and end of a shot. It's obviously wise to keep notes of these mishaps in your shot log.
  • Even with all that said, they perform pretty well. For example, out of 57 total shots: 4 shots had corrupted files, which were all later repaired in full. 1 essential camera had a battery die and lost half the shot. Twice a camera wouldn't sync with the remote, but it was a non-essential camera facing the tripod and recorded fine without it (it needed a battery reset). And once an essential camera froze up after I had started recording, making the shot unusable. So obviously, get multiple takes of each shot!
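The 45 Mbit/s data rate makes the card-capacity math easy to sanity-check. A quick sketch, using decimal gigabytes as card manufacturers count them:

```python
def record_minutes(card_gb, mbit_per_sec=45.0):
    """Approximate continuous recording time per card at the
    measured ~45 Mbit/s data rate of 2.7k GoPro footage."""
    card_bits = card_gb * 8e9              # decimal GB -> bits
    return card_bits / (mbit_per_sec * 1e6) / 60

# record_minutes(32) comes out to about 95 minutes, which lines up
# with the observed ~1h 30m per 32GB card; 64GB doubles it.
```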

Shooting for a dome means that I don't actually need the complete 360° of footage. So I use a tripod for extra stability. Some people use a monopod with tiny tripod legs; that captures as much FOV as possible and even opens up the possibility of a complete 360° sphere, with some trickery in post-production to hide the monopod itself. But for moving shots, a tripod is required.

Keep a shot log while shooting on location! It will be invaluable since you will undoubtedly forget the shot locations, problems, noted bumps/wobbles, bad takes, reminders, and such. Be diligent about this, as having a shot log will make your importing process much smoother.

Each of the cameras is locked to auto-expose, and I mean permanently locked off from user control. It's just the way GoPros have always been. Luckily AVP is extremely powerful in its exposure/color correction. Yet it can be difficult to deal with when you're very close to a bright light source and only one camera has a wildly different exposure value. Even then, AVP can correct between pretty serious exposure differences, since the GoPro cameras have excellent latitude.

An exciting option is to shoot in slow motion at 240 FPS. Using this FPS will reduce your end output to 1k fisheye (2x1k equirectangular). Or shoot at 120 FPS and output to 2k fisheye (4x2k equirectangular).

The bottom of the 360Heros aluminum mount has a 3/8″ screw hole (contrary to the typical 1/4″), so you will need a special quick release plate to put it on a tripod.

Battery Runtime Test
— Official battery, using wifi remote, recorded 2.7k footage for 1h 32m
— Wasabi battery, using wifi remote, recorded 2.7k footage for 1h 19m

Import & Check the Footage

Wherever you shoot, you’re going to need a laptop and external hard drive to dump the video data. For each hour of video taken, you will need about 340GB of free space.

Using an external USB 3 microSD memory card reader is an absolute requirement. Since each card can hold 32GB, you'll want to be able to dump the data quickly. Dumping the files straight from the GoPro cameras isn't an option, since USB 2 speeds are too slow. Even at USB 3 speeds, it still takes about 1h 30m to dump all the microSD cards to disk.
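That 1h 30m dump time is consistent with the sustained read speed of a typical USB 3 card reader. A back-of-the-envelope estimate; the ~60 MB/s reader speed is my assumption, not a benchmark:

```python
def dump_minutes(num_cards=10, card_gb=32.0, reader_mb_per_sec=60.0):
    """Estimated time to copy every card to disk, given an assumed
    sustained read speed for a USB 3 microSD reader."""
    total_mb = num_cards * card_gb * 1000  # decimal GB -> MB
    return total_mb / reader_mb_per_sec / 60

# dump_minutes() is roughly 89 minutes for ten 32GB cards, in line
# with the ~1h 30m quoted above.
```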

Since each camera creates footage with similar filenames, it's imperative that you rename the video files as they are copied to your computer; otherwise you risk overwriting data. But renaming by hand isn't needed if you use the 360Heros File Data Manager to dump the video files for you. I underestimated the usefulness of this command line tool. It automates appending the camera number and project name to each video filename, then dumps the footage into a folder. Here are the quick instructions for its use. Don't worry: although it doesn't have a GUI, all you need to know is how to CD in the command line prompt.
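If you can't use the File Data Manager, the same rename-on-copy idea is easy to script yourself. A minimal Python sketch; the paths and naming scheme are hypothetical, not the tool's exact convention:

```python
import shutil
from pathlib import Path

def dump_card(card_root, dest_dir, project, camera_number):
    """Copy every MP4 off a GoPro card, prepending the project name and
    camera number so files from ten cameras can't collide. The naming
    scheme here is illustrative only."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for src in sorted(Path(card_root).rglob("*.MP4")):
        new_name = f"{project}_cam{camera_number:02d}_{src.name}"
        shutil.copy2(src, dest / new_name)   # copy2 preserves timestamps
```

With this, GOPR0012.MP4 from camera 3 lands as, say, goddard_cam03_GOPR0012.MP4, and the preserved timestamps still let you sort by Date Modified later.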

Finally it's time to see how the footage actually turned out. Hopefully you have kept a shot log to help inform this process. I always create a spreadsheet and track the following per shot:

  • Concatenate needed? Due to the FAT32 memory card format, any shot that reaches 3.8GB will be automatically split into separate video files. (Or with 2.7k settings: every 11m 38s)
  • Corrupted video file? Infrequently a video file will corrupt; meaning the header won’t write to file. Luckily the video footage can be repaired.
  • Missing channels? Rarely, a GoPro camera will freeze up: either it won't start recording or it will freeze during a shot. Hence the need to check for the blinking record light at the beginning and end of a shot.
  • Shot length match between cameras? If a camera battery dies in the middle of a shot, you'll still get that camera's footage up to that point, but the rest of the cameras will obviously continue recording.
  • Shots organized? Each collection of 10 related video files needs to be placed into its own folder to keep individual shots organized for stitching. So from the huge list of dumped files you must figure out which files belong to one particular shot. This is definitely tedious, and especially confusing with files that need to be concatenated. My shortcut is to sort the files by Date Modified and keep the Length attribute visible.
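For the concatenation step, the FAT32-split chapters can be rejoined losslessly with ffmpeg's concat demuxer, since every chapter of a shot shares identical encoding parameters. A sketch that just builds the command; it assumes ffmpeg is installed, and the chapter filenames are illustrative:

```python
from pathlib import Path

def concat_command(chapter_files, output_path):
    """Build an ffmpeg concat-demuxer command that joins FAT32-split
    GoPro chapters without re-encoding. Pass the chapters in recording
    order, e.g. GOPR0012.MP4, GP010012.MP4, GP020012.MP4."""
    list_file = Path(output_path).with_suffix(".txt")
    list_file.write_text(
        "".join(f"file '{Path(c).resolve()}'\n" for c in chapter_files))
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", str(list_file), "-c", "copy", str(output_path)]
```

Run the returned command with subprocess.run(cmd, check=True); the "-c copy" keeps it a fast stream copy rather than a quality-losing re-encode.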

Stitch & Render

Finally you have all your shots organized and ready to stitch in AVP. What's amazing is that you don't need an AVP template for the specific camera rig layout. Just drop all the videos into AVP, synchronize, and stitch. It uses the overlapping content between the videos to automatically create a 360° projection. It's not always perfect, but it's damn good most of the time.

Here are a few sources that were helpful in learning Autopano Video.
Autopano Video Wiki Documentation
Autopano Video 101
Autopano Video Forums
360Heros FAQ

Final Render Resolution
AVP can render whatever type of projection you need to a frame sequence. Full resolution examples below. (These are not polished renders, just the initial automatic AVP stitch.)
— Equirectangular (Spherical): 8192x4096px
— Fisheye (FOV 180°): 4096x4096px
— Fisheye (FOV 220°): 5000x5000px

I was pleasantly surprised that increasing the FOV to 220° raises the final render resolution to 5k. This is possible because more footage is viewable at a higher FOV, so you are effectively "squishing" more pixels into the fisheye image. When I project it onto the dome at 4k, I get the added benefit of increased sharpness, since the video has been downscaled.
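The resolution bump follows directly from holding the angular pixel density constant: the fisheye width scales linearly with FOV. A one-liner to check my reading of those numbers:

```python
def fisheye_width(fov_deg, base_width=4096, base_fov=180.0):
    """Fisheye output width at a given FOV, holding pixels-per-degree
    constant relative to the 4096px / 180-degree baseline."""
    return round(base_width * fov_deg / base_fov)

# fisheye_width(220) gives 5006, which matches AVP's 5000x5000 output
# for the 220-degree fisheye (give or take rounding).
```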

I render using the Equirectangular (Spherical) projection because often I want to add slow movement to the virtual camera in After Effects with the Navegar Fulldome Plugin.

The only difference between Autopano Video and Autopano Video Pro is the GPU render option, which makes a huge difference in render times. When rendering at these high resolutions, you need a graphics card with lots of memory. The GeForce GTX 770 4GB has a decent number of cores, lots of RAM, and a good price point.

Final Render Statistics at Full Resolution (Equirectangular)
— CPU: 0.05 render fps – takes 12 hours to render 1m 30s of video.
— GPU: 1.5 render fps – takes 1 hour to render 1m 30s of video. (GeForce GTX 770)
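Those render times are easy to estimate for any clip length: total frames divided by the render rate. A sketch; the 24 fps footage rate below is my assumption, chosen because it reproduces the 12-hour CPU figure, and real renders add some overhead on top:

```python
def render_hours(clip_seconds, footage_fps, render_fps):
    """Wall-clock render time: frames to render / frames rendered per second."""
    total_frames = clip_seconds * footage_fps
    return total_frames / render_fps / 3600

# render_hours(90, 24, 0.05) is about 12 hours for 1m 30s of footage
# on the CPU, matching the figure above.
```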

Right after you get the footage, you'll want a quick 1k preview. The good news is that fisheye rendering at 1k in AVP is quite fast: about 10fps on the GPU.

As of AVP 1.6, they have implemented a stabilization algorithm to help smooth out bumps. It reduces bumps by roughly half, and I read that it will be improved over the next few versions. They are also adding a timeline that can keyframe cameras and color, which means you can interpolate between stitches. That is powerful when trying to fix parallax in a moving dolly shot. It's also very helpful when trying to fix an unruly camera whose exposure is way off from the others (such as a camera pointing at the ceiling).

When using GoPro cameras, the stitching can be optimized: the camera sensors aren't perfectly aligned/centered with the lens, so each camera has its own lens distortion model. You can pass AVP this information in the Lens Distortion Correction settings. As of version 1.6, these settings have been moved and updated.

The video and render examples showcased in this blog post do not yet have their stitches hand polished in AVP. They use only the initial automatic stitch, because I just wanted to see the results quickly. So any obvious seams and parallax errors you see could be fixed with some hand polishing. I'm still learning how to best use the software. But you need to run a lot of experiments to get polished results, and possibly even composite multiple AVP renders in After Effects for the most seamless output.

The GoPro cameras do a great job of capturing color and have excellent exposure latitude. But a photo filter in After Effects is often needed to color correct for tungsten lights or daylight. Then the colors need a big saturation boost to look optimal in the dome. Curves can bring out dark colors better than a contrast adjustment, plus maybe a slight touch of sharpening.

This AVP explanation could easily be a whole write-up of its own. Perhaps in the future I’ll share my experience once I’ve learned more. I’m currently in the process of stitching 57 shots (900GB total unprocessed) taken at the NASA Goddard Space Flight Center. It was an intense 3 day shoot and we got some amazing 360° shots that I can’t wait for you to see in our upcoming planetarium show production.

Lastly, if you’re shooting with the stereoscopic 360heros rig, then you will have a slightly different workflow in AVP. I don’t even want to fathom the bizarre difficulties with 3D 360 video.

Both the normal and stereoscopic 360 videos can easily be experienced on the Oculus Rift (using Kolor Eyes, which is free).

And that’s the Gist of it

As you can see, it's not for the faint of heart, as it demands a pretty wide range of technical knowledge. But the results can be simply stunning. This is a newborn medium and there is a lot of room to grow. It's an exciting time for fulldome show producers, hyperdome possibilities, and the Oculus Rift community. 4k video in the dome has long been a dream, and I think this is a bold step into a fascinating frontier.


Update: March 27, 2015 – Kolor has created a video explaining the complete 360 video workflow. It covers everything from camera setup, shooting, stitching, and rendering. It is an introduction to much of the necessary knowledge for creating 360 videos.

Update: February 19, 2015 – MewPro is an Arduino Pro Mini board that can reprogram/control almost all of the GoPro3+ functionality. The Orangkucing Lab has been experimenting with an interesting solution for automatically syncing GoPro cameras: by hooking the MewPro to the GoPro Dual Hero System, you can genlock multiple cameras. (Genlocking is a technique where the video output of one source is used to synchronize other video sources.) So the jello effect might soon be a thing of the past. The jello effect happens during a sudden bump or vibration: since the GoPro cameras aren't strictly hardware synced, each camera records a slightly different moment in time; to simplify, the FPS isn't globally locked. When you later stitch it all together, you will briefly see the seams between the video sources during the moments of bumps/vibration, making each of the video sources look like jello. Until now this hasn't been addressable.

Update: November 21, 2014 – Looking for a simple description of 360 video and how it works? David Rabkin and I wrote a paper that was presented at the 2014 GLPA conference: a video of the presentation and the PDF paper. So if you want the basics without getting too technical, this is your best bet. Full cost assessments: H3Pro10HD or H3pro7.

Update: November 20, 2014 – Here is an interesting roundup of 4k video cameras in the marketplace. It seems that 360 video is still the best option for creating 4k domemasters, especially since 4k video cameras still don't have sensors with at least 4k of vertical resolution. We are still a number of years away from 8k video cameras being mainstream. Yet it's quite interesting to see 4k become a standard within video camera equipment.

Update: October 30, 2014 – If you are wondering about the new GoPro Hero4 camera… Having each camera shoot at 4k 30fps is amazing, but it comes at a cost. Results from the 360 video community suggest that you can only shoot for 30 minutes before needing to cool down the cameras. And sometimes the Hero4 camera will forcefully shut down due to overheating.

Update: March 12, 2014 – The GoPro cameras have a firmware update available that allows you to shift the automatic exposure through the iPhone app. I'm curious to experiment, though I'm cautious of the added workflow of finding the optimal exposure and updating ALL the cameras' settings for each shot. Also, while it's too crazy for me (since it voids your warranty), there are ways to hack a GoPro camera to truly lock down the exposure.