NASA Show – Sneak Peek

The blog has been quiet recently because we’ve been knee-deep in a show production. So here is a quick look at what’s been cookin’.

We received a NASA grant to create a planetarium show about spacecraft engineering and the challenges of designing, testing, and launching! We also wanted to feature real imagery, so we traveled to NASA Goddard and shot 360° video at their expansive test facilities.

From Dream to Discovery: Inside NASA Engineering
Available for distribution in January 2015


Glimpse Behind the Scenes

Below are some in-progress scenes from the actual fulldome show. We are currently in the polishing stage and we’re working to add the final details. I can’t wait to share more…

IPS-Macao Fulldome Festival: SELECTED!

Our show Moons: Worlds of Mystery has been selected to be part of the competition screening during the IPS-Macao International Fulldome Festival 2014! So if you’re attending the IPS Conference, be sure to mark your schedule.

After the festival, we have the added honor of being one of only six productions to continue screening for the public in Macao (6/21 – 7/31) and Beijing (6/28 – 7/31).


Stills From Moons: Worlds of Mystery

News: IX Symposium, Cinefex on Fulldome, 8k Research, Illustris Simulation


UPCOMING EVENTS

IX: Symposium on Immersive Creativity

In just a few days this fascinating first-time conference will begin in Montreal and last for 5 days. Last year I saw the SAT team perform at Immersa 2013, and they were already doing some cutting-edge work. So I’m curious to see how this event kicks off.

The IX Symposium will provide an open exchange platform through a series of keynotes, panels, demonstrations, performances and immersive productions. An objective of the symposium is to help democratize access to immersive spaces and to the tools and processes used for the creation of original contents. Another objective is to foster a network facilitating the creation of a community and the circulation of people, ideas, and works. This will help rethink the models of production, the formats, the delivery systems, and the creative processes needed to maintain and nurture an international platform for strong artistic expressions and open innovation.


INT. BUEHNENWERKSTATT & INT. TANZTHEATERFESTIVAL 2014
Shortcuts: An Experimental Dance Film Competition

We are looking for videos that look at dance through the new medium of 3-dimensional/spherical cinematography. Dance and an experimental approach should be at the center of the work. The focal point should be real filmed scenes rather than computer-generated and/or animated film sequences.



ARTICLES

[Image: Saturn V footage mapped onto a dome]
Cinefex: Fulldome – Films in the Round

For Cinefex to write an article exclusively about fulldome, well, that’s an exciting sign of the times and simply cannot be ignored. It’s refreshing to read this thorough overview, which speaks to the particular nuances of fulldome production: the need to be a CG generalist, slow camera movements, cross-bounce worries, and using the Oculus Rift as a dome preview tool. And I can definitely agree when I read “Everything we do ends up on the dome”. It’s true: to work under a tight timescale you must use basically everything you render. By the time you render at 4k, you already know the edit and feeling of the show; you’re just finalizing the visuals at that point. There is no post-production edit stage; that instead happens during the animatic process, since working at 4k is not forgiving. Truly an exciting article, and perhaps now the CG community will have its eyes wide in awe.


Fulldome 8k Research – Is There Any Point?

An in-depth and well-researched thought experiment into 8k production concerns. And then, to push the idea further, they theorize the maximum possible retinal resolution in the dome. This is a fascinating write-up of many aspects I have been curious about. Anyone in fulldome production must read this. Although I’m not quite sure how to reconcile Paul Bourke’s display resolution research with this article.


[Image: the Oculus Rift HD prototype]
An Initial Look at the Oculus Rift HD for Fulldome Previewing

The Oculus Rift is starting to become an interesting way to preview fulldome dailies and watch shows when you don’t have immediate access to a dome. Aaron had early access to an HD prototype and has figured out the best domemaster input for the optimal experience.



INSPIRATION

Illustris: New Universe Simulation
Just one week ago, the Illustris simulation was announced to the world and the dataset has been opened up to outsiders. The simulation is simply stunning, as it is able to simulate the galaxies themselves moving within the cosmic web. Around 1 minute into the video is where the gas temperatures begin to cascade, and I’ve never seen anything like it. A cosmic-scale timelapse. And I expect we will see it on the dome sometime soon…


Planetary – Documentary Trailer
While not created for fulldome, how can I ignore such a beautiful film which speaks directly to something so close to my heart? It is this exact caliber of work that I wish to see in the dome. An expression of the shared struggle to explore the science collectively, combining it with the important but often ignored emotions of our daily struggles in these wild times.

“Planetary is a feature length documentary that expands on the mind-blowing perspective shared by astronauts and indigenous elders alike — that each and every one of us is inseparable from each other, the planet and the universe as a whole, a realisation that is both humbling and comforting. If more of us start to experience the simple truth of our interdependence and connection, we believe there is hope for the next stage in our human journey. Planetary is meant to inspire awe and wonder by seeing things as they are, placing our individual stories in the context of the larger planetary / cosmic story.”

Crossing Worlds – Fulldome Short Film
A visual tone poem designed for the emerging fulldome planetarium format, Crossing Worlds utilizes spherical photography from the American desert west to immerse the viewer in a transcendent spectrum of austere landscapes. “Crossing Worlds served to remind audiences of how extraordinary our home planet is. It was the perfect complement to the data-driven visualizations of global changes, deepening appreciation for the critical importance of the natural world for sustaining us emotionally as well as physically” said David McConville, co-founder of The Elumenati.

Fiske Fulldome Basics
It’s difficult to explain to the public the resolutions which modern fulldome can achieve. For instance, when most people hear of 4k they think of the TV standard. So it can be tricky to explain that 4k refers to something different in the fulldome standard. This video is a wonderful and simple explanation of the differences between 1080 broadcast television and 8k fulldome.

Waiting Far Away – Fulldome Short

“An explorer of the cosmos has traveled too far… And can’t find home.”

In the creative process of producing planetarium shows, we often come across imagery that is stunning but doesn’t work in the context of a science show. And so our collection of fulldome astronomy art animations has matured into a hybrid form of storytelling where we mix imagination with real data.

What particularly excites us about backburner short projects is that we produce them quickly. It’s refreshing to just have a clear vision and then immediately crank it out. In the future we plan to continue creating fulldome shorts. Our next short will focus on a bleeding-edge science topic. But we have other ideas for experiences in outer space that don’t fit your typical science show. More to come!


Domemasters Freely Available

  • Available for planetarium use. Contact me and I’ll give you an address to ship an external hard drive. Please include a SASE or IRC.
  • 4k domemaster frames, 30fps, 5.1 & stereo audio, (200GB).
  • 1k Quicktime MOV available for download by request (2.6GB).

Terms: permission is granted to freely screen the short to the public in planetariums as you see fit. You must screen the short in full and unedited. It may not be used in other shows without permission.


Translations

Wish to translate and record the narration into a language that better suits your audience? Let me know and I can share the music-only 5.1 surround mix.

Languages Available
— Spanish
— German
— French
— Italian
— Polish
— British English
— Portuguese


Screenings

Conferences & Festivals
— Jena Fulldome Festival 2014 (Jena, Germany)
— 2014 European Symposium of Planetariums (Lucerne, Switzerland)

International Planetariums
— Macao Science Center Planetarium (Macao, China)
— Planetarium Wolfsburg (Wolfsburg, Germany)
— Sri Sathya Sai Space Theatre (Puttaparthi, India)
— Baikonur Planetarium (Poland)
— Il Planetario (Bologna, Italy)
— Orbit Night Sky Planetarium (Kolkata, India)
— Roi-Et Planetarium, Science and Cultural Center for Education (Phitsanulok, Thailand)
— Discovery Dome Europe (Birmingham, UK)
— Planetarium Toruń (Torun, Poland)
— Astropokaz Planetarium (Poland)
— Fireball Planetarium (Newfoundland and Labrador, Canada)
— Planetario Digital Carl Sagan (Parana, Argentina)
— Tusi-Bohm Planetarium (Baku, Azerbaijan)
— Quasar Planetarium (Olkusz, Poland)
— Planetarium and Observatory of Cà del Monte (Cecima, Italy)
— Grupo Astronomico Silos (Zaragoza, Spain)
— The Heavens of Copernicus Planetarium, Copernicus Science Centre (Warsaw, Poland)
— Portable Planetarium (India)
— Portable Planetarium (France)
— Cosmos Planetarium (Scotland)
— Planetarium RCRE (Opole, Poland)
— Museum of Paleontology Egidio Feruglio, Portable Planetarium (Trelew, Argentina)
— Planetario de Bogotá (Bogotá, Colombia)
— Eugenides Planetarium (Athens, Greece)
— Portable Planetarium (Natal, Brazil)
— Portable Planetarium (Brazil)
— Portable Planetarium (Bolivia, South America)
— Sternwarte und Planetarium der Stadt Radebeul (Radebeul, Germany)
— Portable Planetarium (India)
— Portable Planetarium (Brazil)
— Portable Planetarium (Sussex, England)
— Portable Planetarium (Russia)
— Planetarium de Bretagne (Pleumeur-Bodou, France)
— Illusion Planetarium (Tyumen, Russia)
— Strasbourg Planetarium, Université de Strasbourg (Strasbourg, France)
— Astronomy Club of Feira de Santana, Antares Observatory (Bahia, Brazil)
— Portable Planetarium (Poland, Katowice)
— Planetarium of Nantes (Nantes, France)
— Planetarium Hamburg (Hamburg, Germany)
— Portable Planetarium (Matera, Italy)
— Immersive Vision Theatre, Plymouth University (Plymouth, UK)
— Portable Planetarium (Valencia, Spain)
— Digital Mobile Planetarium Wenu Mapu (Rio Negro, Argentina)
— Planetarium Sultan Iskandar (Sarawak, Malaysia)
— ESO Supernova planetarium (Garching, Germany)
— Planetarium Minikosmos (Lichtenstein, Germany)
— Planetarium Man (Northampton, UK)
— Guru Graha Planetarium (Andhra Pradesh, India)
— Zeiss-Planetarium Jena (Jena, Germany)

USA Planetariums
— East Village Planetarium, The Girls Club (New York, NY)
— Hokulani Imaginarium, Windward Community College (Kaneohe, Hawaii)
— Daniel M. Soref Planetarium, Milwaukee Public Museum (Milwaukee, WI)
— Dr. Sandra F. Pritchard Mather Planetarium, West Chester University (West Chester, PA)
— University of Alaska Anchorage Planetarium (Anchorage, Alaska)
— Ward Beecher Planetarium, Youngstown State University (Youngstown, OH)
— Boonshoft Museum of Discovery Planetarium (Dayton, OH)
— Acheson Planetarium, Cranbrook Institute of Science (Bloomfield Hills, MI)
— Neag Planetarium, Reading Public Museum (Reading, PA)
— Longway Planetarium, Sloan Museum (Flint, MI)
— University of Texas at Arlington Planetarium (Arlington, TX)
— Rollins Planetarium, Young Harris College (Young Harris, GA)
— Wausau School District Planetarium (Wausau, WI)
— Sudekum Planetarium, Adventure Science Center (Nashville, TN)
— Raritan Valley Community College Planetarium (Somerville, NJ)
— Delta College Planetarium (Bay City, MI)
— Buhl Planetarium, Carnegie Science Center (Pittsburgh, PA)
— Maynard F. Jordan Planetarium, The University of Maine (Orono, ME)
— Obscura Digital (San Francisco, CA)
— Peoria Riverfront Museum, Planetarium (Peoria, IL)
— Glastonbury Planetarium (Glastonbury, CT)
— Edelman Planetarium, Rowan University (Glassboro, NJ)
— Harlan Rowe Middle School Planetarium (Athens, PA)
— Hallstrom Planetarium, Indian River State College (Fort Pierce, FL)
— Starlab Portable Planetarium (Massillon, OH)
— Charles W. Brown Planetarium, Ball State University (Muncie, IN)
— Flandrau Science Center & Planetarium, The University of Arizona (Tucson, AZ)
— Dreyfuss Planetarium, Newark Museum (Newark, NJ)
— Clark Planetarium (Salt Lake City, UT)
— Downing Planetarium, Fresno State (Fresno, CA)
— Gates Planetarium, Denver Museum of Nature and Science (Denver, CO)
— Ask Jeeves Planetarium, Chabot Space & Science Center (Oakland, CA)
— Truman State University Planetarium (Kirksville, MO)
— SUNY Oneonta Planetarium (Oneonta, NY)
— Portable Planetarium (Antioch, CA)
— Arthur Storer Planetarium (Prince Frederick, MD)
— WVU Planetarium, West Virginia University (Morgantown, WV)
— Adler Planetarium (Chicago, IL)
— York Learning Center Planetarium, York County Astronomical Society (North York, PA)
— COSI Planetarium (Columbus, OH)
— FLHS Planetarium (Fair Lawn, NJ)
— Austin Mobile Planetarium (Austin, TX)
— Argus IMRA Planetarium (Ann Arbor, MI)
— Anchorage Museum, Planetarium (Anchorage, AK)
— Calusa Nature Center & Planetarium (Fort Myers, FL)
— Northside ISD Planetarium (San Antonio, TX)

Distributors
— Sky-Skan, Europe
— Sky-Skan, USA
— Spitz
— ZEISS Powerdomes
— Digitalis Education Solutions
— Fulldome Film Society
— Discovery Dome, Europe
— EnterIdeas
— Emerald Digital Planetariums
— Lochness Productions
— LSS Planetariums Open Project

Visuals by Wade Sylvester & Jason Fletcher – Music & Narration by Wade Sylvester
Charles Hayden Planetarium, Museum of Science. Copyright 2014. All Rights Reserved.

Fulldome Interviews: Shane Mecklenburger – Art/Science Course at OSU

[Image: teaching the class at OSU]

Teaching 3D animation is no simple task. Especially if the center of interest is astronomy and fulldome! So in September 2013, I was asked to teach my astronomy visualization techniques as a visiting artist at OSU.

Shane Mecklenburger is the 3D animation professor behind a class called “The End & the Beginning of Everything”, which received a Battelle Endowment. Participating OSU astronomers and astrophysicists paired up with the animation students to create a dialogue and help inspire their artwork.

It’s interesting to note that Shane wanted to be careful not to focus purely on data visualization. Instead he wanted the students to use the techniques to make their own artwork inspired by real astronomy and astrophysics research. This allowed the students the creative freedom to go in any direction, with the foundation of astronomy to work from. Since I’ve spent every working day for years creating outer space imagery, it was a natural fit to give a few technical workshops.

The students’ work will be exhibited at the Adler Planetarium in Chicago sometime in 2014 or 2015. They have also been experimenting with fisheye rendering and viewing the results in the OSU Smith Laboratory Planetarium, which was recently refurbished with a 2.6k projection system.

In the first workshop we covered star fields, galaxy fields, and close-up fluid suns. Then in the second workshop we covered 3D nebulae, hall of mirrors, and DomeAFL installation. It was a wonderful experience to share my techniques with students and see them learn so quickly.


Class Description

The End & the Beginning of Everything is a collaborative art-science initiative between the OSU Departments of Art and Astronomy, the University of Chicago Department of Astrophysics, Chicago’s Adler Planetarium and the Advanced Computing Center for the Arts and Design (ACCAD). Accelerating technologies are amplifying astronomers’ ability to model and observe, expanding our understanding of life and the universe. This initiative guides young artists in creatively interpreting astronomical research for a public contemporary art exhibition.

[Image: supernova artwork by Shane Mecklenburger]


Interview with Shane Mecklenburger:
3D Animation Professor at OSU

Over the years, I’ve noticed a gradual increase in the offerings of college classes that focus on astronomy art or fulldome. So I wanted to interview Shane and hear his thoughts about this experience.

Can you share a little about yourself as an artist and teacher?
Lately my projects are exploring exchange and simulation. I’m also interested in the origins and effects of science and technology, which are often tangled up with real or imagined conflict. I teach courses on cross-disciplinary collaboration and new media art in the Art & Technology area in the Department of Art at The Ohio State University. One of my courses is Experimental 3D Animation, and another is called ArtGames, which is about using games as a format for making artwork. Also, this semester I’m teaching a new graduate level seminar I developed called APPROACHING SYSTEMS.

What inspired you to create a class combining 3D animation and astronomy?
I was working on a project called The End & The Beginning of Everything, where I’m collaborating with astronomers who make simulations of supernova explosions. It occurred to me this would be fun to do with my Experimental 3D Animation class, so I wrote a grant to connect them with astronomers at OSU to make artworks based on their research.

You mentioned to me that visualizing orbiting planets was a default starting point for the students. Once you were able to break the students of that, how have their perspectives changed in terms of what to visualize? Was there a tipping point?
I guess the tipping point was when I banned spheres from their animations! I did it after the first project, as a challenge to get them thinking in more abstract ways. The point of the course is to make video artworks that aren’t simply science visualizations. So the course is meant to develop the non-literal, symbolic dimension of students’ art making abilities while learning about astronomy.

Do you think this class has changed how the students perceive astronomy?
I suspect it might have, to differing degrees depending on the student. It’s hard not to come away with a new perspective after spending time at an actual astronomy research facility, talking to astronomers, and coming to understand the phenomena and theories they’re researching. I know it happened for me.

What has been challenging in teaching this class?
Just about everything! Many of the students had no experience with 3D animation, and we were working with very advanced animation techniques like dynamics and fluid simulations. We were also making soundtracks and using a render farm to produce fulldome masters, so the technical hurdles were extreme, and I was learning most of the fulldome techniques myself on the fly. Dealing with these kinds of problems is what makes teaching fun for me, and I hope they also demonstrate for students how complicated and unpredictable this kind of work can be.

What do you hope the students will take from the course?
Three things: First, I hope they come away with artwork they’re proud of. I also hope students emerge with a better sense of their own unique creative approach, style and voice. Finally, I hope they come away with a sense of the special challenges and importance of cross-disciplinary collaboration.

Do you think a fulldome specific class at OSU has merit?
Absolutely. As a pilot/test, I feel this course has demonstrated that, while there are still technical challenges to sort out, the interest is there from both students and faculty.


Student Artworks

360° Video Fundamentals

I have always been excited by the possibility of 4k video in a planetarium dome. And so I was captivated by the recent introduction of a 360° video camera rig with 8x4k resolution (which translates to 4k domemaster resolution). It also meant that I could increase the fisheye FOV from 180° to 220° and see the immediate ground surrounding the camera. In my opinion this makes for a heightened immersion experience. So I have spent the last two months experimenting and learning firsthand about the intricacies of shooting 360° video.

The 360Heros H3pro10HD is a 3D-printed object, meaning it’s one solid piece of plastic precisely engineered to fit 10 GoPro cameras into the smallest possible space. It’s printed using aircraft-grade plastic, so it’s durable and has been through a strenuous bend test to prove its strength over time.

Currently the 360° video community is tiny and little documentation is available. So I was on my own to figure out the potential problems, shooting subtleties, and overall workflow. This can be a tedious and nerve-wracking process. After all, with 10 GoPro cameras shooting in unison, something is bound to go wrong at some point. So plan within plans within plans, theorize contingencies, and take notes on your experience. And now, for you brave souls remaining, below are my own findings, tips, and thoughts.


Hardware Rundown

Camera & Memory
— H3Pro10HD Rig
— 360Heros Aluminium Mount
— GoPro HERO3+ Camera – Black Edition (x10)
— Lexar 32GB microSDHC 600x [LSDMI32GBSBNA600R] (x10)
— Manfrotto 496RC2 Ball Head with Quick Release
— Headless Tripod
— Manfrotto 200PL-38 RC2 Plate

Extra Batteries & Charging
— Wasabi Power Battery [2-Pack] and Charger (x10)
— PowerSquid D080B-008K (x2)
— 7-Port USB Charger (x2)

Transport
— Pelican 1550 Case with Foam
— Fotodiox GT-H3Lens-Cap GoTough (x10)

Stitching & Rendering
— Autopano Giga & Autopano Video Pro [known as AVP in this blog post]
— GeForce GTX 770 4GB
— 360Heros File Data Manager


Big Picture Workflow

  • Setup the Rig
  • Shooting Workflow
  • Import & Check the Footage
  • Stitch & Render

There are different mindsets and considerations depending on which action you are performing. So I’ve split this information into 4 main sections.


Setup the Rig

  1. Put cameras into rig (matching camera # to rig #)
  2. Press wifi-button on each camera
  3. Check that each camera’s recording settings are correct: Example Display

It’s a bit of a trick to get all the GoPro cameras installed without touching the lenses. But having the plastic rig locked onto a tripod helps. I chose to permanently install the aluminium mount so that 7 of the GoPros are looking at the horizon. Since I shoot for a dome, this can ease stitching pains later on, since often the objects of interest are in these camera zones. This leaves 1 camera pointing straight up and 2 pointing down at the tripod. Interestingly, if a camera fails then I can trade it out for camera 9 or 10 and still not lose any essential zones. So there is some redundancy there. I could also remove cameras 9 & 10 altogether and still capture 280° of footage.

[Image: setting up the rig at NASA Goddard]

Also, removing the cameras from the rig is difficult due to the tightness of the plastic camera clamps. But this isn’t a complaint; it’s actually a good sign that everything is snug and won’t wobble during shooting. To remove the cameras you must use a small flathead screwdriver (included with the 360Heros rig) to softly wedge the plastic tab up from its locked position.

It’s important to number each of the cameras and the memory cards with a Sharpie. If a camera begins to act strangely then I can easily trace it back to the source. It also allows me to understand exactly where a camera is within the rig, which can be helpful when the automatic stitching doesn’t work. This is rare, but can happen if the video frame is of a pure blue sky with no overlapping content to match up.

The first time you set up the cameras, you will need to configure the default video recording settings for each camera by hand. From that point forward the cameras will remember those settings. It is absolutely vital that each camera has the exact same settings.

Camera Settings for Best Resolution
— Protune: On
— Resolution: 2704×1524
— Aspect Ratio: 16:9
— FPS: 30
— Low Light: Off
— White Balance: Cam RAW

[Image: the full camera settings display]


Shooting Workflow

  1. Power ON the cameras with the wifi remote
  2. Check each camera for corrupt symbol (rare)
  3. Press Record on wifi remote
  4. Quickly check each camera for red blinking recording lights
  5. Motion sync the cameras. Twist tripod stem quickly. VERY IMPORTANT!

What’s the need for motion sync? Well, the wifi remote doesn’t trigger all the cameras at the same moment. So instead AVP must sync each of the video clips itself; otherwise your footage could be offset on the timeline and look terrible. There are two options, audio sync or motion sync, but motion sync is more precise. All you have to do is quickly spin the whole rig back and forth to provide the necessary calibration data for AVP to use later. The audio sync needs a sharp fast snap to sync to, like a dog training clicker.

A wonderful aspect of using GoPro cameras is that one wifi remote can trigger all 10 cameras. So the cameras can be powered ON/OFF and recording started/stopped remotely. The remote indicates how many cameras are connected.

Because the camera lenses are not at the same nodal point, you’re undoubtedly going to see parallax errors when stitched. There are some techniques to address this in AVP, but it’ll take time and hand polishing. To combat this upfront, it’s wise to constantly be thinking about parallax when planning a shot. My suggestion is to keep the camera rig at least 10 feet away from everything. That can be tricky and not always an option, but it’s the guideline I try my best to follow. Know that the larger the environment you shoot, the easier your stitch job will be.

The most difficult aspect of shooting for a dome is keeping the camera movement buttery smooth. Any subtle bumps in the footage will be greatly amplified on the dome. I tested a wheelchair, and the motion was smooth but not perfectly parallel when moving forward. I also tested a motorized wheelchair, and that produced satisfying results at the slowest setting. But real perfection comes with a tracked dolly. Yet then you run into the issue of seeing the track within the 360° shot. So I used the Singleman Indie-Dolly with 12 feet of track, and that was just short enough to not show within 240°-ish of the footage. It’s incredibly smooth, and the smallest movement forward looks stunning in the dome.

[Image: tracked dolly shot at NASA Goddard]

Perhaps the most frustrating aspect is the GoPro cameras themselves. GoPros are not exactly known for their dependability. Some things to watch out for:

  • Camera battery life leaves me wishing for more. 1h 30m of shooting time is pretty good, but I haven’t timed it when continually turning the whole rig ON/OFF. So I have a total of 30 batteries charged and ready. I definitely turn off the cameras when not recording, as they eat battery life in stand-by mode. I’ve heard that changing the settings from the default 3 red blinking recording lights to just one can help save some battery life too. Leaving the wifi on overnight (blue blinking lights) will drain a battery within 24 hours.
  • Be sure to use microSD cards that are recommended by GoPro. Some other brands don’t write as fast as they advertise, and this can lead to the camera randomly stopping recording due to a full buffer. I’ve had good results with the Lexar 32GB microSDHC 600x; the footage files have a constant data stream of 45mbits/sec with no skipped frames. But in the future I’d suggest the 64GB version. At the 2.7k settings, a 32GB card holds about 1h 30m of footage, so a 64GB card would double that. Dumping footage takes about 1h 30m of diligent work, and if you’re in the middle of a big shoot, the last thing you want is to be interrupted. I’ve had absolutely no problems with the microSD cards. But if you get an “SD error” on the GoPro, accidentally deleted footage, or an unreadable microSD card, then check out TestDisk.
  • Occasionally a video file will corrupt. You’ll know because the camera display will show the corrupt symbol and you must push a button to continue. So this is something I’ve added into the shooting workflow. It’s not entirely clear what causes this, other than the fact that these high resolutions have a high data rate and sometimes the header doesn’t get written to file. Luckily the video data can be repaired once transferred to a computer. I’ve yet to have a case where the video data is lost.
  • The cameras are not up to par when it comes to long-duration shots. A continuous shot of 30 minutes would make me nervous. I had one occasion where a camera froze, meaning the firmware locked up and the battery had to be removed to reset it. And you’ll know it’s frozen, as no buttons will function: either it won’t power on, won’t begin recording, or stops recording mid-shot. So as a precaution, I always check for the red blinking recording lights at the beginning and end of a shot. Upon freezing, the red recording lights stop blinking and the camera is no longer recording video. But any footage prior to the freeze is retained (though it might be corrupted).
  • If the footage count on the front camera display isn’t identical across all of the cameras, then one camera wasn’t recording during the shot, and that channel will be missing for stitching. Either the camera didn’t trigger via the wifi remote or its firmware froze. So again, always check for the red blinking recording lights at the beginning and end of a shot. It’s obviously wise to keep notes of these mishaps in your shot log.
  • Even with all that said, they perform pretty well. For example, out of 57 total shots: 4 shots had corrupted files, which were all later repaired in full. 1 essential camera had a battery die and lost half the shot. Twice a camera wouldn’t sync with the remote, but it was a non-essential camera facing the tripod and recorded fine without it (it needed a battery reset). And once an essential camera froze up after I had started recording, and the shot is unusable. So obviously, get multiple takes of each shot!

Shooting for a dome means that I don’t actually need the complete 360° of footage. So I use a tripod to increase stabilization. Some people use a monopod with tiny tripod legs. That is useful to capture as much FOV as possible, and it actually opens up the possibility of complete 360° coverage with some trickery in post-production to hide the monopod itself. But for moving shots, a tripod is required.

Keep a shot log while shooting on location! It will be invaluable since you will undoubtedly forget the shot locations, problems, noted bumps/wobbles, bad takes, reminders, and such. Be diligent about this, as having a shot log will make your importing process much smoother.

Each of the cameras is locked to auto-expose. And I mean permanently locked off from user control; it’s just the way GoPros have always been. But luckily AVP is extremely powerful in its exposure/color correction. Yet it can be difficult to deal with when you’re very close to a bright light source and only one camera has a wildly different exposure value. Even so, AVP can correct between pretty serious exposure differences, since the GoPro cameras have excellent latitude.

An exciting option is to shoot in slow motion at 240 FPS. Using this frame rate will reduce your final output to 1k fisheye (2x1k equirectangular). Or shoot at 120 FPS and output 2k fisheye (4x2k equirectangular).

The bottom of the 360Heros aluminum mount has a 3/8″ screw hole (contrary to the typical 1/4″). So you will need a special quick-release plate to put it on any tripod.

Battery Runtime Test
— Official battery, using wifi remote, recorded 2.7k footage for 1h 32m
— Wasabi battery, using wifi remote, recorded 2.7k footage for 1h 19m


Import & Check the Footage

Wherever you shoot, you’re going to need a laptop and external hard drive to dump the video data. For each hour of video taken, you will need about 340GB of free space.

Using an external USB 3 microSD memory card reader is an absolute requirement. Since each card can hold 32GB, you’ll want to be able to dump the data quickly. Dumping the files straight from the GoPro cameras isn’t an option, since USB 2 speeds are too slow. Even at USB 3 speeds, it still takes about 1h 30m to dump all the microSD cards to disk.

Since each camera creates footage with similar filenames, it’s imperative that you rename the video files as they are copied to your computer. Otherwise you risk overwriting data. But renaming by hand isn’t needed if you use the 360Heros File Data Manager to dump the video files for you. I underestimated the usefulness of this command line tool. It automates the process of appending the camera number and project name to the video filename, and then it dumps the footage into a folder. Here are the quick instructions for its use. Don’t worry: although it doesn’t have a GUI, all you need to know is how to cd via the command prompt.
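If you ever need to script that dump step yourself, the core idea is simple enough to sketch. Below is a minimal, hypothetical Python version of the rename-on-copy behavior described above; the paths, project name, and filename scheme are my own placeholders, not the actual output format of the 360Heros tool.

```python
import shutil
from pathlib import Path

# Hypothetical paths -- adjust for your card reader and project drive.
CARD = Path("/Volumes/GOPRO_CARD")       # mounted microSD card
DEST = Path("/Volumes/Footage/NASA360")  # project dump folder
PROJECT = "NASA360"                      # project name to append
CAMERA = 3                               # the number sharpied on this card

DEST.mkdir(parents=True, exist_ok=True)

# GoPro HERO3+ clips live under DCIM/ and are named GOPRxxxx.MP4
# (FAT32-split chapters of a long clip get GPxxyyyy.MP4 names).
for clip in sorted(CARD.glob("DCIM/*/*.MP4")):
    new_name = f"{PROJECT}_cam{CAMERA:02d}_{clip.name}"
    # copy2 preserves the file's modified time, which the
    # Date Modified sorting trick below relies on.
    shutil.copy2(clip, DEST / new_name)
    print(new_name)
```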

Finally it’s time to see how the footage actually turned out. Hopefully you have kept a shot log to help inform this process. But I always create a spreadsheet and keep track of the following per shot:

  • Concatenate needed? Due to the FAT32 memory card format, any shot that reaches 3.8GB will be automatically split into separate video files (or at the 2.7k settings: every 11m 38s). See the concatenation sketch after this list.
  • Corrupted video file? Infrequently a video file will corrupt, meaning the header wasn’t written to file. Luckily the video footage can be repaired.
  • Missing channels? Rarely a GoPro camera will freeze up: either it won’t start recording or it freezes during a shot. Hence the need to check for the blinking record lights at the beginning and end of a shot.
  • Shot length match between cameras? If a camera battery dies in the middle of a shot, then you’ll still get the footage up to a certain point for that camera. But obviously the rest of the cameras will continue recording.
  • Shots organized? Each collection of 10 related video files needs to be placed into its own folder to keep individual shots organized for stitching. So from the huge list of dumped files, you must figure out which files belong to one particular shot. This is definitely tedious, and especially confusing with files that need to be concatenated. My shortcut is to sort the files by Date Modified and keep the Length attribute visible.
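For the concatenation step, one approach I know works is ffmpeg’s concat demuxer, which rejoins the FAT32-split chapters losslessly via stream copy (no re-encode). A minimal sketch; the filenames are hypothetical, and the chapters must be listed in recording order:

```python
import subprocess
import tempfile
from pathlib import Path

def concat_chapters(chapters, output):
    """Losslessly rejoin FAT32-split GoPro chapters using
    ffmpeg's concat demuxer (stream copy, no re-encode)."""
    # The concat demuxer reads a text file listing the inputs.
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        for c in chapters:
            f.write(f"file '{Path(c).resolve()}'\n")
        list_file = f.name
    subprocess.run(
        ["ffmpeg", "-f", "concat", "-safe", "0",
         "-i", list_file, "-c", "copy", str(output)],
        check=True)

# Example: camera 3's shot was split into two ~3.8GB chapters.
concat_chapters(["NASA360_cam03_GOPR0012.MP4",
                 "NASA360_cam03_GP010012.MP4"],
                "NASA360_cam03_shot05_full.MP4")
```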

Stitch & Render

Finally you have all your shots organized and ready to stitch in AVP. What’s amazing is that you don’t need an AVP template for the specific camera rig layout. Just drop all the videos into AVP, synchronize, and stitch. It uses the overlapping content between each video to automatically create a 360° projection. It’s not always perfect, but it’s damn good most of the time.

Here are a few sources that were helpful in learning Autopano Video.
Autopano Video Wiki Documentation
Autopano Video 101
Autopano Video Forums
360Heros FAQ

Final Render Resolution
AVP can render whatever type of projection you need to a frame sequence. Full resolution examples below. (These are not polished renders, just the initial automatic AVP stitch.)
— Equirectangular (Spherical): 8192x4096px
— Fisheye (FOV 180°): 4096x4096px
— Fisheye (FOV 220°): 5000x5000px

I was happily surprised that increasing the FOV to 220° increases the final render resolution to 5k. This is possible because more footage is viewable at a higher FOV, so you are effectively “squishing” more pixels into the fisheye image. So when I project it onto the dome at 4k, I get the added benefit of increased sharpness, since the video has been downscaled.
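A quick back-of-the-envelope check shows why: the angular resolution (pixels per degree) is roughly constant across these output projections, so widening the FOV simply yields a larger image:

```python
# Pixels per degree for each AVP output projection listed above.
projections = {
    "equirectangular 8192x4096": 8192 / 360,   # ~22.8 px/deg
    "fisheye 180deg, 4096px":    4096 / 180,   # ~22.8 px/deg
    "fisheye 220deg, 5000px":    5000 / 220,   # ~22.7 px/deg
}
for name, ppd in projections.items():
    print(f"{name}: {ppd:.1f} px/deg")

# All three sit near 22.8 px/deg, so bumping the fisheye FOV from
# 180deg to 220deg grows the render from 4k to roughly 5k.
```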

I render using the Equirectangular (Spherical) projection because often I want to add slow movement to the virtual camera in After Effects with the Navegar Fulldome Plugin.

The only difference between Autopano Video and Autopano Video Pro is the GPU render option, which makes a huge difference in render times. When rendering at these high resolutions, you need a graphics card with lots of memory. The GeForce GTX 770 4GB has a decent number of cores, lots of RAM, and a good price point.

Final Render Statistics at Full Resolution (Equirectangular)
— CPU: 0.05 render fps – takes 12 hours to render 1m 30s of video.
— GPU: 1.5 render fps – takes 1 hour to render 1m 30s of video. (GeForce GTX 770)

Right when you get the footage, you’ll want to see a quick 1k preview. The good news is that fisheye rendering at 1k in AVP is quite fast, at 10fps on the GPU.

As of AVP 1.6, they have implemented a stabilization algorithm to help smooth out bumps. It definitely reduces bumps by about 50%, and I read that it will be improved over the next few versions. They are also adding a timeline which can keyframe cameras and color, which means you can interpolate between stitches. That is powerful when trying to fix parallax on a moving dolly shot. It’s also very helpful when trying to fix an unruly camera whose exposure is way off from the others (such as a camera pointing at the ceiling).

When using GoPro cameras, the stitching can be optimized: the camera sensors aren’t perfectly aligned/centered with the lens, and therefore each has its own lens distortion model. So you can pass AVP this information in the Lens Distortion Correction settings. As of version 1.6 the settings have been moved and updated.

The video and render examples showcased in this blog post do not yet have their stitches hand polished in AVP. They only use the initial automatic stitch, because I just wanted to see the results quickly. So any obvious seams and parallax errors you see could be fixed with some hand polishing. I’m still learning how to best use the software. But you need to run a lot of experiments to get polished results, and possibly even composite multiple AVP renders in After Effects to get the most seamless results.

The GoPro cameras do a great job of capturing color and have excellent exposure latitude. But a photo filter in After Effects is often needed to color correct for tungsten lights or daylight. Then the colors need a big saturation boost to look optimal in the dome. Curves can enhance dark colors instead of using contrast. And maybe a slight touch of sharpening.

This AVP explanation could easily be a whole write-up of its own. Perhaps in the future I’ll share my experience once I’ve learned more. I’m currently in the process of stitching 57 shots (900GB total unprocessed) taken at the NASA Goddard Space Flight Center. It was an intense 3-day shoot, and we got some amazing 360° shots that I can’t wait for you to see in our upcoming planetarium show production.

Lastly, if you’re shooting with the stereoscopic 360heros rig, then you will have a slightly different workflow in AVP. I don’t even want to fathom the bizarre difficulties with 3D 360 video.

Both the normal and stereoscopic 360 videos can easily be experienced on the Oculus Rift (using Kolor Eyes, which is free).


And that’s the Gist of it

As you can see, it’s not for the faint of heart, as it demands a pretty wide range of technical knowledge. But the results can be simply stunning. This is a newborn medium and there is a lot of room to grow. It’s an exciting time for fulldome show producers, hyperdome possibilities, and the Oculus Rift community. 4k video in the dome has long been a dream, and I think this is a bold step into a fascinating frontier.


Addendum

Update: November 21, 2014 – Looking for a simple description of 360 video and how it works? David Rabkin and I wrote a paper that was presented at the 2014 GLPA conference: “Spherical Video: You Have the Power”. So if you want the basics without getting too technical, then this is your best bet. Full cost assessments: H3Pro10HD or H3pro7.

Update: November 20, 2014 – Here is an interesting roundup of 4k video cameras in the marketplace. It seems that 360 video is still the best option for creating 4k domemasters, especially since 4k video cameras still don’t have sensors with at least 4k of vertical resolution. So we are still a number of years away from 8k video cameras being mainstream. Yet it’s quite interesting to see 4k become a standard within video camera equipment.

Update: October 30, 2014 – If you are wondering about the new GoPro Hero4 camera… Having each camera shoot at 4k 30fps is amazing, but it comes at a cost. Results from the 360 video community suggest that you can only shoot for 30 minutes before needing to cool down the cameras. And sometimes the Hero4 camera will forcefully shut down due to overheating.

Update: March 12, 2014 – The GoPro cameras have a firmware update available that allows you to shift the automatic exposure through the iPhone app. I’m curious to experiment, though I’m cautious of the added workflow of getting the optimal exposure and updating ALL camera settings for each shot. Also, while it’s too crazy for me (since it voids your warranty), there are ways to hack a GoPro camera to truly lock down the exposure.

Jupiter Bands Simulation

Just a few days ago, I found out about an MIT competition called The Art of Astrophysics. Naturally my interest was piqued. The only problem was the deadline… 24 hours!

As a back-burner project I’ve been experimenting with creating Jupiter cloud bands that are truly fluidic. It’s very difficult to keep multiple bands separate, so I’ve been testing the interaction between just two bands. The Kelvin–Helmholtz instability is a fascinating topic to focus on, but it’s heavy to simulate and difficult to predict.

Often when I begin these types of experiments, I’m on a grand impossible mission. Those make for the most interesting challenges, right? And this time I was on a mission to create Jupiter or a hot Jupiter as a fluid dynamic system. In the past I’ve created a hot Jupiter animated texture in After Effects, using the turbulence displacement and twist effects. But I wanted a system that was self-reliant. And so, as always, I get stuck on the quest for that holy grail and need a jolt to remind me that often the steps along the way are quite interesting.

So I selected one particularly beautiful simulation in Maya, cached and rendered it, and then spent the rest of the evening tweaking it in After Effects. It was fun trying to burn out and degrade the footage to make it appear as if it were raw data from a satellite or space telescope. A bit far-fetched, I know, to see an extreme zoom of a hot Jupiter, but hey, I had a vision and had to run with it. So in homage to that fact, I appropriately named it ‘The (Fictitious) Formation of a Hot Jupiter Cloud Band’.

You might be wondering why I cropped the video in this strange boxed style with burnt edges and film grain… Well, I was trying to mimic the style of raw satellite data. I actually used this Jupiter image as the source for the border and overall inspiration. I’m not sure if it’s simply a stitch of several images, the sensor blocking out specific light, or some such. But I figured it would be a realistic addition instead of the all-too-perfect Maya render.

[Image: Jupiter simulation compositing process]


I’m not going to explain all the details of this Maya simulation, because honestly it’s just a lot of trial and error. Lots of playblasts that need to compute overnight. But see below for the notes, which can serve as an introduction for further experimentation.

DOWNLOAD
maya scene – jupiter bands simulation

— The fluid box is wrapping left/right, with the top/bottom closed off.
— There are two volume emitters for the top and bottom halves that are keyframed to emit density and velocity at the start frame and then turn off quickly.
— There are also two “wind” emitters on the sides to keep the incoming velocity at a fixed rate. They use the replace method and have speed emission with a low rate. The target speed is determined by the directional speed attribute. The rate is the amount that changes the velocity of the fluid toward that target speed. So the rate is kept low in order to keep things going without influencing the simulation too much. But you can confine the shear region more by increasing the rate.
— There is a very small amount of negative density buoyancy to keep things from mixing too much over time. Increasing the strength of the buoyancy will tend to force the boundary closer to the center, provided the solver quality is high enough; otherwise it will collapse to the bottom. Again, tricky business.
— A high detail solve and a high grid resolution are used to get more fluid detail. The higher the grid resolution of the fluid, the more substeps and solver quality are generally needed. Substeps also determine how fast the fluid flows, so it’s a tricky balance. If there is still too much diffusion, try a yet higher grid resolution, more substeps, and higher solver quality. A higher resolution, higher solver quality, and more substeps should also create more small-scale eddies.
— For a little extra detail you can play around with the density noise, density tension, and gradient force. Try tiny values between 0.01 and 0.05. A maya.cmds sketch of these settings follows below.
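To make those notes concrete, here is a minimal maya.cmds sketch of the settings described above. It assumes a fluid container named fluidShape1 already exists (created from the Fluid Effects menu), all values are illustrative rather than taken from my actual scene file, and the boundary enum indices should be verified in the Attribute Editor for your Maya version:

```python
import maya.cmds as cmds

fluid = "fluidShape1"  # assumes the container was already created

# Wrap left/right, close off top/bottom (enum indices are version
# dependent -- verify "Wrapping" in the Attribute Editor).
cmds.setAttr(fluid + ".boundaryX", 4)   # assumed: 4 = Wrapping
cmds.setAttr(fluid + ".boundaryY", 1)   # assumed: 1 = Both Sides

# A very small negative density buoyancy keeps the bands from mixing.
cmds.setAttr(fluid + ".densityBuoyancy", -0.05)

# Higher solver quality, substeps, and resolution for small-scale eddies.
cmds.setAttr(fluid + ".highDetailSolve", 3)    # e.g. All Grids
cmds.setAttr(fluid + ".substeps", 4)
cmds.setAttr(fluid + ".solverQuality", 40)
cmds.setAttr(fluid + ".baseResolution", 400)   # with Keep Voxels Square on

# Optional extra detail -- tiny values, per the notes above.
cmds.setAttr(fluid + ".densityNoise", 0.02)
cmds.setAttr(fluid + ".densityTension", 0.02)
cmds.setAttr(fluid + ".densityGradientForce", 0.02)
```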

I couldn’t have gotten this far without the help of Duncan Brinsmead, Principal Scientist at Autodesk. I contacted him because I was curious about his approach to this type of simulation, and his response was insightful.

The song in the video is In the Hall of the Mountain King by Edvard Grieg. I am amazed at the beautiful diversity of classical music recordings that are public domain licensed on the MusOpen website. What a serendipitous boon finding this!


More about The Art of Astrophysics Competition:
Astrophysicists try to share the mysteries of the Universe around us in a clear and understandable fashion, but we don’t always succeed. It’s a hard challenge – the wonders of the Solar System, the Galaxy, and the ever expanding Cosmos demand more of our imaginations than can be captured by numbers in a table or terms in an equation. However, a work of art can uniquely inspire us to look closely, to dream freely, to understand openly – anything from the smallest curiosity to the biggest discovery.

So, we’re asking members of the MIT community to create works of art that help us visualize our Universe and how we observe it. Whether you’re a photographer or a poet, a crafter or a coder, a musician or a moviemaker, we want you to use your talents and creativity to illuminate the beauty of astrophysical results. Please consider participating in this year’s Art of Astrophysics competition during MIT’s 2014 Independent Activities Period (IAP), sponsored by the MIT Kavli Institute for Astrophysics and Space Research.

View All of the Art Submissions — View the Winners

Fulldome Artist in Residence: Call for Applications

[Image: Vortex Dome LA]

I’m excited to see the launch of a fulldome artist residency to help open up the medium to artists. I’m particularly pleased that dome production training is a major part of the program. With so few workshops and college classes offered around the world, it’s difficult to break into the medium on top of all the technical hurdles and the challenge of getting dome access. The fulldome world is a narrow but growing market, so this is an important resource for letting artists experiment on an actual dome. More info below.


Vortex Immersion Media is now accepting applications for its first international artist-in-residence program, located in Los Angeles, California. The program is for artists interested in experimenting with and exploring fulldome technology and immersive experience. Artists will receive hands-on training provided by our Creative Director, access to the dome to work on projects and test concepts, one-on-one consultation, and a public presentation of their experience at the end of the residency. The Vortex Immersion Media fulldome artist in residence will run June 1 – 30, 2014.

Vortex Dome LA is a state-of-the-art facility located at the heart of L.A. Center Studios in downtown Los Angeles. The dome is 50 feet in diameter with a four-projector system displaying a 2k seamless image. Vortex’s custom-built software allows for on-the-fly slicing for drag-and-drop playing of dome masters, real-time video mixing, multi-image display, and external sensor integration for interactive works. Vortex Immersion Media specializes in production and design for effects, gaming, interactive, experiential themed entertainment and immersive dome installation.

Ethan Bach, Creative Director, is internationally known for his media art which is primarily in immersive and interactive media. He received his MFA in Electronic Arts from Rensselaer Polytechnic Institute. Bach was the Digital Dome Director at the Institute of American Indian Arts where he served as Principal Investigator for a DoD grant developing interactivity for fulldome and as research associate for a National Science Foundation grant developing tools and content for fulldome environments. Bach has run AIR programs for domes as well as curated shows for such events as the International Symposium on Electronic Art (ISEA), the New Media Caucus and Currents International New Media Festival.

Ethan Bach and Vortex Immersion Media came together to find a solution for the growing need of artists to work in fulldome facilities. There is an equal need within the dome community for new content and expressions in this fast-changing medium. Recent innovations in dome technology allow for a variety of visual and aural expressions that turn the dome into an artist’s playground. “Please come join us as we explore this technology.”

Support: Artists receive 16 hours of training in dome production, up to 3 hours per week of dome time for testing concepts, one hour per week of consultation with the Creative Director, a public presentation accompanied by a curated dome art show, and the opportunity for a licensing agreement for distribution of their work.

Costs: $3,000 for an individual and $1,500 for each additional artist in a group (up to three per group). For example, two artists in a group would cost $4,500 and three would cost $6,000.
Location: Los Angeles, California
Duration: June 1, 2014 to June 30, 2014

Submission Deadline: March 14, 2014 11:59pm PST
http://opps.residencyunlimited.org/2014/vortex/fulldome-artist-in-residence/

Background Stars v2

[Image: example render]

(This post is an update to Background Stars v1; that earlier post provides some context.)

A while back I shared a ‘star globe’ which has the night sky mapped onto a sphere. This can be used to completely surround a Maya scene with stars. Andrew Hazelden and I often collaborate on various fulldome projects, and he had an idea for re-engineering the star globe to require only 1 surface and 1 file texture. This allows for a vast improvement in render time. For instance:
— 4-poly & 4 texture star globe – 1m 40s
— 1-poly & 1 texture star globe – 30s

A bunch of other improvements are included:
— Fixes the grey blurry line glitch since it uses a Mental Ray texture network.
— A 2k texture for previewing in the Maya viewport, with an 8k texture used for renders.
— Other lights in the scene will not affect the star globe.
— Star globe never casts shadows.
— Star globe will automatically show up in reflections & refractions.
— Renders faster, since the single texture needs much less initialization and the poly count is reduced.
— Here is a detailed explanation of how these things are achieved.
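For anyone rebuilding something similar by hand, the shadow and lighting behaviors come down to the shape’s render stats plus a shader that ignores scene lights. Here is a hedged maya.cmds sketch of those two ideas. Note the actual star globe uses a Mental Ray texture network, while this simplified version uses a plain file texture, and names like starGlobeShape are placeholders:

```python
import maya.cmds as cmds

globe = "starGlobeShape"  # placeholder name for the star globe's shape

# Render stats: never cast or receive shadows, but stay visible
# in reflections and refractions.
cmds.setAttr(globe + ".castsShadows", 0)
cmds.setAttr(globe + ".receiveShadows", 0)
cmds.setAttr(globe + ".visibleInReflections", 1)
cmds.setAttr(globe + ".visibleInRefractions", 1)

# A surfaceShader ignores all scene lights, so other lights in
# the scene cannot affect the star globe.
shader = cmds.shadingNode("surfaceShader", asShader=True, name="starGlobeMtl")
tex = cmds.shadingNode("file", asTexture=True, name="starGlobeTex")
cmds.setAttr(tex + ".fileTextureName", "starglobe_8k.jpg", type="string")
cmds.connectAttr(tex + ".outColor", shader + ".outColor")

# Build a shading group and assign the globe to it.
sg = cmds.sets(renderable=True, noSurfaceShader=True,
               empty=True, name="starGlobeSG")
cmds.connectAttr(shader + ".outColor", sg + ".surfaceShader")
cmds.sets(globe, edit=True, forceElement=sg)
```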

DOWNLOAD
maya scene – star globe
(also available in the Domemaster toolset)

[Image: star globe screenshot]

What Tau Sounds Like – Fulldome Short

Sometimes you just gotta act on inspiration. Upon first watching What Tau Sounds Like, I was inspired by the use of multiple camera viewpoints all connected by music. Just being able to see all the layers fit together and understand how one guy made this beautiful song. And I saw the potential for a unique portrayal of the music and math visually intertwining…

So as an initial music show sketch, I re-edited the video piece for an immersive fulldome environment. While I generally don’t like to put rectangular video footage on the dome, I figured I should break my own rules and see how far I could push the idea. It was an interesting first experiment in compositing video footage into an immersive experience.

What Tau Sounds Like was originally created by Michael Blake. The song is available on the iTunes store. I did not record the video or audio, I just adapted and enhanced it for the dome. Tools used in After Effects: Navegar Fulldome Plugin, Escher’s Droste Plugin, and Fractal Explorer Plugin. The footage of the children running is originally from a PSA: Save the Children – Running Around the World.


1k Quicktime Freely Available

  • Available for planetarium use. 1k Quicktime MOV available for download by request (560MB). Please contact me if interested.

Screenings

USA Planetariums
— Fort Collins Museum of Discovery Planetarium (Fort Collins, CO)
— Sudekum Planetarium, Adventure Science Center (Nashville, TN)
— Maynard F. Jordan Planetarium, The University of Maine (Orono, ME)
— Arthur Storer Planetarium (Prince Frederick, MD)

International Planetariums
— Immersive Vision Theatre, Plymouth University (Plymouth, UK)