Interviews at IMERSA 2016 – Recent Challenges

It’s been fascinating to see IMERSA evolve and mature over the last few years. And since things are moving so fast, I wanted to document the challenges being faced in the immersive community with a series of interviews.

Last year at IMERSA there were nervous murmurings of VR, but this year there is clear excitement. It’s particularly interesting to see fulldome producers realize that they already possess the tools, skills, and ultra high resolution workflows to create polished VR experiences.

IMERSA 2016: View Presentation Recordings

Select presentations are available on the IMERSA Vimeo page. Below are my favorites:
How Are Museums & Educators Using VR-AR Today / The VR and AR Explosion
Immersion Expanding: New Opportunities for Immersive Experiences
Challenges and Strategies for Producers
The Future of Immersion
Ambisonics Sound Technology
What’s Never Been Seen: Successful Visualizing for Fulldome Storytelling
Shooting 360 Trials and Successes
Proven Methods for a Faster Render
Visual Immersion for Greater Learning Gains in Digital Domes
Let’s Play: Using Games to Entertain and Educate Audiences in the Planetarium
Real Developments in Virtual Reality

Interviewees
— Jenny Carden / Zenka.org
— Greg Downing / xRez Studio
— Troy Whitmer / Sky-Skan
— Jay Heinz / Morehead Planetarium
— David Merrell / Clark Planetarium
— Ken Ackerman / California Academy of Sciences
— Dan Neafus / Denver Museum of Nature and Science
— Orion McCaw / Roundhouse Productions
— Mark Petersen / Loch Ness Productions
— Jay Lamm / Louisiana Art & Science Museum
— Annette Sotheran-Barnett / Sky-Skan

The Dome Dialogues – Andrew Hazelden

An interview with the man who needs no introduction! Andrew Hazelden and I discuss the many vital production tools he has been creating for fulldome and VR. We take an in-depth look at tools such as Domemaster3D, PlayblastVR, RocketComp, and the Domemaster Fusion Macros.

Chapters
0m 6s – Intro
3m 53s – RocketComp
5m 58s – PlayblastVR
17m 8s – Andrew’s History
20m 14s – Domemaster3D
41m 8s – Domemaster Fusion Macros
1h 27m 41s – Maxwell Render Toolbox

Bio
Andrew Hazelden is a visual effects artist and co-founder of Dover Studios. He regularly develops tools, tutorials, and documentation for VR/fulldome production, photography, visual effects, and electronics. He has a passion for sharing knowledge and enjoys writing about the hobby experiments he does on the weekends and the tools he uses every day. A few examples of his wide range of interests: building an underwater ROV, flying a model airplane, compiling a mental ray shader, creating a time-lapse video, and doing stereoscopic 3D photography.

Utilities: Converting Timecode & Frames – Estimate Job Render Time on a Farm

All too often I need to precisely convert between timecode and frames. Or I’ll want a rough time estimate for a render job on the farm. I needed a utility that offered intuitive interactivity, easy GUI creation, and instant editability. So I’m using a simple open source solution: Pure Data.

Pure Data (aka Pd) is a realtime graphical programming environment mainly used in live music performances. But its realtime nature makes it ideal for my needs. So with some careful planning, I applied the math nodes to create tools that automate the conversion process. Basically I’ve created a calculator tailored for animators and filmmakers.

DOWNLOAD
collection of Pd patches

Instructions
— Requires Pure Data to be installed. (supports: Windows, Mac, Linux)
— It’s easy to use. Just click and drag on the number boxes to input your timecode or frames.


Convert Timecode into Frames /// Convert Frames into Timecode
— Assumes you’re working at 30fps.
convert-frames-timecode
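
Under the hood, the math is simple. Here’s a minimal Python sketch of the same conversion the Pd patch performs, assuming 30fps non-drop-frame timecode (the function names and sample values are just for illustration):

```python
FPS = 30  # change this if your project runs at a different rate

def timecode_to_frames(hh, mm, ss, ff, fps=FPS):
    """Convert HH:MM:SS:FF timecode to an absolute frame count."""
    return (hh * 3600 + mm * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=FPS):
    """Convert an absolute frame count back to HH:MM:SS:FF."""
    total_seconds = frames // fps
    return (total_seconds // 3600, (total_seconds // 60) % 60,
            total_seconds % 60, frames % fps)

print(timecode_to_frames(0, 1, 30, 15))  # 00:01:30:15 -> 2715 frames
print(frames_to_timecode(2715))          # -> (0, 1, 30, 15)
```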


Calculate Shot Duration: Input as Timecode or Frames
— Especially useful when adjusting timings between two versions of the animatic and you need to figure out the exact amount of time added or removed.
— Assumes you’re working at 30fps.

convert-duration-frames-timecode
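
The duration calculation is the same conversion run twice, followed by a subtraction. A quick sketch, again assuming 30fps (the in/out points below are hypothetical):

```python
FPS = 30

def to_frames(hh, mm, ss, ff, fps=FPS):
    return (hh * 3600 + mm * 60 + ss) * fps + ff

# Hypothetical out-points of the same shot in two animatic versions.
v1_out = to_frames(0, 2, 10, 0)   # 00:02:10:00
v2_out = to_frames(0, 2, 14, 12)  # 00:02:14:12

print(v2_out - v1_out)  # 132 frames added (4 seconds 12 frames at 30fps)
```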


Estimate Job Render Time on a Farm
— Most useful for estimating the min/max time until the render is complete, which is especially important if you’re worried about bottlenecking your farm and need to prioritize for deadlines.
estimate-render-time
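
The estimate itself is just arithmetic. A rough sketch, assuming every node renders at about the same speed (all numbers below are hypothetical):

```python
import math

frame_count = 1800               # total frames in the job
nodes = 25                       # render nodes assigned to it
minutes_per_frame = (4.0, 9.0)   # observed fastest/slowest frame times

batches = math.ceil(frame_count / nodes)  # frames each node must render
best = batches * minutes_per_frame[0] / 60
worst = batches * minutes_per_frame[1] / 60

print(f"estimated {best:.1f} to {worst:.1f} hours")  # 4.8 to 10.8 hours
```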


How Does it Work? Let’s See the Breakdown
— All the source code is included (within subpatches) and can be edited. So if you’re working at 60fps instead, you can alter it to your needs.

IMERSA Summit 2016: Presentations We’re Giving

imersa-logo-square
We are going to be at the upcoming IMERSA Summit and sharing several presentations. With so much that’s been happening lately in the immersive community, it’s bound to be an exciting conference this year. David, Heather, and I will each be on different panels and giving presentations. Hope you can check out what we’ve been working on! More info below.


Panel: Challenges and Strategies for Producers
Thursday, March 17 at 10:45 AM
Update: Watch a video recording of this talk
A team of panelists will discuss questions of importance to producers: What are the biggest obstacles to creating content for immersive media, specifically fulldome? Our panel of producers, with lots of help from the audience, will consider the answers and propose solutions. We want to hear from you! In this lively audience-led discussion, we will explore your greatest successes and failures in creating and experiencing immersive media.

— Moderator: David Rabkin (Museum of Science)
— Panelists: Robin Sip (Mirage3D), Annette Sotheran-Barnett (Sky-Skan), Mark Webb (Adler Planetarium), Chris Lawes (Fulldome.pro)


What’s Never Been Seen – Successful Visualizing for Fulldome Storytelling
Thursday, March 17 at 04:15 PM
Update: Watch a video recording of this talk
The script is written. The storytelling is effective. But there are calls for visuals of things that don’t exist yet, or real data representations that have never been visualized. Storytelling for immersive fulldome environments has required producers to take the idea of the “artist concept” to new levels. How do we ensure a successful pipeline between the left-brain expert supplying the input and the right-brain creative implementing the visuals to achieve the desired, and most important, accurately told story for the educational goals? Examples of what works… and sometimes what doesn’t.

— Presented by Tom Casey (Home Run Pictures), Jason Fletcher (Museum of Science), Carolyn Sumners (Houston Museum of Natural Science)


Let’s Play: Using Games to Entertain and Educate Audiences in the Planetarium
Saturday, March 19 at 10:15 AM
Update: Watch a video recording of this talk
We’ll present recent experiments with gaming done by the Charles Hayden Planetarium at the Museum of Science in Boston, as well as some plans for the future. We’ll talk about the partnerships we’ve made, the logistics of planning for these events, and some of the technology behind it all. We welcome discussion on what other planetariums have tried, what has worked, and, of course, what hasn’t.

— Presented by Heather Fairweather (Museum of Science)


IMERSA Summit 2016: View Entire Agenda

davidrabkin-jasonfletcher-heatherfairweather
David Rabkin (Planetarium Director)
Jason Fletcher (Science Visualizer)
Heather Fairweather (Science Visualizer)

IMERSA Summit 2016: Announcing Slack

slack-in-the-charles-hayden-planetarium-cropped

Recently I’ve been brainstorming how to better connect the IMERSA crowd. Part of what makes going to the summit so great is the random people you meet. So we are experimenting with ways to connect people of similar interests, and we are hoping that smartphones can be part of the equation.

So we are using “Slack” to continue the amazing conversations that happen at the IMERSA Summit. It’s both an icebreaker into the community and a way for you to keep your ear to the ground. And it’s free.

To join the IMERSA Slack group, please request an invitation.


Slack-logo-transparent
After you’ve registered, be sure to install the Slack app on your smartphone (iPhone or Android). You can also use Slack in any web browser. Then join some channels relevant to your interests and start by sharing a recent project you’ve worked on.

IMERSA.slack.com

Upcoming Special Events in the Planetarium


EinsteinsPlayground-ASlowerSpeedofLight
Image Source: A Slower Speed of Light

Einstein’s Playground

— Thursday, February 11 /// 7:15pm
— Admission $10
— Gerd Kortemeyer, PhD, associate professor of physics at Michigan State University

Have you ever wanted to experience the complete distortion of time and space as we know it? The Charles Hayden Planetarium has partnered with the MIT Game Lab to immerse you in a virtual special relativity playground where you’ll witness the laws of physics in a completely new way. Using the power of video games, we’ll turn Einstein’s most famous theory from an abstract concept into something you can encounter yourself right here at the Museum of Science. Experience the effects of movement, time, and space as you’ve never been able to before!

Tickets on sale beginning January 28 /// (January 26 for Museum members)



AWorldUnderwater-TheReefsofBelize-KeithEllenbogen_5309952
Image Source: Keith Ellenbogen

A World Underwater: The Reefs of Belize

— Thursday, March 24 / 7:00pm
— Admission Free
— Keith Ellenbogen, award-winning underwater photographer and 2015-16 CAST Visiting Artist at MIT | Allan Adams, PhD, theoretical physicist, associate professor of physics and member of the Creative Art Council at MIT

Take an underwater journey to Glover’s Reef Research Station in Belize and immerse yourself in coral reefs! With images and cutting-edge immersive video captured during their January 2016 expedition, Keith and Allan will tell the story of the Mesoamerican reef ecosystem, the researchers working hard to conserve it, and the innovative MIT course behind the expedition in which students from across the institute (chemists, civil engineers, historians, physicists, and poets) learned the art, technique, and technology of underwater conservation photography. Under the Planetarium’s fulldome expanse, experience the thrills, challenges, and serendipity of wildlife photography and explore the role of visual culture as a catalyst for positive social change on our tiny blue planet.

Advance registration beginning March 10 /// (March 8 for Museum members)



StoriesUnderTheStars-Ari-Daniel-Hubble-2013-17-a-large_web-cropped
Image Source: NASA, ESA, CXC and the University of Potsdam, JPL-Caltech, and STScI

Stories Under the Stars

— Wednesday, April 20 / 7:30pm & 9:00pm
— Admission $12
— Ari Daniel, science reporter

Come to the Charles Hayden Planetarium for an evening of live storytelling, radio, and music under the stars. You’ll hear true stories, both personal and inspired by science, that explore the theme of “Light in the Dark,” all unfolding beneath the canopy of our cosmos. Join the search for light during the earliest moments of your life and from the outer reaches of our universe to the inner reaches of the human heart.

Tickets on sale beginning January 28 /// (January 26 for Museum members)
Hosted by science reporter Ari Daniel and co-produced by Ari and the Museum of Science as part of the Cambridge Science Festival.



SpaceStation-ISSCupola-cropped
Image Source: NASA

Space Station

— Thursday, April 21 / 7:30pm
— Admission $10
— Jared Sorensen, game designer

You wake up inside the cramped confines of a cryosleep chamber. You feel weak and dizzy from a prolonged period in cryonic suspension. What will you do next? Join game designer Jared Sorensen and the Charles Hayden Planetarium team as we break new ground in the Planetarium dome. Inspired by the text-parsing games of the ’80s, Space Station allows the entire audience to play a single character trying to survive a dangerous situation… in space! Give commands, explore rooms, examine objects, and try to escape the Space Station, if you can!

Check out the Parsely website for more information about their series of text-based adventure games.

Tickets on sale beginning January 28 /// (January 26 for Museum members)
Part of the Cambridge Science Festival.



CosmicLoops-IanEthanCase
Image Source: David Rabkin

Cosmic Loops

— Wednesday, May 18 / 7:15pm
— Admission $15
— Ian Ethan Case, acoustic double-neck guitars, fretless guitar, live looping | Stephanie Case, live sound design | Bertram Lehmann, percussion | Jeff Willet, gongs and percussion

As you soar through nebulas, galaxies, and star systems in the immersive space under the dome of the Charles Hayden Planetarium, live music with simple beginnings builds layer upon layer into an intricate universe of musical loops created by masters of an evocative style. Acoustic double-neck guitarist Ian Ethan fluidly combines a staggering variety of self-invented playing techniques necessitated by his multilayered compositions, further expanded using real-time live looping technology. Indulge in this rare quartet performance in which gongs and exotic percussion instruments from around the world take Ian’s latest compositions into new dimensions, with the Planetarium team’s transcendent visions overhead.

Tickets on sale beginning January 28 /// (January 26 for Museum members)

Using Real Pluto Imagery – From Dream to Discovery: Inside NASA

In producing From Dream to Discovery: Inside NASA we made the exciting but perilous decision to include the New Horizons mission within our story. So we made an early bet that the mission would be a success…

As you well know, New Horizons has given us an amazing close-up look at Pluto. And so we are excited to announce that we have updated the show to include the latest real images of Pluto and Charon!

Collection of 360° Video Rigs

360-video-rig-collection

360° video is growing by leaps and bounds. There is no doubt about it.

And it’s fascinating to see all the different approaches to capturing 360° video. So I surveyed the current 360° video rigs being offered and then organized every serious option into this epic listing. The results are telling…

If you’re new to 360 video, then you have much to wrap your mind around. I suggest checking out my blog post: 360 Video Fundamentals. Or to gain a comprehensive understanding check out the Making360 open source book.

CATEGORIES
Comparison of 360° Video Rig Categories
360° Map Projection
360° Video Rigs: Multi Camera
360° Video Rigs: Single Camera Body
360° Video Rigs: Stereoscopic
360° Video Rigs: Stereoscopic / Single Camera Body
360° Video Rigs: Scuba
360° Video Rigs: Scuba Stereoscopic
360° Video Rigs: Invisible Drone
360° Video Rigs: First-Person POV
Partial 360° Video Rigs
Cylindrical Video Rigs
Cylindrical Video Rigs: Stereoscopic
Cylindrical Video Rigs: Parabolic Mirror
360° Light Field Rigs
360° Photography Rigs
Fisheye Video Rigs
Fisheye Video Rigs: Stereoscopic
360° Video Rigs: Less than 30fps
360° Video Rigs: Wild and Unique
360° Video Rigs: DIY 3D Printing
Unsuccessful Kickstarter Projects
History of 360° Film

Updated on July 20, 2016


Comparison of 360° Video Rig Categories

There are many different types of 360° video rigs, but not all of them capture the full 360×180° field of view (FOV). And so I’ve placed each rig into a specific category. Sometimes you don’t need to capture absolutely everything. It all depends on how you’re going to use the footage.

For instance, if you’re using a tripod, then perhaps you could ignore that footage zone (partial 360°). If the main focus is happening along the horizon, then you might not need to capture the sky and immediate ground (cylindrical). Or maybe you just need to capture the events happening directly in front of you (fisheye). Or maybe you want the big challenge: capturing 3D depth (stereoscopic).

Below I’ve outlined the typical FOV coverage of each rig category: 360°, partial 360°, cylindrical, and fisheye. But these are just averaged examples of each category; sometimes there are outliers with a much higher or lower FOV.

comparison-of-360-video-rig-categories-20151118
Source Image by Andrey Salnikov: “Climbing Volcano Teide”


360° Map Projection

Equirectangular, LatLong, Spherical, 360°… All of these terms refer to the same exact format: the Equidistant Cylindrical Projection. It is currently the most widely used format for stitched 360° video.

So how does it work? This is a precise geometric method for converting a sphere into a flat image with a 2:1 ratio. So this format can be seamlessly wrapped onto a sphere and back again.

360×180° is the standard for expressing a complete spherical capture. But why not 360×360°? Because we are measuring an arc for the latitude, not a complete circle.

360x180_explanation
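
The pixel math is simple enough to express in a few lines. A sketch of the mapping, assuming an equirectangular image of a given width (with the obligatory 2:1 height):

```python
def latlong_to_pixel(lon_deg, lat_deg, width):
    """Map a direction on the sphere to an equirectangular pixel.

    lon_deg: -180..180 (0 = image center), lat_deg: -90..90 (90 = zenith).
    """
    height = width // 2                    # the 2:1 ratio
    x = (lon_deg + 180.0) / 360.0 * width
    y = (90.0 - lat_deg) / 180.0 * height  # zenith maps to the top row
    return x, y

print(latlong_to_pixel(0, 0, 4096))      # (2048.0, 1024.0) image center
print(latlong_to_pixel(-180, 90, 4096))  # (0.0, 0.0) top-left corner
```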


360° Video Rigs: Multi Camera

These multi-camera rigs capture monoscopic 360° video. A majority of producers are currently shooting with this type of rig.

GoPro: Omni
— full coverage: 360×180°, 6 GoPro cameras
— genlocked
GoPro-Omni

360 Designs: Mini EYE 4
— full coverage: 360×180°, 4 Blackmagic Micro Cinema cameras or Blackmagic Micro Studio cameras (with fisheye lens)
— can be genlocked
360-Designs_Mini-EYE-4

360 Designs: Mini EYE 3
— full coverage: 360×180°, 3 Blackmagic Micro Cinema cameras or Blackmagic Micro Studio cameras (with fisheye lens)
— can be genlocked
360-Designs_Mini-EYE-3

Freedom360: Freedom360 Mount
— full coverage: 360×180°, 6 GoPro cameras
Freedom360-Freedom360-Mount

Freedom360: F360 Explorer
— full coverage: 360×180°, 6 GoPro cameras
— all weather
Freedom360-F360-Explorer

360Heros: PRO6
— full coverage: 360×180°, 6 GoPro cameras
360Heros-Pro6

360Heros: 360H6
— full coverage: 360×180°, 6 GoPro cameras
— all weather
360heros-360h6-rig

360Heros: PRO7
— full coverage: 360×180°, 7 GoPro cameras
360Heros-Pro7

360Heros: PRO10HD
— full coverage:  360×180°, 10 GoPro cameras
360Heros-H3Pro10HD

Freedom360 Broadcaster 3x
— full coverage: 360×180°, 3 GoPro cameras with custom fisheye lens
freedom360-broadcaster-3x

iZugar: Z2X
— full coverage: 360×180°, 2 GoPro cameras with custom 194° fisheye lens
iZugar-Z2X

iZugar: Z3X
— full coverage: 360×180°, 3 GoPro cameras with custom 185° fisheye lens
iZugar-Z3X

iZugar: Z4X
— full coverage: 360×180°, 4 GoPro cameras with custom 185° fisheye lens
iZugar-Z4X

Mooovrig
— full coverage: 360×180°, 5 Canon mirrorless cameras with 180° Samyang fisheye lens
Mooovrig

Kodak: PIXPRO SP360 4k – Dual Pro Pack
— full coverage: 360×180°, 2 PIXPRO SP360 4k cameras
Kodak-PIXPRO-SP360-4k-Dual-Pro-Pack

Elmo: QBiC Panorama
— full coverage: 360×180°, 4 Elmo QBiC MS-1 cameras
— all weather
Elmo-QBiC-Panorama

Freedom360: Elmo360
— full coverage: 360×180°, 4 Elmo QBiC MS-1 cameras
— all weather
Freedom360-Elmo360


360° Video Rigs: Single Camera Body

These cameras capture monoscopic 360° video through the use of a single camera body with multiple sensors/lenses. Since each behaves as a single unit, fewer problems can arise. But they cannot currently match the high resolution of a multi-camera rig.

Sphericam 2
— full coverage: 360×180°, single camera body with 6 sensors
— global shutter, genlocked?
Sphericam-2

Bublcam
— full coverage: 360×180°, single camera body with 4 sensors (190° fisheye lens)
Bublcam-Rig

Nikon KeyMission 360
— full coverage: 360×180°, single camera body with 2 sensors (fisheye lens)
— can go underwater up to 30m or 100ft
Nikon-KeyMission360

Ricoh: Theta S
— full coverage: 360×180°, single camera body with 2 sensors (190° fisheye lens)
Ricoh-Theta-S

LG 360 Cam
— full coverage: 360×180°, single camera body with 2 sensors (200° fisheye lens)
LG-360-Cam

Samsung Gear 360
— full coverage: 360×180°, single camera body with 2 sensors (195° fisheye lens)
— can go underwater up to 40m or 131ft (with case accessory)
Samsung-Gear-360

Luna
— full coverage: 360×180°, single camera body with 2 sensors (190° fisheye lens)
Luna-360-camera

Insta360 4k
— full coverage: 360×180°, single camera body with 2 sensors (230° fisheye lens)
Insta360-4k

Orah: 4i
— full coverage: 360×180°, single camera body with 4 sensors
— genlocked?
Orah-4i

GO6D: Trie
— full coverage: 360×180°, single camera body with 3 sensors
GO6D-Trie

Indiecam nakedEYE
— full coverage: 360×180°, single camera body with 2 sensors (fisheye lens)
Indiecam-nakedEYE

Insta360 Nano
— full coverage: 360×180°, iPhone attachment with 2 sensors (210° fisheye lens)
Insta360-Nano-iPhone-attachment


360° Video Rigs: Stereoscopic

Stereoscopic rigs capture separate 360° video for your left and right eyes, so a true sense of depth can be achieved in VR. 360-3D is the ultimate dream, but many challenges make it difficult to shoot in 360-3D without giving the viewer a very frustrating experience. (Example problems include parallax differences between eyes, exposure differences between eyes, genlocking the cameras, and getting complete stereo coverage without ignoring the poles.) The terms stereoscopic, 3D, and S-3D can be used interchangeably.

Facebook: Surround 360
— full coverage: 360×180°, 17 Point Grey cameras (1 fisheye lens on top and 2 on bottom)
— global shutter, genlocked
open source stitching software to be released Summer 2016
Facebook-Surround-360

360Heros: 3DPRO12
— full coverage: 360×180°, 12 GoPro cameras
360Heros-3DH3Pro12

360Heros: 3DPRO12H
— full coverage: 360×180°, 12 GoPro cameras
360Heros-3DH3Pro12H

360Heros: 3DPRO14H
— full coverage: 360×180°, 14 GoPro cameras
360Heros-3DH3Pro14H

iZugar: Z6X3D
— full coverage: 360×180°, 6 GoPro cameras with custom 194° fisheye lens
iZugar-Z6X3D

NextVR: Digital Cinema Camera
— coverage: exact FOV unknown, 6 EPIC-M RED Dragon cameras
— global shutter, can be genlocked
NextVR-Digital-Cinema-Camera

HypeVR
— coverage: exact FOV unknown, 14 EPIC-M RED Dragon cameras
— collects LiDAR data with the Velodyne HDL-32E
— global shutter, can be genlocked
HypeVR-Rig

Radiant Images: Codex ActionCam VR 360 Blossom
— full coverage: 360×180°, 17 Codex ActionCam cameras
— global shutter, genlocked?
Radiant-Images-Codex-ActionCam-VR-360-Blossom

360 Designs: EYE
— full coverage: 360×180°, 8 to 42 cameras depending on configuration: Blackmagic Design Micro Cinema Camera (self-contained) or Blackmagic Design Micro Studio Camera 4k (wired operation)
— can be genlocked
360-Designs-EYE

WeMakeVR: Falcon VR Camera
— full coverage: 360×180°, single camera body with 14 sensors
WeMakeVR-Falcon-VR-Camera

Panocam3D: POD
— full coverage: 360×180°, single camera body with 18 sensors
Panocam-HMC


360° Video Rigs: Stereoscopic / Single Camera Body

These cameras capture stereoscopic 360° video through the use of a single camera body with multiple sensors/lenses. Since each behaves as a single unit, fewer problems can arise. As with the multi-camera stereoscopic rigs above, they capture separate left- and right-eye views so a true sense of depth can be achieved in VR.

Jaunt: ONE – 24G & 24R
— full coverage: 360×180°, single camera body with 24 sensors
— 24G: global shutter, 24R: rolling shutter, genlocked
Jaunt-ONE

Nokia: OZO
— full coverage: 360×180°, single camera body with 8 sensors (195° fisheye lens)
— global shutter, genlocked?
Nokia-OZO

Vuze
— full coverage: 360×180°, single camera body with 8 sensors
vuze-3d-360-camera

Samsung: Project Beyond
— full coverage: 360×180°, single camera body with 17 sensors
Samsung-Project-Beyond


360° Video Rigs: Scuba

To take a 360° video rig underwater, you can’t simply put it inside a glass box… the lens optics would be affected by the refraction of the water. So these rigs have built-in compensation and allow you to capture 360° without any problems.

360Heros: 360Abyss
— full coverage: 360×180°, 6 GoPro cameras
— can go underwater up to 1000m or 3280ft, negative/positive/neutral buoyancy (anodized or polycarbonate versions)
overview of the v4 redesign
360Heros-360Abyss

Kolor: Abyss
— full coverage: 360×180°, 6 GoPro cameras
— can go underwater up to 150m or 492ft, (anodized aluminum alloy)
Kolor-Abyss-Rig

360Heros: H3ScubaH6
— full coverage: 360×180°, 6 GoPro cameras
— can go underwater up to 61m or 200ft
360Heros-H3ScubaH6


360° Video Rigs: Scuba Stereoscopic

Taking a 360° video rig underwater is a serious endeavor. And shooting in stereoscopic makes it even more difficult.

Vrtul: AquaTerra
— full coverage: 360×180°, 30 GoPro cameras
— can go underwater up to 40m or 131ft, neutral buoyancy
Vrtul-AquaTerra-Rig-Gen1


360° Video Rigs: Invisible Drone

Attaching a 360° video rig to a drone is easy. But allowing the drone itself to be hidden within the shot is a special trick.

3DR Solo Drone Quadcopter: 360 Mount for the Kodak PIXPRO SP360 4k
— full coverage: 360×180°, 2 PIXPRO SP360 4k cameras
3DR-Solo-Drone-Quadcopter-and-360-Mount-for-the-Kodak-PIXPRO-SP360-4k

Drone Volt: Drone Janus 360
— full coverage: 360×180°, 10 GoPro cameras
DroneVolt_Drone-Janus-360

360Heros: 360 Orb
— full coverage: 360×180°, 12 GoPro cameras
360Heros-360Orb

Spherie
— full coverage: 360×180°, 6 GoPro cameras
Spherie-Drone

Queen B Robotics: Exo360
— full coverage: 360×180°, single camera body with 5 sensors (210° fisheye lens)
Queen-B-Robotics-Exo360


360° Video Rigs: First-Person POV

To tell the story from a first-person point of view, you have to be within the character’s head. These rigs allow you to see through the actor’s eyes and capture their body movements too.

Radiant Images: Mobius POV VR 360
— full coverage: 360×180°, 17 GoPro cameras
Radiant-Images-Mobius-POV-VR-360

Panocam3D: HMC
— full coverage: 360×180°, single camera body with 24 sensors
— stereoscopic
Panocam3D-HMC


Partial 360° Video Rigs

These rigs are definitely thought of as 360° video because they capture the entire sky and horizon, but the ground (often just the tripod) isn’t captured.

Freedom360: F360 Broadcaster
— partial coverage: 360×140°, 6 GoPro cameras
Freedom360-F360-Broadcaster

360Heros: PRO6L
— partial coverage: 360×120°, 6 GoPro cameras
360Heros-H3Pro6N

Totavision: Fulldome Camera
— partial coverage: 360×110°, 11 Toshiba IK-HD1 cameras
— cameras are placed around a virtual center. This arrangement allows parallax-free image stitching at distances greater than ~3 meters.
Totavision-Fulldome-Camera

Sphericam 1
— partial coverage: 360×138°, single camera body with 4 sensors (170° fisheye lens)
Sphericam-1

Immersive Media: Hex
— partial coverage: 360×144°, single camera body with 6 sensors
— 15fps at full resolution / 25fps at half resolution
Immersive-Media-Hex

Giroptic 360 Cam
— partial coverage: 360×150°, single camera body with 3 sensors (185° fisheye lens)
Giroptic-360-Cam

PanoptikonVR
— coverage: exact FOV unknown, 14 GoPro cameras
— stereoscopic
PanoptikonVR


Cylindrical Video Rigs

These rigs only capture the horizon, so the sky and the ground are not captured. (But there are some tricks to fill in these empty areas, such as heavily blurring some of the footage and stretching it into this zone, or taking a still photo prior to the shoot and patching it in later.)

360Heros: H3Pro7HD
— partial coverage: 360×120°, 7 GoPro cameras
360Heros-H3Pro7HD

Immersive Media: Quattro
— partial coverage: exact FOV unknown, single camera body with 4 sensors
— 15fps max
Immersive-Media-Quattro

Fraunhofer HHI: OmniCam-360
— partial coverage: 360×60°, 10 Micro HD cameras
— global shutter, genlocked?
— cameras are placed around a virtual center. This arrangement allows parallax-free image stitching at distances greater than 1 meter.
Fraunhofer-HHI-OmniCam360

Totavision: Cylindrical Camera
— partial coverage: 360×37°, 8 Toshiba IK-HD1 cameras
— cameras are placed around a virtual center. This arrangement allows parallax-free image stitching at distances greater than ~3 meters.
Totavision-Cylindrical-Camera
Palace of Versailles footage / making-of


Cylindrical Video Rigs: Stereoscopic

These rigs only capture the horizon in stereo, so the sky and the ground are not captured. This approach makes the stereo challenges much more manageable.

GoPro: Odyssey / Google Jump
— partial coverage: 360×120°, 16 GoPro cameras
— genlocked
— custom stitching service in the Google cloud: The Assembler
GoPro-Odyssey_Google-Jump

Fraunhofer HHI: 3D OmniCam-360
— partial coverage: 360×60°, 20 Micro HD cameras
— global shutter, genlocked?
— cameras are placed around a virtual center. This arrangement allows parallax-free image stitching at distances greater than 2 meters.
Fraunhofer-HHI-3D-OmniCam360


Cylindrical Video Rigs: Parabolic Mirror

This technique has been around for a while. Basically, one camera is precisely aimed at a specially crafted parabolic mirror, and the mirror warps the whole horizon into the camera lens. But the sky and the ground are not captured. Since it only uses one camera, your end resolution is limited… but you don’t have to do any stitching. (Other major problems include: the mirror being a dust magnet, mirror surface quality, irregular warping of the image, image sharpness, and flares.)
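
The footage comes off the camera as a donut-shaped image, which then gets unwrapped into a cylindrical strip. Here’s a sketch of the basic polar unwrap, assuming you’ve measured the donut’s center and inner/outer radii (a hypothetical function; real mirror profiles need a calibrated radial mapping, and you may need to flip the result depending on the mirror geometry):

```python
import numpy as np

def unwrap_donut(img, cx, cy, r_inner, r_outer, out_w):
    """Unwrap the mirror's donut-shaped image into a cylindrical strip."""
    out_h = r_outer - r_inner
    ang = (np.arange(out_w) / out_w) * 2 * np.pi  # one column per angle
    rad = r_inner + np.arange(out_h)              # one row per radius step
    ang, rad = np.meshgrid(ang, rad)

    # Sample the source image along concentric circles (nearest neighbor).
    u = np.clip(cx + rad * np.cos(ang), 0, img.shape[1] - 1).astype(int)
    v = np.clip(cy + rad * np.sin(ang), 0, img.shape[0] - 1).astype(int)
    return img[v, u][::-1]  # flip so the horizon ends up the right way
```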

VSN Mobil: V.360
— partial coverage: 360×60°, single camera body with 1 sensor
VSN-Mobil-V360

Pano Pro MKII
— partial coverage: 360×120°, lens for a DSLR
Pano-Pro-MKII

0-360 Panoramic Optic
— partial coverage: 360×115°, lens for a DSLR
0-360-Panoramic-Optic

Eye Mirror
— partial coverage: exact FOV unknown, lens for a DSLR
Eye-Mirror

GoPano: Plus
— partial coverage: 360×100°, lens for a DSLR
GoPano-Plus

Eye Mirror: Wet Lens
— partial coverage: exact FOV unknown, lens for a DSLR
— can go underwater (depth rating unknown)
eyemirror-wet-lens

ActionCam360
— partial coverage: 360×90°, attachment for GoPro housing
— all weather
ActionCam360

Eye Mirror: GP 360
— partial coverage: exact FOV unknown, attachment for GoPro housing
— can go underwater up to 50m or 165ft
eye-mirror-gp360

Kogeto: Joey
— partial coverage: exact FOV unknown, single camera body with 1 sensor
Kogeto-Jo

Kogeto: Lucy
— partial coverage: 360×100°, single camera body with 1 sensor
in-depth experimentation
Kogeto-Lucy

Kogeto: Dot
— partial coverage: 360×63°, attachment for iPhone
Kogeto-Dot

BubbleScope
— partial coverage: 360×62°, attachment for iPhone
BubbleScope

GoPano: Micro
— partial coverage: 360×82°, attachment for iPhone
GoPano-micro

Remote Reality: Hummingbird360
— partial coverage: 360×70°, attachment for PointGrey Flea3 or Grasshopper
Remote-Reality-Hummingbird360


360° Light Field Rigs

A 360° light field camera enables virtual views to be generated from any point, facing any direction, with any field of view. This means you can experience six degrees of freedom (6DoF): you can actually lean into the shot and change your perspective. It is the holy grail of VR.

Lytro: Immerge
— coverage: exact FOV unknown, dense light field camera array
presentation by Jon Karafin (Head of Light Field Video for Lytro)
Lytro-Immerge

Interesting Research
OTOY 360 light field experiment
Axial-Cones: Modeling Spherical Catadioptric Cameras for Wide-Angle Light Field Rendering


360° Photography Rigs

There are a bunch of techniques to capture a 360° photo. But here are some photography rigs which automate or simplify the process.

Panono
— full coverage: 360×180°, single camera body with 36 sensors
Panono

NCTech: iris360
— partial coverage: 360×137.5°, single camera body with 4 sensors (with fisheye lens)
NCTech-iris360

PanoHero
— full coverage: 360×180°, 1 GoPro camera
— stereo version available
PanoHero

Squito
— coverage: exact FOV unknown, single camera body with 3 sensors
Squito

GigaPan: Epic
— motorized drive which automatically captures multi-gigapixel panoramas
GigaPan-Epic

Roundshot: VR Drive
— motorized drive which automatically captures multi-gigapixel panoramas
Roundshot-VR-Drive

BubblePod
— motorized turntable for smartphones, optional clip-on 120° lens
BubblePod

Spinner 360° / Motorizer
— 35mm still film camera for slit-scan photography
Spinner-360

Lomography: Fisheye One / Fisheye Submarine
— 35mm still film camera with built-in 170° fisheye lens (cropped fisheye image)
— can go underwater up to 20m or 65ft
lomography-fisheye-one-camera

Fisheye lens attachments for iPhone/Android
— photo mode: partially cropped fisheye image
— video mode: mostly cropped fisheye image (video crop amount differs between iPhone & Android)
— lenses such as: Beastgrip, CamKix, iPro, iZZi Gadgets, Mobi-Lens, Olloclip, Optrix, Photojojo, Ztylus

And then there’s this!
Guinness-2003-Worlds-Most-Useless-Invention


Fisheye Video Rigs

Shooting with a fisheye lens means that you’re capturing at least 180°, projected onto the camera sensor in a circular format. Fisheye footage can be projected directly into a dome and be instantly immersive. But it can also be easily converted into the equirectangular (spherical) format, as sketched below.
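
To give a feel for that conversion, here’s a sketch of the remap, assuming an ideal equidistant 180° fisheye centered in the source frame. (Real stitching tools account for the lens’s actual distortion profile; this function and its parameters are just illustrative.)

```python
import numpy as np

def fisheye_to_equirect(fish, out_w, fov_deg=180.0):
    """Remap a circular equidistant fisheye image to equirectangular."""
    src_h, src_w = fish.shape[:2]
    cx, cy, radius = src_w / 2, src_h / 2, min(src_w, src_h) / 2
    out_h = out_w // 2

    # Longitude/latitude of every output pixel.
    lon = (np.arange(out_w) / out_w - 0.5) * 2 * np.pi
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Direction vectors; the lens looks down the +Z axis.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    theta = np.arccos(np.clip(z, -1, 1))          # angle off the lens axis
    phi = np.arctan2(y, x)                        # angle around the axis
    r = theta / np.radians(fov_deg / 2) * radius  # equidistant lens model

    u = np.clip(cx + r * np.cos(phi), 0, src_w - 1).astype(int)
    v = np.clip(cy - r * np.sin(phi), 0, src_h - 1).astype(int)

    out = fish[v, u]                              # nearest-neighbor sample
    out[theta > np.radians(fov_deg / 2)] = 0      # black outside coverage
    return out
```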

Kodak: PIXPRO SP360 / PIXPRO SP360 4k
— PIXPRO SP360 has built-in 214° fisheye lens
— PIXPRO SP360 4k has built-in 235° fisheye lens
— can go underwater up to 60m or 197ft (with case accessory)
kodak-pixpro-sp360-4k-camera

360Fly HD / 360Fly 4k
— camera with built-in 240° fisheye lens
— 360Fly HD can go underwater up to 50m or 165ft
— 360Fly 4k can go underwater up to 10m or 33ft
360Fly-4k

Entaniya: Fisheye Lenses
— custom lenses for GoPro cameras with the Back-Bone Ribcage modification
— lenses available at 220°, 250°, 280°
Entaniya-Fisheye-Lenses-220-250-280

Dome3D: GP185
— GoPro camera with custom 185° fisheye lens
Dome3D-GP185

Tamaggo ibi
— camera with built-in 240° fisheye lens
Tamaggo-camera

Entaniya: Entapano C-01
— camera with built-in 183° fisheye lens
Entaniya-Entapano-C-01

Entaniya: Entapano2
— camera with built-in 250° fisheye lens
Entaniya-Entapano2

Digital Cinema Camera Options
— RED Scarlet with fisheye lens experiments: Paul Bourke & Home Run Pictures
— below is a slide from the presentation: “Seeking the Ideal Fulldome Camera” by Jim Arthurs – (IMERSA Summit 2013)
IMERSA2013-SeekingTheIdealFulldomeCamera-JimArthurs

Cheap but Unproven Fisheye Cameras
Eken H8 – 170° fisheye lens
AMKOV AMK-100S – 220° fisheye lens / can go underwater up to 30m or 100ft
DETU Sphere 800 – 236° fisheye lens
CUBE360 GVT100M – 190° fisheye lens / max 28fps
Sunchip Panorama XDV360 – 220° fisheye lens
HDKing V1 Pro – 220° fisheye lens
X360 camera – 190° fisheye lens


Fisheye Video Rigs: Stereoscopic

Shooting with a fisheye lens means that you’re capturing at least 180° and it’s being projected onto the camera sensor in a circular format. But if you shoot with fisheye lenses that are higher than 180°, then you will see the lens itself within the edges of the shots. There are tricks to deal with this, but it’s an interesting challenge.

IX Image: Omnipolar Camera Rig
— 3 cameras with fisheye lens
custom stitching solution to enable stereo stitching without pole region issues
— potential to create 360° video (3 cams facing up, 3 cams facing down)
IX-Image-Omnipolar-Camera-Rig

Lucid Cam
— single camera body with 2 sensors (180° fisheye lens)
prototype adapter allows for 360° stereoscopic video
Lucid-Cam

Tutorial
Building a 3D Camera: Wide-Angle Stereoscopic Video for Cinematic VR


360° Video Rigs: Less than 30fps

For VR and domes, a capture rate of at least 30 frames per second is an absolute requirement. Anything less is simply too jarring an experience for the viewer.

ALLie
— full coverage: 360×180°, single camera body with 2 sensors (188° fisheye lens)
— 20fps
allie_camera

Ricoh: Theta m15
— full coverage: 360×180°, single camera body with 2 sensors (180° fisheye lens)
— 15fps, 5 minutes max of video recording
teardown of camera
Ricoh-Theta-m15

Point Grey: Ladybug5
— partial coverage: 360×162°, single camera body with 6 sensors
— 5fps uncompressed / 10fps compressed
— global shutter, genlocked?
in-depth experimentation
Point-Grey-Ladybug5

Point Grey: Ladybug3
— partial coverage: 360×144°, single camera body with 6 sensors
— 6.5fps uncompressed / 16fps compressed
— global shutter, genlocked?
Point-Grey-Ladybug3

Point Grey: Ladybug2
— partial coverage: 360×135°, single camera body with 6 sensors
— 15fps uncompressed / 30fps compressed
— global shutter, genlocked?
Point-Grey-Ladybug2

VideoPanoramas
— partial coverage: 360×162°, single camera body with 3 sensors
— 10 FPS at 5MP / 15 FPS at 3MP / 30 FPS at 1.3MP
VideoPanoramas


360° Video Rigs: Wild and Unique

In my research I’ve stumbled across many unique camera systems. Many of them are fascinating but are either no longer in use, defy categorization, are currently prototypes, or were perhaps singular creations.

Immersive Media: Dodeca 2360 / more info
The original cube from Freedom360
Elphel Eyesis4Pi
Canon Vixia / 24 camera rig
Overview One
Panoptic Camera
FullView Camera Rig
Intel Realsense Drone
The Mill – Custom Red Dragon Camera Rig
ADAPA Trino
FascinatE’s Omnicam ARRI Alexa M Rig
IC720
Social Animal: SA9
Sensocto
IPIX Media360
Occam Omni Stereo
Panasonic Dive
Pentax Prototype
Live Capture Using Panoramic Photography with One Camera
6 GoPro Cameras with six Entaniya fisheye lenses
360 video using five Canon M cameras
Calibration of Omnidirectional Cameras in Practice
TENGO2VR Q-1 Mark II
GoPro Cylindrical Helmet Rig
360 Video Helmet Rig
Homemade Helmet Rig
Pi Of Sauron – 3D Printed, Raspberry Pi 360 Video Rig
Making VR Video with the Kodak PIXPRO SP360
Spherecam
Mobius 360
GoPro Session Rig
ADAPA: Nimbus VR
ADAPA: Pulsar VR 360 video camera rig / video
Aposematic Jacket
Polar Effect: Philon 360 stabilized camera rig (working prototype) / video
Nodal Ninja Multi-Cam Pano bracket system
3D printed Xiaomi Yi Rig
GoPro Session: 3D printed rig
Shooting 360° Video in 48K Using 12 Sony Xperia Z5 Smartphones
Nikon Multi-Ball (prototype)
Custom 3D 360 Rig using 13 Xiaomi Cameras
360 rig using six Flare 4KSDI cameras
Tripletcam
ALLie Go
Genlocked GoPro Rig – tested with fast moving footage
10 Camera Canon Vixia / RC car rig
GoPro Omni announcement
Ricoh Theta S and The Beholder gimbal rig / example footage
Handmade 360 rig using SJ400 cameras (x13)
Ammergauer Alpen: 360 video drone / example footage
Brahma 360
Genlocked 7-cam rig mounted on the Mantis by Motion Impossible
Fisheye lens rig
Quantum Leap Pro
Shuoying PDV3600
GO6D: nine new 360 cameras planned
Voke VR camera
DETU F4
Insta360 Nano – 360 camera attachment
Handmade 360VR 2D/3D Rig
Condition One Reveal ‘Bison’ Cinematic VR Camera
The Open 360 Camera Hardware Repository
Ricoh WG-M1: 360 rig
Prototype Back-Bone GoPro with sensor & fisheye lens attached via ribbon cable extension
Prototype iZugar GoPro rig with sensors & fisheye lens attached via ribbon cable extensions
Using the iPhone front and rear cameras to capture 360 video (with fisheye lenses) / example footage
Yezz announces Sfera smartphone with a 360-degree camera
Android phone owners can record 360-degree VR video with NeoEye
Pictures Fabryc: Alta8 VR 360 Drone with stabilization
Visual Effects Society: VR Post Production (includes camera discussions) – p1, p2, p3


360° Video Rigs: DIY 3D Printing

So 360° video isn’t hard enough for you? You want to 3D print your own rig too?

Purple Pill VR – 3D models for 360 mono, 360 stereo, Google Jump
Thingiverse: Cylindrical Stereo Mobius Rig /// in-depth experimentation
Thingiverse: Collection of 360 video rigs
Thingiverse: search for 360 video rigs
Shapeways: search for 360 video rigs
SJ4000 360 Rig
IncreDesigns
Cardboard 360 video rig

Custom Mounting Base for the Kodak PIXPRO SP360 4k rig
Stereoscopic 360 Mounting Base
Dual mounting base & bracket (includes USB & HDMI access port)
Dual mounting base (includes USB & HDMI access port, tripod mount)
SP360 to GoPro mount converter


Unsuccessful Kickstarter Projects

Sadly, these Kickstarter projects didn’t reach their funding goals. But they are unique and deserve to be recognized.

Blocks Camera
— modular camera rig with 4 sensors
Blocks-Camera

Centr Cam
— partial coverage: 360×56°, single camera body with 4 sensors
Centr-Cam

Shot
— iPhone attachment with dual 235° fisheye lenses
Shot-iPhone-attachment

Occube
— full coverage: 360×180°, 6 GoPro cameras
Occube

Lensbaby Circular 180+ for GoPro camera
— GoPro housing attachment with 185° fisheye lens
— plug and play (no camera modding required)
Lensbaby-Circular-180-for-GoPro-Camera


History of 360° Film

The concept of capturing a huge panoramic perspective isn’t a new one. There have been some fascinating projects early in film history. And only now is that dream being fully realized.

Early Cylindrical Film History
Cinéorama (1900 Paris Exposition)
Lumière Photorama (1902)
Circarama (Disneyland)

Early 360° Video
Page of Omnidirectional Vision
Dynamic Surround Video
Shooting 360-degree video with four GoPro HD Hero cameras
Canon 5D Mark II fisheye rig

360Heros-HungryShark
Photo Source

Blueprint to Blastoff: Free Engineering Materials for the Planetarium or Classroom

Talia-Bio-Photo
We are offering 3 distinct educational modules, focusing on aspects of spacecraft engineering, to anyone with a planetarium or classroom who would like to use them. They supplement, but are independent of, our newest show From Dream to Discovery: Inside NASA and are being shared free of charge.

This article was written by Talia Sepersky. She currently works as a planetarium educator at the Charles Hayden Planetarium, Museum of Science, Boston.

SECTIONS
Intro: Putting the “E” back in “STEM”
Module 1: Fixing the Hubble Space Telescope
Module 2: Gravity And Space Travel
Module 3: Design a Mission
The Guides
4k Downloads
Teacher Bundles


Intro: Putting the “E” back in “STEM”

When it comes to STEM, planetarium shows tend to be very good at covering the science, technology, and even the math portions, but engineering often gets left out. To help fill this void, in 2013 we, the staff of the Charles Hayden Planetarium at the Museum of Science, Boston, teamed up with NASA to make a planetarium show about spacecraft engineering. The result of this partnership is the show “From Dream to Discovery: Inside NASA,” which explores what it takes to design, test, build, and fly a successful space mission.

As much as we would have liked to, we could not talk in detail about every part of spacecraft engineering during the show. However, through the partnership with NASA, we were able to expand on a few engineering topics from the show in three separate, supplementary education modules. We are extremely pleased to be able to offer these modules to anyone who wants to use them completely free of charge.

The three modules have very different lengths, styles, and topics, and are designed to be presented in different ways. They can be used on a planetarium dome, and a flatscreen version permits their use on a conventional screen as well. Although each goes into depth on topics that are raised in “From Dream to Discovery: Inside NASA,” they all stand on their own and require no knowledge of the show itself. The three modules are: “Fixing the Hubble Space Telescope”, “Gravity and Space Travel”, and “Design a Mission”.


Module 1: Fixing the Hubble Space Telescope

We’ve found that many people in our audiences know that there was something wrong with Hubble when it launched, and that it was eventually fixed. However, few people tend to be aware of the details. The first of our modules, “Fixing the Hubble Space Telescope,” goes into some of those details. It’s the most straightforward of the three modules, consisting of a single video approximately eight minutes long. Large portions of the narration are undertaken by Dr. Jeffrey Hoffman, a former astronaut who flew on the first Hubble servicing mission.

With this module we wanted to focus on a specific case of spacecraft engineering, and Hubble Servicing Mission 1 provides a fantastic real life example. We also wanted to bring in the idea that failures can be instructive.

This module starts by introducing Hubble in space, and then describing how astronomers realized the telescope had a flaw, using some of Hubble’s earliest observations to make the point. It then takes Hubble apart to show the primary mirror and allow Dr. Hoffman to describe exactly what went wrong with making it.

While still looking at a cutaway view of Hubble, Dr. Hoffman goes on to explain the “fix” designed by engineers to repair Hubble, describing the arrangement of mirrors that allowed light entering Hubble’s tube to be refocused before landing on the detection instruments. While he is providing the narration, the visuals show this in action, following a light path all the way through Hubble to the instruments.

The module then moves on to the installation of the new optics on Hubble, with Dr. Hoffman talking about the work on the shuttle mission. This is accompanied by visuals of Hubble and the space shuttle in space, as well as actual video clips from the mission. In one of our favorite parts of this module, Dr. Hoffman shares his story of receiving the phone call that let him know the fix had worked, as well as some thoughts on what it felt like to actually touch Hubble. Some of the visuals for this portion include Hubble images, comparing pictures of the same objects before and after the repair.

The module concludes with the idea that we can learn from failures like Hubble’s. To quote Dr. Hoffman at the module’s end, “The important thing, though, is if you do have a failure, you really need to be able to learn from it. To have a failure that you don’t learn anything from, that’s tragic.”


Module 2: Gravity And Space Travel

It turns out that describing what goes on during a gravity assist can be tricky business. This module introduces some of the mechanics of the momentum transfer that happens during a gravity assist maneuver through Earth-based and space-based examples, as well as describing some of the various ways gravity assists can be used in a space mission.

Since gravity assists can be a tough subject to teach and the depth a presenter goes into will vary widely with different audiences, we designed this module to be as flexible as possible. It is broken up into five segments, each about 1-2 minutes in length (for a total of about 7 minutes of video). Each segment can be presented independently of the others if the presenter only wants to use some but not all. They can also follow after each other, with each segment building on the one before.

We created this format with the idea of using live interpretations in between each of the segments, to reiterate or emphasize the content covered in the previous segment and set up for the next one. However—maximum flexibility!—they can also be strung together to create one unbroken video, depending on the presenter’s preferred style. The core ideas behind momentum transfer and gravity assists are presented in segments 2 and 3, so our recommendation is that at least these two be used.

Segment 1 is relatively straightforward. It starts with the idea that spacecraft travel is often not as easy as pointing the spacecraft at its destination and giving it a push. It introduces the terms “gravity assist” and “momentum transfer” and also defines the word “momentum.”

Segment 2’s purpose is to help the audience gain a better understanding of the transfer of momentum using an Earth-based example. To this end, we enlisted the help of a local roller derby team. We wanted to emphasize the idea that gravity assists work not just because the planets are large (i.e. have a lot of gravity) but because they are also moving (i.e. have a lot of momentum).

For this, we had one skater (designated Skater One) hold still and whip her teammates around her as they approach. While her teammates’ paths change, their speed remains more or less the same. We then recreated the same scenario with Skater One also in motion. This time, when she whips her teammates around, their speed increases noticeably even as Skater One’s decreases, due to the momentum transfer between them.

Segment 3 builds on the Earth-based example with a space-based one, specifically the New Horizons gravity assist flyby of Jupiter in February 2007. It starts by looking at what would have happened if New Horizons had gone directly from Earth to Pluto, then looks at the Jupiter flyby. The visuals show an overhead view of New Horizons approaching Jupiter and then visibly increasing its speed as it flies past. This segment uses some actual numbers to get across how much momentum Jupiter has to spare and to emphasize the fact that the planet is, for all practical purposes, not actually affected by losing some. It ends by describing the changes in New Horizons’ speed and flight time as a result of the flyby.
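
For anyone curious about the underlying numbers: in the idealized head-on case, a gravity assist works like an elastic bounce off a massive moving object, so the spacecraft departs with its incoming speed plus twice the planet’s orbital speed. A sketch with purely illustrative values (not the actual New Horizons figures):

```python
# Idealized 1D "slingshot" limit: in the planet's frame the spacecraft's
# speed is unchanged, so in the Sun's frame it leaves with its incoming
# speed plus twice the planet's speed. Real flybys gain less, depending
# on the approach geometry. Illustrative numbers only.

v_in = 19.0      # km/s, spacecraft speed toward the planet (Sun frame)
v_planet = 13.0  # km/s, planet's orbital speed (Sun frame)

v_out = v_in + 2 * v_planet
print(f"departs at {v_out} km/s")  # 45.0 km/s in this ideal limit
```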

Since Segment 3 presents how a gravity assist can be used to speed a spacecraft up, Segment 4 explores how one can be used to slow a spacecraft down. It shows how the angle at which a spacecraft approaches a planet determines whether the planet transfers momentum to the spacecraft (to speed the spacecraft up) or the spacecraft transfers momentum to the planet (to slow the spacecraft down). It also re-emphasizes the idea that, no matter what the spacecraft does, it will have no practical effect on the planet.

The final segment, Segment 5, brings up the use of multiple gravity assists in a single mission, requiring careful planning many years in advance. To conclude, it loops back to the idea raised in Segment 1 that many space missions are only possible with the use of gravity assists (showing some of the rather convoluted paths these missions took), and that by making clever use of them we have vastly expanded our knowledge of the Solar System.


Module 3: Design a Mission

The “Design a Mission” module is the most interactive of the three and requires a live presentation. In this activity the audience, using information provided to them by the presenter, designs a spacecraft to search for signs of water in the Solar System. They have to choose a destination and then, based on that destination, a power source and whether their spacecraft will be a lander or orbiter. If they design their spacecraft well to suit their destination, the mission will succeed. If they do not, the mission will fail (and how it fails depends on the spacecraft design).

The module itself is made up of thirteen video clips to incorporate all the possible outcomes of the audience’s decisions. In total, the video clips make up about 35 minutes of footage, but a presenter should only need a fraction of that during any given presentation.

The first clip represents the audience’s first decision: will their spacecraft travel to Mars or Saturn in search of evidence of water? The visual for this clip is fairly basic, with images of both of those planets on the screen.

Once they’ve chosen the destination, the second clip represents the audience’s next decision: will the spacecraft be an orbiter or a lander? The presenter may want to provide the audience with some of the benefits and disadvantages of each, or ask the audience to come up with some on their own. The visual is of the two different styles of spacecraft. The “lander” option is based roughly on Cassini with a Huygens-style lander attached to its side.

The third decision is whether to make the spacecraft solar or nuclear-powered, and there are two clips that can potentially be used depending on whether the audience chose an orbiter or a lander. If they chose “lander,” the corresponding clip shows two versions of the lander-style spacecraft, one with solar panels and one without (the nuclear reactor is visible on the bottom edge of the nuclear-powered spacecraft, but is small and not immediately obvious like the solar panels). If they chose “orbiter” the visual is the same, with the orbiter-style spacecraft instead. Again, the presenter may want to make sure the audience knows the benefits and drawbacks of each choice.


Now that they have designed their spacecraft, it’s time to send it to the chosen planet and see if it succeeds. There are eight different clips to represent the eight possible outcomes of the audience’s choices. All start with a liftoff from Earth and a view of the spacecraft moving towards its destination. What happens once it starts moving depends on how well the spacecraft was designed.

The four Mars scenarios (nuclear orbiter, nuclear lander, solar orbiter, and solar lander) all succeed. The two lander scenarios make use of the landing sequence of the Curiosity rover for visuals. The landers will find evidence for water in the form of “blueberries,” frost, and silica deposits. The orbiters will find evidence of water from seeing river channels, hydrogen deposits, and rampart craters.




It’s much harder to succeed at Saturn, and only one scenario, the nuclear-powered orbiter, will lead to success. If the audience chose a solar-powered spacecraft, then as it moves through space towards Saturn the picture will turn to static to represent the spacecraft losing power and shutting down. If they chose a nuclear-powered lander, they will see a rather stunning sequence of their lander entering the atmosphere, heating up, and exploding. If they chose a nuclear-powered orbiter, they will find evidence of water in the geysers on Enceladus and in Saturn’s E Ring.




Since not all of the mission designs succeed, the presenter may wish to talk about failure in spacecraft engineering. To this end, we wanted to show audiences that the professionals also sometimes don’t get it right. The final clip shows images from four real life failed missions from different countries, specifically the Vanguard rocket, the Mars Climate Orbiter, the Phobos-Grunt mission, and the Akatsuki mission. As with the end of the “Fixing Hubble” module, the idea is to emphasize that failures happen, and that the important thing is to learn from them when they do.


The Guides

Between them, these three modules present a lot of information, some of it very specific. To make them as easy as possible for a large variety of institutions to use, we’ve also created planetarian guides to go with each. Our hope is that a presenter with no background in any of these three topics can make an effective presentation on any or all of them using just the material found in the corresponding planetarian guide. In addition to the script for the module, a set of FAQs, and a glossary, each guide contains copious background information as well as some suggestions for presentation.

The “Fixing Hubble” guide includes a layout of Hubble’s optics, even more detail about the flaw and how it was fixed, a brief breakdown of each of NASA’s Hubble servicing missions, and a list of Hubble specifications.

The “Gravity and Space Travel” guide goes into greater detail about the mechanics of gravity assists, how momentum is transferred, and why the spacecraft’s trajectory changes. It also looks at the usefulness of gravity assists on specific missions and provides a list of missions that have made notable uses of gravity assists. In the script section, it provides some guidelines for live interpretation in between the video segments as well as instructions on how to recreate the roller skater demo from Segment 2 in house, using either staff or audience members.

The “Design a Mission” guide includes specific descriptions of each of the visuals in the clips and what they are designed to represent. There is an outline for the progression of the module, with some guidelines for discussion, background information on the pros and cons of landers, orbiters, solar power, and nuclear power, and a description of why each mission succeeds or fails. There is also a list of all of the video clips included with this module.

Separate from the planetarian guides, there is a set of educator guides for teachers using the modules in a standard classroom setting. The educator guides are geared more towards using the modules as part of a lesson in a school environment rather than a presentation in a planetarium show, and the information they include is not as detailed as that in the planetarian guides. There are also educator guides for topics not included in the modules, including “Waves and Information Transfer” and “Infrared Astronomy,” which also expand a bit on topics raised in the show “From Dream to Discovery.”


4k Downloads

To ensure that many different institutions, classrooms, and other settings can make use of our modules, we are offering them in a variety of formats. The modules are all available in 1K, 2K, and 4K fulldome versions for planetarium domes. There are also flat versions available for use in standard classrooms or for anyone using a flatscreen projector (complete with captions).

4k domemaster downloads are available on the ESO Fulldome Archive.


Teacher Bundles

The Teacher Bundles for “Fixing Hubble” and “Gravity Assist” include the flatscreen captioned versions of the modules as well as the educator guides. The classroom version of “Design a Mission” is web-based, so the Teacher Bundle for that module includes the educator guide and link to the web-based activity. The modules page also includes a Teacher Bundle with the “Waves and Information Transfer” and “Infrared Astronomy” educator guides.


Copyright 2015 International Planetarium Society; article used with permission.

This material is based upon work supported by NASA under grant number NNX12AL19G. Any opinions, findings, and conclusions or recommendations expressed are those of the Museum of Science, Boston and do not necessarily reflect the views of the National Aeronautics and Space Administration (NASA).

DJ Spooky: The Hidden Code – Performing in the Dome

My love for live music in the dome is undeniable. The idea is simple but powerful: Allow the synthesis of live performance and astronomy visuals to create a uniquely awe-inspiring experience.

Having thrown a series of live music events, each with its own custom dome visuals, we now have a collection of 4k dome material. So when DJ Spooky approached us with the idea of partnering to create a live fulldome show, it felt like a natural match. And the premiere of the show is just a few weeks away.

DJ Spooky: The Hidden Code — tickets
Performing at the Charles Hayden Planetarium, Museum of Science
Thursday, September 24, 2015 — 7:00–9:00 pm

DJSpookyLive_TheHiddenCode


Now Booking Dome Performances

Things get interesting for you after the premiere of the album and fulldome show: DJ Spooky is looking for domes to perform in! For live bookings please contact Sozo.

We have also created a canned version of the show which is meant to compete with evening laser shows. If you’re interested in licensing the show, then please contact me.

4k Fulldome Show Options / fulldome trailer
— Live performance (visuals split by song)
— Canned show (45 minute show)

Flat Theater Options [16:9 ratio] / flat trailer
— Live performance (visuals split by song)
— Canned show (45 minute show)


More Info About The Hidden Code Album

Imagine a visual odyssey through the cosmos, driven by lush musical compositions and inspired by complex themes of astronomy, engineering, biology, and psychology. The Hidden Code is the newest work by Paul D. Miller, aka DJ Spooky. Commissioned by Dartmouth College’s Neukom Institute for Computational Science, Miller composed the album based on conversations with several of Dartmouth’s leading researchers.

The album features Dartmouth theoretical physicist and saxophonist Stephon Alexander, and Dartmouth physicist and author Marcelo Gleiser, who reads his original poetry.

Check out the free online streaming of The Hidden Code album.

Savor this synthesis of emerging science, poetry, and melody with immersive visions overhead as The Hidden Code pushes art into science. Science into music. Music into art.

Paul D. Miller, aka DJ Spooky, is a composer, music producer, performer, multimedia artist and National Geographic Emerging Explorer. He has collaborated with an array of recording artists, from Metallica and Chuck D to Steve Reich and Yoko Ono. He is the author of Imaginary App, Rhythm Science, Sound Unbound and Book of Ice.

Stephon Alexander is a theoretical physicist, tenor saxophonist and recording artist. He specializes in cosmology, particle physics and quantum gravity. He is the Ernest Everett Just 1907 Professor of Natural Sciences at Dartmouth and a National Geographic Emerging Explorer.

Marcelo Gleiser is a theoretical physicist specializing in particle cosmology. He is the author of The Island of Knowledge, A Tear at the Edge of Creation, The Prophet and the Astronomer, and The Dancing Universe. He is the founder of NPR’s blog 13.7 on science and culture. He is the Appleton Professor of Natural Philosophy and Professor of Physics and Astronomy at Dartmouth.

TheHiddenCode_ShowPoster


Press

NPR – The Hidden Code: An Embrace Of Art And Science
Sound of Boston – Interview
Dartmouth – DJ Spooky Album Explores Universe With Dartmouth Scientists