In recent years, the 3D industry has gone from offering very little support for fisheye rendering to a point where every modern render engine fully embraces it. I think we can thank the rise of VR for shining a light on immersive production.
We recently had a chance to upgrade the render engine that we use in Maya for producing our fulldome shows. We were also able to add some computers to our old 10-node render farm. Many thanks to the Charles Hayden Foundation for the grant that made this vital tech leap possible.
The recent introduction of GPU render engines has made the landscape a bit more convoluted, since they now have to be weighed against the established CPU render engines. And each render engine is unique and excels at different things, which makes comparing them tricky work.
And so I thought it would be useful to hear from the fulldome community…
We recently traveled to NASA’s Kennedy Space Center to gather reference imagery for our planetarium show production about human spaceflight to Mars. What an epic trip!
It was surreal to walk inside of the Vehicle Assembly Building and travel up to the 37th floor of this incredibly huge single-story building. Currently the VAB is being retrofitted so that the SLS rocket can be vertically assembled within. This retrofit includes special motorized platforms which can be moved into place according to the needs of the project at hand.
During the 2017 Spring semester at the Massachusetts College of Art and Design, students explored the topics of memory, transformation, and the outsider. In less than 5 months these students collaborated on all aspects of storytelling, concept development, video shoots, surround sound design, and 4k fulldome production.
In January 2016, Allan Adams and Keith Ellenbogen took a group of MIT students scuba diving in Belize as part of a college course on underwater conservation photography. Coral reefs worldwide are deteriorating due to changes in our climate, so it’s important to document both the beauty of our oceans and what’s happening to them. Capturing this moment in time is important for future generations to learn from, be immersed in, and be inspired by.
Lately I’ve been shooting with the Kodak PIXPRO SP360 4k camera for both VR and the planetarium dome. Shooting with dual cameras is great for capturing a full 360°, but shooting with a single camera is sometimes easier since the footage doesn’t need to be stitched. A single camera also captures 235°, which is a surprisingly huge FOV. But it’s necessary to warp the footage from fisheye to spherical so that it can be experienced in VR or YouTube 360.
The PIXPRO SP360 4k software is actually capable of warping a single camera’s footage from fisheye to spherical. But it’s not intuitive (here is a tutorial), and the Kodak software can only export footage to MP4… Seeing as the raw camera footage is already a heavily compressed MP4, I wasn’t thrilled about this added lossy step. So I figured out a simple technique.
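The core of the warp is standard fisheye-to-equirectangular math. As a rough illustration (not the exact tools from my pipeline), here’s a minimal Python sketch using OpenCV and NumPy. It assumes an equidistant fisheye projection, a centered image circle, and a 235° FOV; the file names are placeholders:

```python
# fisheye_to_equirect.py -- minimal sketch of a fisheye -> equirectangular warp.
# Assumes an equidistant fisheye projection with a centered image circle and a
# 235-degree FOV (the SP360 4k spec). Library choice and file names are
# illustrative, not the exact pipeline described above.
import cv2
import numpy as np

FOV_DEG = 235.0                               # SP360 4k field of view
THETA_MAX = np.radians(FOV_DEG) / 2.0         # max angle off the optical axis

def fisheye_to_equirect(src, out_w=4096, out_h=2048):
    h, w = src.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    radius = min(cx, cy)                      # radius of the fisheye image circle

    # Longitude/latitude for every output (equirectangular) pixel.
    lon = (np.arange(out_w) / out_w - 0.5) * 2.0 * np.pi
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Unit direction vectors; the camera's optical axis is +z.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    theta = np.arccos(np.clip(z, -1.0, 1.0))  # angle off the optical axis
    phi = np.arctan2(y, x)                    # rotation around the axis

    # Equidistant mapping: radius in the image circle grows linearly with theta.
    r = radius * theta / THETA_MAX
    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy - r * np.sin(phi)).astype(np.float32)

    # Directions beyond the 235-degree cone have no source pixel; push them
    # off-image so cv2.remap fills them with black.
    invalid = theta > THETA_MAX
    map_x[invalid] = -1
    map_y[invalid] = -1

    return cv2.remap(src, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)

frame = cv2.imread("sp360_frame.png")         # one extracted frame (placeholder)
cv2.imwrite("equirect_frame.png", fisheye_to_equirect(frame))
```

Running a remap like this per frame on a losslessly exported image sequence sidesteps the extra MP4 compression pass entirely.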
It’s been fascinating to see IMERSA evolve and mature over the last few years. And since things are moving so fast, I wanted to document the challenges being faced in the immersive community with a series of interviews.
Last year at IMERSA there were nervous murmurings of VR, but this year there is clear excitement. It’s particularly interesting to see fulldome producers realize that they already possess the tools, skills, and ultra high resolution workflows to create polished VR experiences.
All too often I need to precisely convert between timecode and frames, or I want a rough time estimate of a render job on the farm. I needed a utility that allows for intuitive interactivity and GUI creation, and that is instantly editable. So I’m using a simple open source solution: Pure Data.
Pure Data (aka Pd) is a realtime graphical programming environment that is mainly used in live music performances. But its realtime aspect makes it ideal for my needs. So with some careful planning I applied the math nodes to create tools that automate the conversion process. Basically I’ve created a calculator tailored for animators and filmmakers (the underlying math is sketched below).
collection of Pd patches
— Requires Pure Data to be installed. (supports: Windows, Mac, Linux)
— It’s easy to use. Just click and drag on the number boxes to input your timecode or frames.
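For anyone curious, the math the patches wire together is nothing exotic. Here’s the same non-drop-frame timecode conversion as a minimal Python sketch (function names are my own):

```python
# timecode <-> frames -- the same math the Pd patches wire together.
# Assumes non-drop-frame timecode; fps is whatever your project uses (e.g. 30).

def timecode_to_frames(tc, fps):
    """'HH:MM:SS:FF' -> absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps):
    """Absolute frame count -> 'HH:MM:SS:FF'."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

print(timecode_to_frames("00:02:30:15", 30))   # -> 4515
print(frames_to_timecode(4515, 30))            # -> 00:02:30:15
```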
We are going to be at the upcoming IMERSA Summit, sharing several presentations. With so much happening lately in the immersive community, it’s bound to be an exciting conference this year. David, Heather, and I will each be on different panels and giving talks. Hope you can check out what we’ve been working on!
Image Source: A Slower Speed of Light
— February 11, 2016 /// 7:15pm
— Gerd Kortemeyer, PhD, associate professor of physics at Michigan State University
Have you ever wanted to experience the complete distortion of time and space as we know it? The Charles Hayden Planetarium has partnered with the MIT Game Lab to immerse you in a virtual special relativity playground where you’ll witness the laws of physics in a completely new way. Using the power of video games, we’ll turn Einstein’s most famous theory from an abstract concept into something you can encounter yourself right here at the Museum of Science. Experience the effects of movement, time, and space as you’ve never been able to before!
In producing From Dream to Discovery: Inside NASA we made the exciting but perilous decision to include the New Horizons mission within our story. So we made an early bet that the mission would be a success…
As you well know, New Horizons has given us an amazing close-up look at Pluto. And so we are excited to announce that we have updated the show to include the latest real images of Pluto and Charon!