360° Video Fundamentals

I’ve long been excited by the possibility of 4k video in a planetarium dome. And so I was captivated by the recent introduction of a 360° video camera rig with 8192×4096 resolution (which translates to 4k domemaster resolution). It also meant that I could increase the fisheye FOV from 180° to 220° and see the immediate ground surrounding the camera, which in my opinion makes for a heightened immersion experience. So I have spent the last two months experimenting and learning first-hand about the intricacies of shooting 360° video.

The 360Rize PRO10HD is a 3D printed object, meaning it’s one solid piece of plastic that is precisely engineered to fit 10 GoPro cameras into the smallest possible space. It’s printed using aircraft grade plastic, so it’s durable and has been through a strenuous bend test to prove its strength over time.

Currently the 360° video community is tiny and little documentation is available, so I was on my own to figure out the potential problems, shooting subtleties, and overall workflow. This can be a tedious and nerve-wracking process. After all, with 10 GoPro cameras shooting in unison, something is bound to go wrong at some point. So: make plans within plans within plans, theorize contingencies, and take notes on your experience. And now for you brave souls remaining, below are my own findings, tips, and thoughts.

Update: December 13, 2015 – Recently I have been contributing to the Making360 open source book. It’s a collection of solutions and illustrations of common problems in producing immersive video experiences. So if you find this blog post helpful, then Making360 will give you a comprehensive understanding.

Hardware Rundown

Camera & Memory
— PRO10HD Rig
— 360Rize Aluminium Mount
— GoPro HERO3+ Camera – Black Edition (x10)
— Lexar 32GB microSDHC 600x [LSDMI32GBSBNA600R] (x10)
— Manfrotto 496RC2 Ball Head with Quick Release
— ALZO Ball Head Camera Tripod (the included ball head isn’t great, hence the Manfrotto ball head above)
— Manfrotto 200PL-38 RC2 Plate

Extra Batteries & Charging
— Wasabi Power Battery [2-Pack] and Charger (x10)
— PowerSquid D080B-008K (x2)
— 7-Port USB Charger (x2)

— Pelican 1550 Case with Foam
— Fotodiox GT-H3Lens-Cap GoTough (x10)

Stitching & Rendering
— Autopano Giga & Autopano Video Pro [known as AVP in this blog post]
— GeForce GTX 770 4GB
— 360Rize File Data Manager

Big Picture Workflow

There are different mindsets and considerations depending on what action you are performing, so I’ve split this information into 4 main sections.

Setup the Rig

  1. Remove lens caps
  2. Check lens for dust
  3. Put cameras into rig (matching camera # to rig #)
  4. Press wifi-button on each camera
  5. Check that each camera’s recording settings are correct: Example Display

It’s a bit of a trick to get all the GoPro cameras installed without touching the lenses, but having the plastic rig locked onto a tripod helps. I chose to permanently install the aluminium mount so that 7 of the GoPros look at the horizon. Since I shoot for a dome, this can ease stitching pains later on, since the objects of interest are often in these camera zones. This leaves 1 camera pointing straight up and 2 pointing down at the tripod. Interestingly, if a camera fails then I can swap it with camera 9 or 10 and still not lose any essential zones, so there is some redundancy there. I could also remove 9 & 10 altogether and still capture 280° of footage.


Also, removing the cameras from the rig is difficult due to the tightness of the plastic camera clamps. But this isn’t a complaint; it’s actually a good sign that everything is snug and won’t wobble during shooting. To remove the cameras you must use a small flathead screwdriver (included with the 360Rize rig) to gently wedge the plastic tab up from its locked position.

It’s important to number each of the cameras and the memory cards with a sharpie. If a camera begins to act strangely then I can easily trace it back to the source. It also allows me to understand exactly where a camera is within the rig, which can be helpful for when the automatic stitching doesn’t work. This is rare, but can happen if the video frame is of a pure blue sky with no overlapping content to match up.

The first time you set up the cameras, you will need to configure the default video recording settings for each camera by hand. From that point forward the cameras will remember those settings (unless a battery is removed overnight, in which case the settings reset to default!). It is absolutely vital that each camera has the exact same settings.

Camera Settings for Best Resolution (with the PRO10HD Rig)
— Protune: On
— Resolution: 2704×1524
— Aspect Ratio: 16:9
— FPS: 30
— Low Light: Off
— White Balance: Cam RAW


Shooting Workflow

  1. Power ON the cameras with the wifi remote
  2. Check each camera for corrupt symbol (rare occurrence)
  3. Press Record on wifi remote
  4. Quickly check each camera for red blinking recording lights
  5. Motion sync the cameras. Twist tripod stem quickly. VERY IMPORTANT!

What’s the need for motion sync? Well, the wifi remote doesn’t trigger all the cameras at the same moment. So instead AVP must sync the video clips itself, otherwise your footage could be offset on the timeline and look terrible. There are two options: audio sync or motion sync; motion sync is more precise. All you have to do is quickly spin the whole rig back and forth to provide the calibration data for AVP to use later. Audio sync needs a sharp, fast snap to sync to, like a dog training clicker.
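AVP performs this synchronization internally, but the idea is easy to illustrate: the time offset between two recordings can be estimated by cross-correlating a shared signal (a clap, or motion data) and finding the lag with the strongest match. Below is a toy sketch with NumPy using synthetic signals; it only illustrates the principle and is not AVP’s actual algorithm.

```python
import numpy as np

def estimate_offset(sig_a, sig_b):
    """Return the lag (in samples) that best aligns sig_b with sig_a."""
    # Full cross-correlation; the index of the peak gives the offset.
    corr = np.correlate(sig_a, sig_b, mode="full")
    return np.argmax(corr) - (len(sig_b) - 1)

# Synthetic example: a sharp "clap" spike recorded by two cameras,
# where camera B started recording 25 samples later than camera A
# (so the clap appears 25 samples earlier in B's timeline).
rng = np.random.default_rng(0)
clap = np.zeros(1000)
clap[400] = 1.0
cam_a = clap + rng.normal(0, 0.01, 1000)
cam_b = np.roll(clap, -25) + rng.normal(0, 0.01, 1000)

print(estimate_offset(cam_a, cam_b))  # 25: shift B by 25 samples to align with A
```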

A wonderful aspect of using GoPro cameras is that one wifi remote can trigger all 10 cameras. So they can be remotely powered ON/OFF and record start/stop. The remote indicates how many cameras are connected.

Because the camera lenses do not share the same nodal point, you’re undoubtedly going to see parallax errors when stitched. There are some techniques to address this in AVP, but it’ll take time and hand polishing. To combat this upfront, it’s wise to constantly be thinking about parallax when planning a shot. My suggestion is to keep the camera rig at least 10 feet away from everything. That can be tricky and isn’t always an option, but it’s the guideline I try my best to follow. Know that the larger the environment you shoot, the easier your stitch job will be.

The most difficult aspect of shooting for a dome is keeping the camera movement buttery smooth. Any subtle bumps in the footage will be greatly amplified on the dome. I tested a wheelchair and the motion was smooth but not perfectly parallel when moving forward. I also tested a motorized wheelchair and that produced satisfying results when used at the slowest setting. But the real perfection is with the use of a tracked dolly. Yet then you run into the issue of seeing the track within the 360° shot. So I used the Singleman Indie-Dolly with 12 feet of track and that was just short enough to not show within 240°-ish of the footage. It’s incredibly smooth and the smallest movement forward looks stunning in the dome.


Perhaps the most frustrating aspect is the GoPro cameras themselves. GoPros are not exactly known for their dependability. Some things to watch out for:

  • Camera battery life leaves me wishing for more. 1h 30m of shooting time is pretty good, but I haven’t timed it when continually turning the whole rig ON/OFF. So I have a total of 30 batteries charged and ready. I definitely turn off the cameras when not recording, as they eat battery life in stand-by mode. I’ve heard that changing the settings from the default 3 red blinking recording lights down to a single blinking light can help save some battery life too. Leaving the wifi on overnight (blue blinking lights) will drain a battery within 24 hours.
  • Be sure to use microSD cards that are recommended by GoPro. Some other brands don’t write as fast as they advertise, which can lead to the camera randomly stopping a recording due to a full buffer. I’ve had good results with the Lexar 32GB microSDHC 600x: the footage files have a constant data stream of 45 Mbit/s, with no skipped frames. But in the future I’d suggest the 64GB version. Recording at the 2.7k settings allows 1h 30m of footage, so using a 64GB card would double that. Dumping footage takes about 1h 30m of diligent work, and if you’re in the middle of a big shoot, the last thing you want is to be interrupted. I’ve had absolutely no problems with the microSD cards. But if you get an “SD error” on the GoPro, accidentally delete footage, or find a microSD card unreadable, then check out TestDisk.
  • Occasionally a video file will corrupt. You’ll know because the camera display will show the corrupt symbol and you must push a button to continue, so this check is something I’ve added into the shooting workflow. It’s not entirely clear what causes it, other than that these high resolutions have a high data rate and sometimes the header doesn’t get written to the file. Luckily the video data can be repaired once transferred to a computer. I’ve yet to have a case where the video data was lost.
  • The cameras are not up to par when it comes to long duration shots. A continuous shot of 30 minutes would make me nervous. I had one occasion where a camera froze; meaning the firmware froze up and the battery had to be removed to reset it. And you’ll know it’s frozen, as no buttons will function. Either it won’t power on, won’t begin recording, or stops recording mid-shot. So as a precaution, I am always checking for the red blinking recording lights at the beginning and ending of a shot. Upon freezing, the red recording lights stop blinking and then it is no longer recording video. But any footage prior to the freeze is retained (though it might be corrupted).
  • If the footage count on the front camera display isn’t identical between all of the cameras, then that camera wasn’t recording during the shot and that channel will be missing for stitching. Either the camera didn’t trigger via the wifi remote or its firmware froze. So again, always check for the red blinking recording lights at the beginning and end of a shot. It’s obviously wise to note these mishaps in your shot log.
  • Even with all that said, they perform pretty well. For example, out of 57 total shots: 4 shots had corrupted files, all of which were later repaired in full. 1 essential camera had a battery die and lost half the shot. Twice a camera wouldn’t sync with the remote, but it was a non-essential camera facing the tripod and the shot recorded fine without it (it needed a battery reset). And once an essential camera froze up after I had started recording, making the shot unusable. So obviously, get multiple takes of each shot!
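The card capacities and runtimes in the list above follow directly from the data rate. Here is a quick sanity-check calculator, assuming the ~45 Mbit/s stream mentioned in the microSD note (actual Protune bitrates vary with scene content):

```python
def recording_minutes(card_gb, mbit_per_sec=45):
    """Approximate recording time for one camera on one card."""
    card_bits = card_gb * 8e9              # GB -> bits (decimal GB, as cards are sold)
    return card_bits / (mbit_per_sec * 1e6) / 60

print(round(recording_minutes(32)))  # ~95 minutes on a 32GB card, close to the 1h 30m above
print(round(recording_minutes(64)))  # ~190 minutes on a 64GB card, roughly double
```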

Shooting for a dome means that I don’t actually need the complete 360° of footage. So I use a tripod to increase stabilization. Some people use a monopod that has tiny tripod legs. That is useful to capture as much FOV as possible and actually opens up the possibility of complete 360° with some trickery in post-production to hide the monopod itself. But in moving shots, a tripod is required.

Keep a shot log while shooting on location! It will be invaluable since you will undoubtedly forget the shot locations, problems, noted bumps/wobbles, bad takes, reminders, and such. Be diligent about this, as having a shot log will make your importing process much smoother.

Each of the cameras is locked to auto-expose. And I mean permanently locked off from user control; it’s just the way GoPros have always been. But luckily AVP is extremely powerful in its exposure/color correction. Yet it can be difficult to deal with when you’re very close to a bright light source and only one camera has a wildly different exposure value. Even still, AVP can correct across pretty serious exposure differences, since the GoPro cameras have excellent latitude.

An exciting option is to shoot in slow motion at 240 FPS. Using this FPS will reduce your end output to 1k fisheye (2x1k spherical). Or shoot at 120 FPS and output to 2k fisheye (4x2k spherical).

The bottom of the 360Rize aluminum mount has a 3/8″ screw hole (contrary to the typical 1/4″), so you will need a special quick release plate to put it on any tripod.

Battery Runtime Test
— Official battery, using wifi remote, recorded 2.7k footage for 1h 32m
— Wasabi battery, using wifi remote, recorded 2.7k footage for 1h 19m

Import & Check the Footage

Wherever you shoot, you’re going to need a laptop and external hard drive to dump the video data. For each hour of video taken, you will need about 340GB of free space.
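That 340GB/hour figure is essentially the per-camera data rate multiplied across 10 cameras. A quick drive-sizing sketch; the per-camera number here is an assumption derived from the figure above, so adjust it to your own measured bitrate:

```python
def shoot_storage_gb(hours, cameras=10, gb_per_camera_hour=34):
    """Rough disk space needed to dump a shoot.

    gb_per_camera_hour is an assumption backed out of the ~340GB/hour
    figure for a 10-camera rig; measure your own footage to refine it.
    """
    return hours * cameras * gb_per_camera_hour

print(shoot_storage_gb(1))  # 340 GB for one hour of rig footage
print(shoot_storage_gb(3))  # 1020 GB: a 3-hour shoot needs roughly a 1TB drive free
```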

Using an external USB 3 microSD memory card reader is an absolute requirement. Since each card can hold 32GB, you’ll want to be able to dump the data quickly, and dumping the files straight from the GoPro cameras isn’t an option since the USB 2 speeds are too slow. Even at USB 3 speeds, it still takes about 1h 30m to dump all the microSD cards to disk.

Since each camera creates footage with similar filenames, it’s imperative that you rename the video files as they are copied to your computer; otherwise you risk overwriting data. But renaming by hand isn’t needed if you use the 360Rize File Data Manager to dump the video files for you. I underestimated the usefulness of this command line tool. It automates the process of appending the camera number and project name to the video filename, and then it dumps the footage into a folder. Here are quick instructions for its use. Don’t worry, although it doesn’t have a GUI, all you need to know is how to CD via the command line prompt.

Since writing this article, there is now software which automates the process of copying the video files from multiple SD cards and organizing them into folders for each take: Action Cam Importer & 360CamMan

Finally it’s time to see how the footage actually turned out. Hopefully you have kept a shot log to help inform this process. But I always create a spreadsheet and keep track of the following per shot:

  • Concatenate needed? Due to the FAT32 memory card format, any shot that reaches 3.8GB will be automatically split into separate video files. This tool makes it MUCH easier: iFFmpeg (Mac), myFFmpeg (Windows)
  • Corrupted video file? Infrequently a video file will corrupt, meaning the header wasn’t written to the file. Luckily the video footage can be repaired.
  • Missing channels? Rarely a GoPro camera will freeze up: either it won’t start recording or it will freeze during a shot. Hence the need to check for the blinking record lights at the beginning and end of a shot.
  • Shot length match between cameras? If a camera battery dies in the middle of a shot, then you’ll still get the footage up to a certain point for that camera. But obviously the rest of the cameras will continue recording.
  • Shots organized? Each collection of 10 related video files needs to be placed into its own folder to keep individual shots organized for stitching. So from the huge list of dumped files you must figure out which files belong to which shot. This is definitely tedious, and especially confusing with files that need to be concatenated. My shortcut is to sort the files by Date Modified and have the Length attribute visible.
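The Date Modified shortcut in that last point can itself be scripted: files whose modification times cluster together almost certainly belong to the same shot, since all 10 cameras stop within seconds of each other. A sketch of that grouping; the 60-second gap threshold is my own assumption, so tune it to your shooting pace:

```python
def group_shots(files_with_mtimes, window_sec=60):
    """Group (filename, mtime) pairs into shots.

    A file landing within window_sec of the previous file is assumed to
    belong to the same shot; a larger gap starts a new shot.
    """
    ordered = sorted(files_with_mtimes, key=lambda fm: fm[1])
    shots, current, last_t = [], [], None
    for name, t in ordered:
        if last_t is not None and t - last_t > window_sec:
            shots.append(current)
            current = []
        current.append(name)
        last_t = t
    if current:
        shots.append(current)
    return shots

# Two shots: 10 files stopping around t=0..9s, then 10 more around t=300..309s
files = [(f"cam{i:02d}_a.MP4", i) for i in range(10)] + \
        [(f"cam{i:02d}_b.MP4", 300 + i) for i in range(10)]
print([len(s) for s in group_shots(files)])  # [10, 10]
```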

Stitch & Render

Finally you have all your shots organized and ready to stitch in AVP. What’s amazing is that you don’t need an AVP template for the specific camera rig layout. Just drop all the videos into AVP, synchronize, and stitch. It uses the overlapping content between each video to automatically create a 360° projection. It’s not always perfect, but it’s damn good most of the time.

Here are a few sources that were helpful in learning Autopano Video.
Autopano Video Wiki Documentation
Autopano Video 101
Autopano Video Forums
360Rize FAQ

Final Render Resolution
AVP can render whatever type of projection you need to a frame sequence. Full resolution examples below. (These are not polished renders, just the initial automatic AVP stitch.)
— Spherical: 8192x4096px
— Fisheye (FOV 180°): 4096x4096px
— Fisheye (FOV 220°): 5000x5000px

I was happily surprised that upon increasing the FOV to 220°, the final render resolution increases to 5k. This is possible because more footage is viewable at a higher FOV, so you are effectively “squishing” more pixels into the fisheye image. When I project it onto the dome at 4k, I get the added benefit of increased sharpness since the video has been downscaled.
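The resolution bump also makes sense geometrically: in an equirectangular render, pixels map linearly to degrees, and the vertical axis spans 180°, so a fisheye’s pixel diameter scales with its FOV. A back-of-the-envelope check; this linear-mapping simplification happens to match the render sizes above, and is not a claim about AVP’s internals:

```python
def fisheye_diameter(equirect_height, fov_deg):
    """Estimate fisheye render diameter in pixels.

    An equirectangular frame spans 180 degrees vertically, so
    pixels-per-degree = height / 180; the fisheye diameter spans fov_deg.
    """
    return round(equirect_height * fov_deg / 180)

print(fisheye_diameter(4096, 180))  # 4096, matching the 180° fisheye render above
print(fisheye_diameter(4096, 220))  # 5006, close to the 5000x5000 render above
```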

I render using the spherical projection because often I want to add slow movement to the virtual camera in After Effects with the Navegar Fulldome Plugin.

The only difference between Autopano Video and Autopano Video Pro is the GPU render option, which makes a huge difference in render times. When rendering at these high resolutions, you need a graphics card with lots of memory. The GeForce GTX 770 4GB has a decent number of cores, lots of RAM, and a good price point.

Final Render Statistics at Full Resolution (Spherical)
— CPU: 0.05 render fps – takes 12 hours to render 1m 30s of video.
— GPU: 1.5 render fps – takes 1 hour to render 1m 30s of video. (GeForce GTX 770)

Right when you get the footage, you’ll just want to see a 1k preview quickly. The good news is that fisheye rendering at 1k in AVP is quite fast: about 10fps on the GPU.

As of AVP 1.6, they have implemented a stabilization algorithm to help smooth out bumps. It definitely reduces bumps by around 50%, and I read that it will be improved over the next few versions. They are also adding a timeline which can keyframe cameras and color, which means you can interpolate between stitches. That is powerful when trying to fix parallax in a moving dolly shot. It’s also very helpful when trying to fix an unruly camera whose exposure is way off from the others (such as a camera pointing at the ceiling).

When using GoPro cameras, the stitching can be optimized: the camera sensors aren’t perfectly aligned/centered with the lenses, so each camera has its own lens distortion model. You can pass AVP this information in the Lens Distortion Correction settings. As of version 1.6 the settings have been moved and updated.

The video and render examples showcased in this blog post do not yet have their stitches hand polished in AVP. They only use the initial automatic stitch because I just wanted to see the results quickly. So any obvious seams and parallax errors you see could be fixed with some hand polishing. I’m still learning how to best use the software. But you need to run a lot of experiments to get polished results, and possibly even composite multiple AVP renders in After Effects to get the most seamless result.

The GoPro cameras do a great job of capturing color and have excellent exposure latitude. But a photo filter in After Effects is often needed to color correct for tungsten lights or daylight. Then the colors need a big saturation boost to look optimal in the dome. Curves can be used to enhance dark colors instead of adding contrast. And maybe a slight touch of sharpening.

This AVP explanation could easily be a whole write-up of its own. Perhaps in the future I’ll share my experience once I’ve learned more. I’m currently in the process of stitching 57 shots (900GB total unprocessed) taken at the NASA Goddard Space Flight Center. It was an intense 3 day shoot and we got some amazing 360° shots that I can’t wait for you to see in our upcoming planetarium show production.

Lastly, if you’re shooting with the stereoscopic 360Rize rig, then you will have a slightly different workflow in AVP. I don’t even want to fathom the bizarre difficulties with 3D 360 video.

Both the normal and stereoscopic 360 videos can easily be experienced on the Oculus Rift (using Kolor Eyes, which is free).

And that’s the Gist of it

As you can see, it’s not for the faint of heart, as it demands a pretty wide range of technical knowledge. But the results can be simply stunning. This is a newborn medium and there is a lot of room to grow. It’s an exciting time for fulldome show producers, hyperdome possibilities, and the Oculus Rift community. 4k video in the dome has long been a dream and I think this is a bold step into a fascinating frontier.

Scuba Diving with a 360 Video Rig
Update: January 19, 2016


I’ve recently been working with Allan Adams (Associate Professor of Physics at MIT) and Keith Ellenbogen (Acclaimed Underwater Photographer) to create a planetarium experience of their underwater scuba dives. They have been using 360 video as a way to immerse audiences in these majestic underwater worlds. So we had been discussing some of the problems they experienced during a test dive inside the Giant Ocean Tank of the New England Aquarium. There were two main problems: triggering the cameras to record, and the cameras overheating. So I did some testing and came up with a specific shooting workflow for scuba diving with the 360Rize 360Abyss rig. These tips are probably specific to the older plastic core housing version, not the recently released v4 redesign which has an aluminum core housing.

Triggering the Cameras to Record
Preparing the rig for a shoot is not to be rushed. It takes about 20 minutes to install the cameras, screw in the dome doors on the rig, and be ready to go. And since the scuba rig has many small screws, delicate domes, and electronic cameras… well, once you get on the boat, the last thing you want to do is open up the scuba rig. Just imagine the rocking of the boat, cramped space, busy people, weather and mist and splashing water. So that really limits how you can trigger the cameras to record.

Built into the 360Abyss rig are physical buttons that let you press the power button and turn on the cameras even while underwater. GoPro cameras also have the useful ‘QuikCapture’ feature, which powers on the camera and automatically starts recording with one press of a button. On the GoPro Hero 3+ that one press could be the power button. But on the GoPro Hero 4, the feature was changed to only function via the record button, not the power button. So if you’re using the GoPro Hero 4 then you must completely rely on the wifi remote (since the scuba rig only gives us access to the power button). Luckily the wifi remote is rather reliable, even when the cameras are installed within the scuba rig. Though even if the wifi remote were in a special housing, I doubt its signal would be powerful enough underwater. So we are really left with one style of shooting: hit record on the boat and then jump in the water.

But first I did a specific test. I started the 6 cameras recording with the wifi remote, then walked across the building with just the remote until it was no longer linked through wifi. The cameras continued recording even though the wifi link had broken. So this confirms that you can safely power on the cameras when you reach the diving point, hit record, jump in the water, and leave the wifi remote on the boat without worrying.

Overheating Cameras
In the test dives, the cameras were overheating before the batteries died. Which makes sense, as the cameras are shooting at very high resolutions (2.7k 4:3, 30fps) and so they generate lots of heat. And seeing as the GoPro cameras mainly rely on air movement to cool down, when you put the camera in a small enclosed space… it’ll overheat and shut down. And I wasn’t thrilled about the idea of dropping to a lower resolution because of this issue.

So I investigated the two door options that are included with the 360Abyss: plastic and metal. I’m guessing that the door selection affects what kind of buoyancy you desire, since the metal doors are clearly heavier. But in reality the metal doors are absolutely vital. In my tests the plastic door definitely acts as an insulator and traps heat in, while the metal door acts as a radiator: the front face of the camera presses up against the metal door, which helps move heat away. While testing I would periodically touch the door face, and the metal door was superior at radiating heat. Underwater, the heat from the doors would be cooled much more efficiently. Even without being underwater, using the metal doors I could record until the batteries died. Yet if I used the plastic doors, the cameras would overheat.

Scuba Shooting Workflow: Preparing for the Dive
1. On dry land: quickly check that all cameras have matching settings. Install the cameras into the scuba rig. Make absolutely sure that the wifi is on (the blinking blue light). Screw in the glass domes.
2. Jump on the boat and travel to your location on the water.
3. When fully suited up and ready to dive, then use the wifi remote to power on the cameras. It may take 1 or 2 minutes for the remote to report all 6 cameras. When ready, hit the record button. Visually confirm that each camera has the red recording lights blinking. Give the camera rig a few quick twists (for post-production syncing). Leave the remote on the boat.
4. Go scuba diving!


Update: June 5, 2016 – I just stumbled across some excellent documentation of Camera Settings for GoPro 360 VR Rigs. It includes excellent examples of different camera settings and some solid tips.

Update: January 23, 2016 – I recently updated the firmware on each of my GoPro Hero 3+ cameras (x10). During the process one of the cameras failed and it has since been unusable. It’s officially bricked (not just stuck). Upon powering on the camera, both red LEDs illuminate solid (no flashing) and the display remains blank. From that moment onward no buttons function and the only way to shut it down is by pulling out the battery.

I should note that this specific camera hadn’t been completely reliable when shooting on location. Sometimes it would ignore the wifi remote, or on rare occasions it would freeze. It was a camera we would particularly keep an eye on, since it threw problems more often than any other. So it seems there were some early warning signs. But it’s still surprising for it to be usable one day and utterly unresponsive the next.

I’ve tried all sorts of tips on how to un-brick a GoPro camera: suggestions from forums, tutorials… and even GoPro customer service acknowledges that the camera is dead. It’s past the one-year warranty, so I’m out of luck. I’m sharing this so others are aware that updating your firmware can be a potentially scary process.

Update: January 12, 2016 – After much 360 shooting and traveling, our PRO10HD rig is starting to get weathered. Recently one of the weak points on the arm clamp has snapped. Yet the clamp is still functional and is able to maintain a secure hold on the camera. But seeing as how we have many more plans for 360 video, we decided to upgrade to the new version of the PRO10HD.

The upgraded PRO10HD features include:
— 3/8″ solid brass threaded inserts (the prior version just had straight 3D printed plastic that could easily be stripped)
— New stronger 3D printed material (hopefully this solves the broken arm clamp issue)
— Stronger holder arms for securing cameras (seems like the arm clamps are now slightly thicker)
— Larger GoPro port openings for reaching the HDMI and USB ports (will be useful for extracting the microSD cards)
— Unit now compatible with HDMI cabling (not useful for me, but essential for those wanting to do live streaming)
— The two cameras facing down (#9 and #10) have been given an added tilt. This can easily be seen in the PRO10HD comparison. It makes sense, as the cameras now have more coverage and less unnecessary overlap. I have always wondered about this and am happy to see the change implemented.
— Let’s say the worst happens and a camera completely dies while shooting on location. The upgraded PRO10HD rig has an added tripod mount that allows you to instead shoot partial 360 video (coverage reduced to 360×120° instead of a full 360×180°) and only be missing the tripod area. This is possible by removing camera #1 and using the revealed mount that is typically hidden by that camera. This has been a great fail-safe to have, just in case. And I’ve even had to fall back on this option once.


Update: November 17, 2015 – It’s fascinating to see all the different approaches to 360° video. So I surveyed the current 360° video rigs being offered and then organized every serious option into an epic listing: Collection of 360° Video Rigs. The results are telling…

Update: October 3, 2015 – Disney Research Zurich has been doing some very interesting experiments into optimized stitching of 360 video. (Or read the full paper.) Though I would really like to see their algorithm perform against 360 video with some major parallax errors, including objects within 6 feet of the rig WHILE having the rig moving. Even still, it’s pretty impressive results. Hopefully we can see this implemented into commercial software soon.

Update: June 17, 2015 – Some interesting news! GoPro has acquired Kolor. So 360 video is definitely maturing into something interesting. Perhaps GoPro will eventually release their own 360 video equipment, but that’s my own pure speculation.

Update: March 27, 2015 – Kolor has created a video explaining the complete 360 video workflow. It covers everything from camera setup, shooting, stitching, and rendering. It is an introduction to much of the necessary knowledge for creating 360 videos.

Update: February 19, 2015 – MewPro is an Arduino Pro Mini board which can reprogram/control almost all of the GoPro3+ functionality. The Orangkucing Lab has been experimenting with an interesting solution for automatically syncing GoPro cameras. By hooking the MewPro to the GoPro Dual Hero System, you can genlock multiple cameras. (Genlocking is a technique where the video output of one source is used to synchronize other video sources.) So the jello effect might soon be a thing of the past. The jello effect happens when there is a sudden bump or vibration: since the GoPro cameras aren’t strictly hardware synced, each camera records a slightly different moment in time. To simplify, the FPS isn’t globally locked. When you later stitch it all together, you will briefly see the seams between the video sources during the moments of bumps/vibration. Hence each of the video sources looks like jello. And up till now this hasn’t been addressable.

Update: November 21, 2014 – Looking for a simple description of 360 video and how it works? David Rabkin and I wrote a paper that was presented at the 2014 GLPA conference: video of the presentation, paper (2014). Also here is David’s updated paper (2016) with our latest work. So if you want the basics without getting too technical, then this is your best bet. Full cost assessments: PRO10HD or PRO7.

Update: November 20, 2014 – Here is an interesting roundup of 4k video cameras in the marketplace. It seems that 360 video is still the best option for creating 4k domemasters, especially since 4k video cameras still don’t have sensors with at least 4k of vertical resolution. So we are still a number of years away from 8k video cameras being mainstream. Yet it’s quite interesting to see 4k become a standard within video camera equipment.

Update: October 30, 2014 – If you are wondering about the new GoPro Hero4 camera… Having each camera shoot at 4k 30fps is amazing, but it comes at a cost. Results from the 360 video community suggest that you can only shoot for 30 minutes before needing to cool down the cameras. And sometimes the Hero4 camera will forcefully shut down due to overheating.

Update: March 12, 2014 – The GoPro cameras have a firmware update available that allows you to shift the automatic exposure through the iPhone app. I’m curious to experiment, though I’m cautious of the added workflow of getting the optimal exposure and updating ALL cameras’ settings for each shot. Also, while it’s too crazy for me (since it voids your warranty), there are ways to hack a GoPro camera to truly lock down the exposure (AutoexecHack scripts).

83 thoughts on “360° Video Fundamentals”

  1. Great tips and thank you for taking the time to post this. I just started using the Freedom360 5 cam rig with VideoStitch and PTGui and can vouch for it being a “tedious and nerve-wracking process”! I am wondering if Autopano is better? Did you compare the two?

    • It’s my pleasure to share! I first began using Autopano a few years ago to stitch panorama photos I’d taken with my iPhone. So trying out Autopano Video Pro was a natural thing to experiment with when we got the 360 rig. But I haven’t played with VideoStitch too much. VideoStitch has the preferred timeline keyframes to manually correct for exposure. But the recent beta versions of Autopano Video have added keyframes for exposure, color, and also morphing between stitches to address parallax in moving shots. And now the GoPro camera exposures can be locked down… (new firmware announced a few days ago). Wishing your 360 shooting all the best!

  2. good step by step tutorial… compared to yours, my experience with 360hero wasn’t a good one. i bought the h3pro6 last winter. then i realized that my 6 gopro 3+ silver edition cameras weren’t a good match for this rig, because the silver model can’t shoot 1080p in 4:3 (its 1080p mode is 16:9 only). i had to downgrade the video quality to 720p in order to get a 4:3 aspect. Next, AVP wasn’t able to stitch the 6 output videos properly either. It seems the overlapping areas of each video weren’t enough to stitch together… I have been emailing back and forth with 360hero about my issue but they are always too busy to reply 😦
    i have been told some 360hero users experienced issues like mine but i haven’t been able to find anyone discussing it so far.

  3. Just saw the Freedom360 rig demoed at Imersa. (Missed you there this year) Very impressive results, even when compared to Red fisheye rig. Excited to run some of my own test shoots when I can get my hands on a rig.

  4. Here is a question from Reddit.com (user: Drewkungfu) that I’m reposting here for posterity:

    I’m trying to figure out how the USB hub is used in your workflow. I imagine that it is for data transfer from the micro SD cards to a hard drive. But perhaps an array of USB micro SD card readers is missing from your list. Do you dump all 10 simultaneously, or one-by-one as the 360Rize File Data Manager tutorial shows? If transferring one-by-one, then what is the point of the hub? It also lists the hub as a charger, but isn’t that covered by the power squid & Wasabi chargers?

    Also is the 360Rize File Data Manager worth using? It appears that it simply renames the files so that projects don’t get confusing. However, is it really so difficult to rename/drag/drop into a folder?


    Good questions!

    The USB hub is simply for charging all the GoPro official batteries (in camera) at once from an outlet. It’s not for data transfer at all. Using an array of USB 3.0 readers all at once would slow down transfer speeds too much. I dump the footage one MicroSD card at a time.

    I was disappointed to find that the official GoPro batteries won’t fit into the Wasabi chargers. There is a notch that prevents it. The Wasabi chargers can only charge Wasabi batteries. I got the Powersquids to ensure that I use all the Wasabi chargers without hassle. So the GoPro official batteries always charge in camera via the iPhone USB power adapters.

    Yes, I’d say the 360Rize File Data Manager is absolutely essential. It’s not really about difficulty, but about using your time wisely. You will be dumping footage from 10 cameras, so the possibility that two files will have the same filename is high. Sure, you could rename each file by hand… But look at it this way: if you have taken 15 shots with 10 cameras, that’s 150 footage files total. And I’d rather not introduce any sort of human error into footage filenames just from dumping the data. Or worse yet, overwrite and lose footage channels. So that’s my soapbox. Choose at your own peril! 😛
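    As a sketch of what that renaming protects against, here is a hypothetical Python version of the dump step. The DCIM card layout and the shotNN_camNN_ naming scheme are my own assumptions for illustration, not the actual behavior of the 360Rize tool:

```python
import shutil
from pathlib import Path

def dump_camera(card_root, dest_root, shot, cam):
    """Copy one camera's clips into the project with collision-proof names.

    Hypothetical sketch: assumes clips live under DCIM/<dir>/*.MP4 on the
    card, and invents a shotNN_camNN_ prefix so ten cards full of files
    all named GOPR0001.MP4 can never overwrite each other.
    """
    dest = Path(dest_root) / f"shot{shot:02d}"
    dest.mkdir(parents=True, exist_ok=True)
    for clip in sorted(Path(card_root).glob("DCIM/*/*.MP4")):
        target = dest / f"shot{shot:02d}_cam{cam:02d}_{clip.name}"
        if target.exists():
            # Never silently overwrite footage -- losing a channel ruins the stitch.
            raise FileExistsError(target)
        shutil.copy2(clip, target)
    return dest
```

    Raising instead of overwriting is the whole point: with 150 files per shoot, a silent clobber is far worse than a halted copy.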

    Also, to get USB 3 data transfer speeds, you’ll need to plug the MicroSD USB 3.0 reader (included with the Lexar MicroSD card) into a blue USB slot on your computer. If your computer doesn’t have a blue USB slot, then you’ll have to install a new PCI card to enable this.

  5. Great presentation and well done! Send me an email and I’ll get you a free version of the 360CamMan. 360CamMan is a new data file management software that is a huge time saver in work flow and file management.

  6. No, unfortunately exposure lock doesn’t lock exposure in video mode; it is exposure shift! So you can darken or lighten your video, but it’s still automatic! BUT. Some clever guy found a script which really locks the exposure: search for the user chernowii!

  7. Also, syncing with audio is not accurate at all. Sometimes my GoPros are 2 frames delayed, sometimes 3 frames, when syncing with audio. Forget it, use the motion method!

  8. We decided to buy a Scarlet-X Dragon with a Sunnex lens that can make a 2.7k domemaster, and we need your opinion on which is best. I know the hero rig can output over 4k, but on the other side there is the stitching problem and the hard workflow, and whether the Scarlet is worth its price. We tested your hero images on our planetarium, and tested images from a Scarlet-X without Dragon, and found that the hero images were a bit higher quality in some footage (not all). We could not find any Scarlet-X Dragon footage to test on the dome. Please help us to make the right decision.

    • Well I haven’t shot with a Red camera before, but I have used a Nikon with a fisheye for timelapse. So the concept of shooting with a fisheye is something I can relate to.

      For me, capturing just 180° of footage can be frustrating. Especially since I have a concentric dome and the horizon is not tilted. When shooting with a fisheye lens, you must be quite delicate in choosing the correct angle for the shot, since it can’t be adjusted in post-production. I’m convinced that it’s often helpful for the viewer to see 220° of footage on the dome and therefore see much more of the immediate ground around the camera. It improves dome immersion.

      But I understand your hesitancy to deal with the weird and tedious 360Rize workflow. In my opinion, the sometimes-visible stitching seams are the only negative aspect of the 360Rize rig. But if you are careful in planning shots then these can be minimized.

      Shooting with the Red camera rig is going to be EXPENSIVE. Just the brain: Dragon $14,500 / Scarlet-X $10,150. That doesn’t include anything else other than the camera sensor body. It’s a high-end professional camera rig, so you’ll need to know how to properly expose.

      I’m a little confused that you’re not consistently seeing a dome quality difference between the 360Rize 5k render and the Red 2.7k. Image details will definitely not be as sharp. Sometimes it’s difficult to see with still images, but with moving video it makes all the difference.

      But if detail quality is of less importance and you just want really quick’n’easy video on the dome, then there is another option. Dome3D has been custom installing a fisheye lens onto a GoPro camera. The end domemaster resolution is 1.5k at 30fps. http://www.dome3d.com/products/fulldome-video-cameras

  9. Thanks Jason for your work, it’s very useful. May I ask two questions about AVP from Kolor? I’m unable to reduce the angle in height; I always get the whole sphere with the big black marks. How can I do that? And for the zoom, is it possible to reduce it? Thanks a lot if you can solve my problem, because nobody can do it on the Kolor forum. Best wishes for your productions…

    • Good to hear the article has been useful. Thanks.

      I don’t have enough information to answer your questions and I need to know more to be of any help.
      — What type of 360 video rig are you using? (Example: I use the 360Rize H3pro10HD)
      — What projection type are you outputting to? (Example: Spherical or Fisheye)
      — Can I see an example render/screenshot of the “sphere with big black marks”?
      — I don’t quite understand what you mean that you’re “unable to reduce the angle in height”. Are you trying to adjust the placement of a single video source? Or do you mean adjusting the overall fisheye FOV?
      — Again, I’m not quite sure what you mean. By zoom, do you mean enlarge a single video source? ‘Zoom’ is a confusing word in 360 video.

  10. Pingback: Fulldome Opportunities and Resources for Artists | Art & Emerging Technology

  11. Hello
    You have mentioned
    Camera Settings for Best Resolution
    – Resolution: 2704 × 1524

    But the 360Rize proposal is 1440 at 48FPS.

    Would you still suggest it?

    • Well there are differing perspectives and needs here. Do you need higher resolution or 48fps? 360Rize suggests 1440 48fps because that is the highest resolution option that still offers 48fps. And 48fps can reduce any time offset between the cameras, since there are more available frames per second to stitch together. But I’m filming for a planetarium and my dome has a resolution of 4096×4096. So the end resolution is very important to me and I instead shoot at 2.7k 30fps.
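      The resolution side of that tradeoff can be sketched as a rule of thumb: a domemaster D pixels across spans 180° along its diameter, so a full 360×180 equirectangular stitch at the same angular density needs to be 2D × D. A quick Python sanity check (an approximation, not an exact resampling calculation):

```python
def equirect_for_dome(dome_px):
    # A domemaster that is dome_px pixels across spans 180 degrees along
    # its diameter, i.e. dome_px / 180 pixels per degree. A full 360x180
    # equirectangular frame at the same angular density is therefore
    # (2 * dome_px) x dome_px. (Rule of thumb; real resamplers differ.)
    return 2 * dome_px, dome_px
```

      For my 4096×4096 dome that gives an 8192×4096 stitched target, which is why end resolution outweighs frame rate for me.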

  12. Awesome post man, very useful! I’ve been trying to use the H3Pro10HD rig myself, however I’ve been having some issues with the stitching process. I’ve tried setting the GoPros at 1080p @ 30fps, but when I tried stitching the videos, AVP kinda just overlapped all the files in the same place. Have you experienced anything similar?

  13. Great post! It was extremely reassuring for me. I’ve been working with the H3Pro6 and AVP and found all the same issues and obstacles you describe above. Lots of great tips in there that will help on future shoots. The real big difference is I’m doing full 360 versus the dome, so camera movement is a larger challenge (without seeing a person or device).

    Also we’re hoping to implement the footage into Oculus and Samsung Gear. As well as attach the 360 mount to a drone for aerial footage.

    • I’m happy to hear it was helpful. The frontier is wide open and there is much experimenting to be done! Indeed shooting 360 video for VR is much more challenging for camera movement and keeping the operator invisible. Wishing you all the best!

  14. Hi Jason
    Very interesting and helpful documentation, thanks a lot. Do you have any experience merging text or graphics into a 360 video (to resolve the distortion problem)? I will be thankful for any help.

    • Good to hear it’s been helpful! There are a few different techniques for adding text/graphics into a 360 video.

      1. The easiest method is to add a patch directly in Autopano Video. This is mostly useful for covering up the tripod or for adding a logo into the 360 video. http://www.kolor.com/wiki-en/action/view/Patching_the_nadir?action=view&title=Patching_the_nadir

      2. But if you want to actually composite different text/graphics together and still account for the spherical projection, then there are two options:
      — Nuke (The Foundry) – use the ‘Spherical Transform Node’
      — After Effects (Adobe) – Navegar Fulldome Plugin

      3. Or if you want to motion track the shot and add VFX elements into the 360 video, then you can use Maya or 3DS Max. Then install the Domemaster3D toolset and use the LatLong lens shader to render spherical shots. But getting your footage tracked and then matchmoved in the Maya scene is going to be very tricky, since software doesn’t yet exist to track 360 video. But if the camera work is simple enough, then you can do it by hand… I’ve done it with satisfying results. https://github.com/zicher3d-org/domemaster-stereo-shader

  15. Hi Jason,

    Really good write-up, but it’s been almost a year since you wrote it. Do you think you would be able to advise me on what my best bet would be to shoot 360 degree videos for YouTube consumption? I want to shoot historic monuments of world heritage sites…

    Best Regards,

  16. Hey jason

    I want to start shooting 360 video. Can you suggest which rig is good, Freedom360 or 360Rize, and which model, H3PRO10HD or H3PRO6? And which stitching software is good among the three that are available now?

    Thx in advance

    • Chances are that you don’t need to shoot with the H3PRO10HD. I selected it only because I am shooting for a planetarium and needed the largest possible stitched resolution that I could get my hands on (minimum 8192×4096 px). But that is now possible with any 6 camera rig, since the GoPro4 (which hadn’t yet been released when I bought the rig) can shoot 4k 30fps. So if you are doing 360 video for the web or VR, then the H3PRO6 should work just fine. And you probably only need a stitched resolution between 2048×1024 and 4096×2048, which is just a best guess based on current VR/web playback tech. Plus dealing with 6 cameras instead of 10 makes your life easier and leaves less chance of things going wrong.

      Both the Freedom360 and 360Rize make well-built and solid rigs. I haven’t used the Freedom360 so I can’t really suggest which you should go with. The Freedom360 takes a little bit more time to install the cameras, since they need to be screwed in. But I think that it also has slightly less parallax error, since the cameras are quite literally as close as they can get. Yet I’m not sure it’s all that much of a parallax error difference from the 360Rize H3PRO6. Really, both camera rigs will do the job wonderfully.

      And again, I’ve only used Kolor Autopano Video for stitching, so I can’t compare my experience against anything else. I’m pleased with its 360 video stitching results and it’s a great toolset to get the best possible quality. I’ve heard good things about VideoStitch, but I’m not totally hot on their GUI, and I think that PTGui can be confusing to use. I’ve used PTGui for converting between projections of star photos and it’s not intuitive.

      Wishing you all the best!

  17. Hello Jason

    Thx for Reply..
    I have read somewhere that the H3PRO10HD rig allows your subjects to get as close as 4 feet to the cameras and is also more tightly stitched compared to the H3PRO6… I have seen a lot of videos but not all are stitched well… I don’t understand what the reason is?

    2) Do you think a rig with horizontally placed GoPros is better than the H3PRO6?

    3) What is your take on recently launched rigs? Which one would you consider buying these days?


    • 1) I would say that with the H3PRO10HD rig you should keep about 7 feet away, and more like 10 feet to avoid parallax errors and not have to do major work upon stitching. The 6 camera rigs actually tend to stitch better since there is less parallax error. And there is also more footage overlap, since they shoot with a 4:3 aspect ratio, whereas the 10 camera rig requires a 16:9 aspect ratio. But I think a lot of people shooting 360 video on the 6 camera rigs are getting subjects way too close to the rig, making parallax errors very apparent. I just don’t think they are thinking critically about the technology and its limitations…
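      To see why distance matters so much, the seam error is basically the angular disparity between neighboring lenses, which falls off with subject distance. A rough Python sketch of that geometry (a simplified model; the 6 cm baseline in the note below is an assumed figure, not a measured 360Rize spec):

```python
import math

def parallax_px(baseline_m, distance_m, equirect_width_px):
    """Approximate seam disparity, in equirectangular pixels, for a subject
    at distance_m seen by two neighboring lenses baseline_m apart.
    (A simplified geometric model of stitching parallax.)"""
    disparity_rad = 2.0 * math.atan((baseline_m / 2.0) / distance_m)
    px_per_rad = equirect_width_px / (2.0 * math.pi)
    return disparity_rad * px_per_rad
```

      With an assumed 6 cm lens spacing and an 8000 px equirect, a subject at 3 m lands roughly 25 px off between neighboring cameras, while at 1 m it is about 76 px, which is why close subjects demand major stitching work.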

      2) Interesting question on horizontally placed cameras, and a matter of opinion. But for me, yes, I feel they are a better choice, because having most of the cameras looking straight at the horizon helps hide seams. Well, not so much hide as trick the viewer’s eye: there are often so many vertical lines in shots that the seams can more easily disappear. But with 6 camera rigs, the stitch lines are always diagonal and often more obvious. Luckily you could always go with the H3PRO7 to achieve this. I was originally choosing between the H3PRO7 and the H3PRO10HD, and I ended up going with the H3PRO10HD only because of my stitched resolution requirements.

      3) I’ve yet to see a recent Kickstarter rig or new 360 rig that can compete with the 360Rize or Freedom360 resolution and quality. But the recently Kickstarted “Sphericam” is pretty impressive, and I’ll be curious to see if it stands up to the stats they are currently quoting: 4096×2048 px, 60fps, raw video, low parallax, synchronized exposure, stitched in camera. The creator has an excellent track record with a previous related project. We shall see! https://www.kickstarter.com/projects/1996234044/sphericam-2-the-4k-360o-video-camera-for-virtual-r/description

  18. Hello Jason

    Thank you very much for your Reply

    2) One more thing: as you mention the H3PRO7 rig, don’t you think that I will get only 360 by 120 degrees through this rig? But if I buy the H3PRO10HD, then because of one GoPro up and two down, I will get a full 360 by 180 degrees.

    3) Have you shot any spherical projection video? (Link please)

    • 2) As listed on the product page, the H3PRO7 captures a full spherical view (360×180). It’s not a problem to have only one GoPro facing up. The H3PRO10HD has only one GoPro facing up too…

      3) Yes I’ve rendered to spherical projection video (instead of fisheye) so that I could experience it within an Oculus Rift. But I added a logo patch to the bottom to cover up the tripod, cart, or dolly in each of the shots. http://videos.360heros.com/website/users/webplayer/videodetails.php?vid=848

      • I almost got it, but can you tell me: does the H3PRO10HD rig have three extra GoPros (one up and two down) compared to the H3PRO7 just to make 360 degree video for a planetarium dome, or just to increase the stitched resolution requirements, since the GoPro4 was not available at the time you purchased?

      • No problem. Indeed it is confusing. When I bought the H3PRO10HD, the GoPro4 didn’t exist yet. But planetarium domes are very hungry for resolution and sharpness, so the H3PRO10HD is still the best option for me. What are you hoping to use your 360 video for?

        Both the H3PRO10HD and the H3PRO7 capture the full 360 x 180 field of view. The only difference is the stitched resolution. http://www.360rize.com/wp-content/uploads/2014/05/Product-grid.jpg

        H3PRO10HD
        — 10 cameras
        — Uses 16:9 aspect ratio

        H3PRO7
        — 7 cameras
        — Uses 4:3 aspect ratio

        So the H3PRO10HD can use the 4k setting since it uses the 16:9 aspect ratio. But the H3PRO7 can only go up to the 2.7k setting since that is the best offering with 4:3 aspect ratio.

        GoPro 4 Black
        — 4k / 30fps / 16:9
        — 2.7k / 60fps / 16:9
        — 2.7k / 30fps / 4:3

        To see all the details, check out page 26 of the GoPro4 Black manual:

        Click to access UM_H4Black_ENG_REVD_WEB.pdf

    • Well I can’t make a recommendation for you, because I don’t know what you’re using the 360 video for. Context is everything with this tech, since different rigs are optimal for specific needs. For me doing planetarium work, resolution is the top concern. But for most people doing 360 video for VR/web, there are different considerations: parallax error, stereoscopic, realtime stitching, and future-proofing your work. I feel that the H3PRO7 is a solid, tried-and-tested rig. But while the Z4XB seems a bit more experimental, it is fascinating and probably the way that future 360 rigs will go. The Z4XB is quite new, seeing as it was launched on 2014-09-30.

      Though the minimal parallax error does look quite good, it’s still apparent. Check out this example and make sure to turn the YouTube quality to 1440p when watching fullscreen: https://www.youtube.com/watch?v=O7sF4BUr-bE

      Because the lenses are so small and priced as a set of 4 in the kit, it’s hard to say what the lens quality is like. But seeing as the total price is $1300 for the whole kit, that amounts to somewhere around $250 per lens (x4). So if the price matches the build quality, then it should have decent sharpness. Yet I wish the creator would post the raw un-stitched fisheye footage; then it would be very interesting to check out.

      Also to note, $1300 gets you the four 185 fisheye lenses, Z4X mount, and rugged carrying case. But it does not include the GoPro 4 cameras. Just something to be aware of.

      You should contact the creator and see if he will post onto YouTube some unprocessed 4k footage taken with one of the fisheye GoPro4 cameras. It would be easier for you to make a decision then.

      FYI: typically a cheap fisheye lens will not have sharp details even when it’s fully in focus. And the edges of the fisheye image will be even blurrier. But this all depends on the quality of the lens glass. Also with most fisheye lenses, especially cheap glass, the edges of the fisheye image are tinted blue from lens aberration. So you will lose some of the footage overlap, though not very much. But this would only be an issue at the poles… The north and south poles on this rig could possibly be problem areas. In theory the lenses shoot 185 degrees, but is all of those 185 degrees usable or not? So let’s assume we would lose 3 degrees of usable footage to lens aberration; that gives you 2 degrees of footage overlap at the poles, which is just barely enough. I think I did that thought experiment correctly. But alas, I have no experience with the rig and I’m just sharing thoughts off the top of my head.
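      That pole-overlap thought experiment is just subtraction, but writing it out makes it easy to rerun with other assumed numbers:

```python
def pole_overlap_deg(lens_fov_deg, aberration_loss_deg):
    # Trim the blue-tinted fringe off the rated field of view, then see how
    # much of what's left extends past the 180 degrees needed at the poles.
    usable = lens_fov_deg - aberration_loss_deg
    return usable - 180
```

      With the assumed 3° aberration loss, the 185° lenses leave only 2° of overlap past the 180° needed for full spherical coverage, which is cutting it close.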

      I hadn’t seen the rig before, and it’s quite interesting. If I didn’t have to shoot for a dome then I’d definitely consider it. It’s a small and efficient rig, nothing to scoff at!

  19. hello Jason

    Thank you for sharing your knowledge with me..

    first of all, let me tell you that I’m doing 360 video for VR/web and I want to get results through the GoPro Hero 4 only. I am considering the Z4X kit set for three reasons: one, minimal parallax errors; second, seamless 360 video can be taken from just 0.5m away; and third, it is horizontally placed… I don’t know the lens quality…

    Watch https://www.youtube.com/watch?v=JqZSrW_cqIY and I think you will find a raw file here: https://www.dropbox.com/s/741rqj71tto4hdc/Sony_morpheus_booth_ACG2015.zip?dl=0

    may be this helps

    • Indeed the Z4XB kit definitely makes for easier stitching. I think the Z4XB kit is ideal for VR/web use.

      So I can’t confirm this, but this fisheye lens might require the use of a 4:3 aspect ratio, which would mean that the 2.7k resolution would be your highest option. I believe that if you use a 16:9 aspect ratio then your fisheye image is then cropped. Just something to be aware of.

      I downloaded the Dropbox link and it does indeed contain a raw 2.7k MP4 from the 185° GoPro lens. And the lens quality is about what I expected. It’s not great, but not bad either. The details are fuzzy, but the fisheye image edges are better than I’d expect. Zoom to 100% and see for yourself:

      The parallax errors are small, but it’s definitely not seamless from 0.5m (1.64 feet) away. The rig looks like it has a similar spacing to human eyes, so parallax will still be visible out to 5 feet or so. Watch the area near his feet for seams: https://www.youtube.com/watch?v=O7sF4BUr-bE

      So the cameras being horizontally placed isn’t as much of a consideration, since the cameras have 185° fisheye lenses installed. Sure, it’s kind of a plus, but since there is so much overlap, your seam lines will be dictated by the stitching software.

      Also, the website vaguely mentions that it has the possibility of “semi stereoscopic 3D360”, though I’m not quite sure what that means.

      So! Overall I think the rig is pretty interesting and is worth its cost. Being able to shoot 360 video with only four cameras eases file transfers, parallax errors, and stitching.

  20. Hi Jason,
    I have shot 15 videos using the H3Pro7. I am able to stitch them perfectly and the output is convincing. I just don’t know how to edit all 15 into one video.

    I tried doing that in Final Cut Pro but the output file plays as static images, not a video. What should I use to edit all 15 videos into one video?

  21. Is there a microphone you recommend with this rig? We would like to use something that captures (fairly good) surround sound, is small enough to not be seen by the cameras, and relatively inexpensive.

  22. Now you can 3D print a GoPro rig yourself! Free models of the camera rig design available:

    They also wrote up an excellent collection of common mistakes when shooting 360 video. Very good advice.

    And they are also launching an online video course: Cinematic VR Crash Course.

  23. Hello. First, thank you for that great tutorial. And I am sorry for my bad English. I want to ask you just one question: can I use a 360 video like yours to watch in a planetarium?

  24. This is awesome! Great job putting together the details. Thank you so much. We are planning to use this to create a 360 degree video (not VR). A few quick questions:
    #1: During stitching, is there a need to upload the raw files to 360Hero servers? or can it be done locally on our machines?
    #2: What is the output format of the final video, Is it a regular MP4 file with H.264 Codec?
    #3. Does this need any special hosting environment or can it be stored on any regular streaming servers?
    #4: Does it need a special player?
    Thanks again! Appreciate your help!

    • #1: During stitching, is there a need to upload the raw files to 360Hero servers? or can it be done locally on our machines?
      No internet connection or web servers are needed. You can stitch with any computer with the Autopano Video software. After you dump all of the footage from the GoPro cameras, then you use the Autopano Video software to stitch and render all of your footage into a spherical video. The 360Hero web servers are only needed if you want to share your spherical video with others online.

      #2: What is the output format of the final video, Is it a regular MP4 file with H.264 Codec?
      That depends on what you are going to use the final video for. If you are using it for a production (dome or VR) and want to do further color processing, editing different shots together, or that type of further work, then you will need to render to frames to retain the best quality throughout the editing workflow. But if it’s just one shot that you are sharing directly to the web (such as the 360Hero web servers), then you could render straight to an MP4 file.

      Also your Projection selection will depend on how you are using the footage. If you are screening your footage on a planetarium dome, then you need to use the Fisheye projection. Or if you are viewing your footage in a VR headset, then you need to use the Spherical projection.

      #3. Does this need any special hosting environment or can it be stored on any regular streaming servers?
      Yes the streaming service needs the ability to warp the spherical footage into an interactive experience. For example here are a few such 360 video web services:

      #4: Does it need a special player?
      If you want to interactively preview the spherical footage on your own computer, without using a streaming web service, then the free Kolor Eyes software easily enables this possibility. http://www.kolor.com/kolor-eyes/

  25. Thanks Jason for sharing the details so quickly. Now I am starting to understand it better. Our streaming server supports straight MP4 files. Let’s say, after stitching, we produce an MP4 file and host it on our servers. Is it the streaming server that should have the capability to give the 360 degree immersive experience, or the player embedded inside a web page? I see some of the players have a kind of navigator in the top-left corner (in the case of YouTube), and some have a button on the play bar to change the view (Rectilinear, Fisheye, etc.).

    • I’m not sure if the Phantom 3 drone can securely support the weight of most 360 video rigs. But the Ricoh Theta S is definitely light enough…

      Here is a great experiment testing the Phantom 3 power consumption (watts) to estimate flight time (minutes) against the payload (grams).

  26. Great post. Thanks for all the detail. I have a few questions about 360 in general and whether Kolor software is a good solution. I have several events that I would like to shoot in 360. But I’m not sure what I need to make it happen. A lot of the 360 videos I have seen are very distorted (super wide angle or even fisheye). Which for certain projects is great. That’s not what I’m looking for though. The events I need this for need to work with something like Gear VR and I really need the viewer to feel like they are there. Someone sent me this video: http://www.roadtovr.com/first-8k-360-youtube-video-features-striking-dubai-time-lapse/. I don’t need 8k. But I like how the video is not distorted. You can look around without any (or very minimal) distortion/fisheye. What’s the best way to accomplish this (both software and camera set up)? Thanks for any suggestions you might have.

  27. Really great article this, thanks! I had one quick question though in terms of syncing the multiple cameras. I get how motion sync and the wifi remote can be used to sync the start time of the cameras, but don’t you also need some frame-syncing or genlock to ensure the cameras don’t then drift out of sync after a few minutes? At the moment I can’t see how this is being done, and I’d imagine this is critical, especially for the 3D versions of these rigs?

    • Great question. Indeed the wifi remote is used to roughly start the cameras at the same moment. But then the twist motion sync is used to precisely sync the footage in post-production. But you’re correct: after starting the cameras recording with the wifi remote, there is nothing keeping them from slowly drifting apart.

      But in practice, I haven’t experienced any noticeable drift between the cameras. And I’ve done shots as long as 30 minutes. Sure I’d bet they drift by a few frames. But you can set your sync point in the Autopano Video timeline, so if you’re using only a select edit from a longer shot, then you can get that moment synced perfectly. But even still, that feature didn’t exist when I stitched a 30 minute shot and I still didn’t notice any drift. Yet my camera movement was very slow, so perhaps it was impossible to notice. Really you are currently constrained by the GoPro battery and how long it can continually record (1h 32m). It was something that I was worried about early on but have since ignored because it’s not a problem. So consider the context of how you’ll be shooting: Long shots? Fast camera motion?
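      For a rough upper bound on that drift, you can model each camera as a free-running clock with some crystal tolerance. The ±50 ppm figure in the note below is an assumed consumer-grade spec, not something GoPro publishes:

```python
def drift_frames(duration_s, fps, clock_ppm):
    """Worst-case frame drift between two free-running cameras whose clocks
    each err by up to clock_ppm parts per million (assumed tolerance)."""
    # Both clocks may err in opposite directions, hence the factor of two.
    drift_s = duration_s * 2.0 * clock_ppm / 1e6
    return drift_s * fps
```

      At an assumed ±50 ppm, a 30 minute (1800 s) shot at 30fps drifts by at most about 5.4 frames, which lines up with the “few frames” guess and explains why slow-moving shots hide it so well.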

      Another great point you made is that genlock is indeed critical for stereoscopic (3D) camera rigs. I can’t imagine what it’s like to experience footage where the left and right eyes are seeing footage that has drifted by a few frames. I bet it would cause some major brain shear. Currently the options for stereoscopic 360 video rigs leave much to be desired. (Example problems: parallax error differences between eyes, exposure differences between eyes, no genlock, getting complete stereo coverage without ignoring the poles, and other such big headaches.)

      But there is a hacky solution for getting all the cameras to truly genlock by using the MewPro boards. But on top of all the problems that GoPro cameras can experience, this is an added overhead I’d rather not worry about.

      Also just recently announced is the Bullet360 control system from 360Rize. But there isn’t any mention of genlock-ing…

      I really enjoyed exploring your website. Your Augmented Electric Guitar was particularly interesting and reminded me of some fascinating Pure Data guitar experiments that you might dig. In my free time I love tinkering with Pure Data to create simple little musical ecosystems. On another note, I’ve also experimented with melody separation through the use of a Python script. Here is one such rendered example (with comparisons). I love making music but so rarely have time to do so.

      • Great, that’s good to hear, thanks Jason. I’ll have fairly static shots without too much motion, and I can keep shot lengths fairly short as well, so that should help. But yeah, the real question is how much of an impact that’ll make on the 3D side of things. I’m guessing the Bullet360 system is intended to address that particular issue. Unfortunately this is for a concert happening in April, so I can’t really afford to wait too long, and I may just need to take a chance on that stereoscopic 3D rig and see how we get on. I’m working with a 3D video expert who reckons he can get a 2D stream from the rig if the sync issues are too significant. It would be great to hear about any practical experience with this though, such as with the 3DPRO12H for example.
        I like your website too, especially your DIY foot pedal. I’ve made my own footswitches alright, but not an expression pedal as of yet. Melodic processing and separation is not something I’ve done a lot of work with, but you might be interested in the work of a friend of mine, Ricky Graham, who has done a lot of work with hexaphonic guitars and melodic analysis and generation.

      • I’ve never shot with a stereoscopic 360 video rig before. But in essence it captures a 360 video for each of the left and right eyes. So if you have sync issues in post-production, which I doubt, you could fall back to a single monoscopic 360 video. The rig itself is larger than your typical 360 rig, though, meaning the cameras are farther apart, so parallax errors will be more noticeable with the 3DPRO12H rig. Any object within ~10 feet will show visible seams. The 360Rize website says 4.5 feet, but I don’t believe that at all.

        The 360Rize 3DPRO12H will definitely offer a higher resolution after stitching (using 2.7K 4:3 camera settings, it stitches to an 8000×4000 spherical, per eye), but there is another option too. The iZugar Z6X3D uses only 6 GoPro cameras and still captures stereoscopic 360 video (using 2.7K 4:3 camera settings: a 4000×2000 spherical, per eye). It achieves this by fitting a custom fisheye lens to each camera.

        Whatever rig you decide on, you should pull the trigger soon. There are plenty of quirks in shooting 360 video and you’ll want a bunch of time to test the rig. The last thing you want is to run into problems while shooting on assignment in the field!

        You can download a fully functional demo of Autopano Video and test out stitching stereoscopic 360 videos. Below is a vague workflow to follow when stitching stereoscopic 360 video.

      • That’s great Jason, very useful indeed. I was aware of the iZugar and I may consider it, although this is for an outdoor concert recording, so most things of interest (the musicians) will be somewhat distant. Might that require a modification to the lens setup? I’m curious also as to how they manage the sync issue; is it just that, with fewer cameras to stitch, it’s less of a problem?
        Thanks Jason. Believe me I’m feeling the clock ticking on this one, as I’m also writing the music! 🙂

      • Honestly, shooting stereoscopic 360 video is a total nightmare at the moment. In my opinion the many associated problems really take away from the immersive experience. (Example problems: parallax error differences between the eyes, exposure differences between the eyes, no genlock, and getting complete stereo coverage without ignoring the poles.) Many in the 360 video community agree that the tech isn’t polished yet, and not just because monoscopic 360 video is the easier approach to focus on.

        The Z6X3D stereoscopic fisheye rig is interesting though. The sync issue is still a problem, but there are fewer cameras that can drift from each other. Also, with the custom fisheye lenses, each camera has a much higher FOV and captures more of the scene. This allows for more footage overlap, which enables better blending in post-production. And since there are fewer cameras, there are fewer seams for the viewer to potentially notice, which is essential.
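        To put a rough number on why bigger rigs show seams closer in (a hedged sketch; the 7 cm baseline is my own guess for a large rig, not a 360Rize measurement): the angular disagreement between two adjacent cameras is approximately the inter-lens baseline divided by the subject distance.

```python
import math

# Small-angle parallax sketch: error (radians) ~= baseline / distance.
# The 7 cm baseline below is an illustrative guess for a large
# stereoscopic rig, not a published spec for any particular product.

def parallax_error_degrees(baseline_m, subject_distance_m):
    return math.degrees(baseline_m / subject_distance_m)

# At ~10 feet (about 3 m) with a ~7 cm baseline:
error = parallax_error_degrees(0.07, 3.0)
print(f"{error:.2f} degrees")  # 1.34 degrees
```

        Over a degree of disagreement between adjacent views is plenty for a seam to show, which is why I’m skeptical of the 4.5-foot claim.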

  28. Hey Jason – thanks for all of these tips! One tip for you — put some Amazon affiliate links on those products at the top. Just ordered nearly $8K of gear following your lead — get that 5% commission! 😉

  29. Hi from Turkey. Great tips, thank you for your posts 🙂

    I have the 360Rize “H6” and “Pro10HD” rigs and the VideoStitch software. I stitched and rendered 6 GoPros (1440p, 80fps) at output sizes of 4K and also 5K.

    I stitched 10 GoPros (2.7K 4:3, 30fps) and tried output sizes of 1024 / 2K / 4K / 8K, but it is unable to render. It says:
    Process is starting.
    Process is now running.
    Process started.
    Only one GPU found, ignoring -d.
    Process crashed.
    Process isn’t running.
    Process didn’t exit normally. Exit code -1073740791

    If I open the batch stitcher and close the VideoStitch app, then try to render, the batch stitcher shuts itself down and says: Please send us a report to solve the problem.

    My configuration:
    i7-4930K CPU @ 3.40GHz
    32GB RAM
    GeForce GTX 780, 3GB GDDR5
    Windows 64-bit

    Do I need more GPU RAM? Which graphics card should I buy? Is a 6GB GTX Titan enough for me to render 10 cameras at 10000×5000? Please help 🙂

  30. Pingback: After the Concert – Enda Bates

  31. I am using the same 10-camera setup from Hero 360 with GoPro HERO4 Blacks. Autopano Video says it has an accurate sync on some of my clips when it clearly does not. Are there currently any ways to solve this issue in the software? This is causing really bad stitch issues for me.

    • I’ve found the auto sync analysis in Autopano Video to be either spot-on or a complete failure. So I always check the quality of the sync regardless of whether it reports an ‘accurate sync’. Most of the time it gets within a few frames, and then some manual intervention is required, bumping the frames up or down for specific channels. But there have been instances where the auto sync was completely unreliable, and in those cases a different approach is required. I suggest loading all of the footage channels into Adobe Premiere and manually syncing the timelines using the audio waveforms. Then write down the frame offsets for each channel and plug them back into Autopano Video. It can be tricky, but it’s well worth the trouble. Good luck!
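      If you’d rather automate the waveform comparison than eyeball it in Premiere, the same idea can be sketched in a few lines of Python. This is a minimal sketch, assuming you’ve already extracted mono audio from each clip (e.g. with ffmpeg); the sample rate and fps defaults are illustrative:

```python
import numpy as np

# Estimate the frame offset between two cameras by cross-correlating
# their audio waveforms. Returns the number of frames by which
# track_b starts later than track_a.

def frame_offset(track_a, track_b, sample_rate=48000, fps=30):
    n = min(len(track_a), len(track_b))
    a = track_a[:n] - track_a[:n].mean()
    b = track_b[:n] - track_b[:n].mean()
    corr = np.correlate(a, b, mode="full")
    # In 'full' mode, index (n - 1) corresponds to zero lag.
    lag_samples = (n - 1) - np.argmax(corr)
    return lag_samples * fps / sample_rate

# Synthetic check: delay a noise burst by 0.1 s (3 frames at 30 fps).
rng = np.random.default_rng(0)
sig = rng.standard_normal(8000)
delayed = np.concatenate([np.zeros(800), sig])[:8000]
print(frame_offset(sig, delayed, sample_rate=8000))  # 3.0
```

      Run each camera’s track against a reference camera, round the results to whole frames, and those are the offsets to plug into Autopano Video.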
