We announced Jump in 2015 to simplify VR video production from capture to playback. High-quality VR cameras make capture easier, and Jump Assembler makes automated stitching faster, more accessible, and more affordable for VR creators. Using sophisticated computer vision algorithms and the computing power of Google’s data centers, Jump Assembler produces clean, realistic stitching, resulting in immersive 3D 360° video.
Stitching, then and now
Today, we’re introducing an option in Jump Assembler to use a new, high-quality stitching algorithm based on multi-view stereo. This algorithm produces the same seamless 3D panoramas as our standard algorithm (which will continue to be available), but it leaves fewer artifacts in scenes with complex layers and repeated patterns. It also produces depth maps with much cleaner object boundaries, which is useful for VFX.
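This post doesn’t detail how the new algorithm works internally, but the basic idea behind stereo depth estimation is easy to demonstrate. Below is a minimal two-view sketch using OpenCV’s semi-global block matcher; Jump’s multi-view stereo matches across many cameras at once and is far more sophisticated, so treat this only as an illustration of what a per-pixel depth (disparity) map is. The file names and matcher parameters are placeholders, not Jump’s actual settings.

```python
import cv2

# Two neighboring camera views (placeholder file names).
left = cv2.imread("cam_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("cam_right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching: for each pixel in `left`, search along the
# epipolar line in `right` for the best match among disparity hypotheses.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,  # search range; must be a multiple of 16
    blockSize=7)

# compute() returns fixed-point disparities scaled by 16.
disparity = matcher.compute(left, right).astype("float32") / 16.0

# Disparity is inversely related to depth: nearby objects shift more
# between the two views than distant ones.
vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("disparity.png", vis.astype("uint8"))
```

The clean object boundaries mentioned above matter precisely here: any blur in a map like this bleeds foreground depth onto the background, which shows up as halos when compositing VFX elements into the scene.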
Let’s first take a look at how our standard algorithm works. It’s based on the concept of optical flow, which matches pixels in one image to those in another. Once matched, you can tell how pixels “moved,” or “flowed,” from one image to the next; and once every pixel is matched, you can interpolate in-between views by shifting each pixel part of the way along its flow. This means you can “fill in the gaps” between the cameras on the rig, so that, when stitched together, the result is a seamless, coherent 360° panorama.
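To make that concrete, here is a minimal sketch in Python with OpenCV and NumPy (not Jump’s production pipeline): compute dense optical flow between two neighboring camera images, then synthesize an in-between view by shifting every pixel a fraction of the way along its flow vector. The file names and Farneback parameters are illustrative placeholders.

```python
import cv2
import numpy as np

# Two neighboring camera images from the rig (placeholder file names).
left = cv2.imread("cam_left.png")
right = cv2.imread("cam_right.png")

# Dense optical flow from left to right: for every pixel, flow[y, x] is
# the (dx, dy) displacement to its matching pixel in the right image.
gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
flow = cv2.calcOpticalFlowFarneback(
    gray_l, gray_r, None,
    pyr_scale=0.5, levels=4, winsize=21,
    iterations=3, poly_n=7, poly_sigma=1.5, flags=0)

# Interpolate a virtual view a fraction `alpha` of the way between the
# cameras by shifting each pixel alpha * its flow vector.
alpha = 0.5  # 0.0 = left view, 1.0 = right view
h, w = gray_l.shape
grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

# Backward warp, approximating the flow at the virtual view by the flow
# at the left view: middle(q) ~= left(q - alpha * flow(q)).
map_x = (grid_x - alpha * flow[..., 0]).astype(np.float32)
map_y = (grid_y - alpha * flow[..., 1]).astype(np.float32)
middle = cv2.remap(left, map_x, map_y, interpolation=cv2.INTER_LINEAR)

cv2.imwrite("interpolated_view.png", middle)
```

Setting `alpha = 0.5` renders the viewpoint halfway between the two cameras; a stitcher sweeps `alpha` across the gap between each pair of lenses on the rig to fill in all the missing views before blending them into the final panorama.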