Lytro’s Immerge aims to make virtual reality video more realistic
By David Cardinal
When light field pioneer Lytro tipped that it was going to
switch gears from still cameras to VR video creation, I expected
something along the lines of Samsung’s Project Beyond. Instead, it has
announced that it is building what is likely the world’s most expensive,
and most sophisticated, end-to-end VR video capture, storage, and
processing system.
Lytro’s new Immerge has the ambitious goal of capturing all
the light rays visible from its camera location — including both their
color and direction — essentially an entire light field entering a
spherical volume somewhat larger than a basketball. This is so much data that the Immerge comes with its own custom server, which must move with the camera rig on a tether and can store roughly an hour of unprocessed content.
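To make "color and direction" concrete: each pixel on each of the rig's cameras effectively records one ray. A minimal sketch of what one such sample might look like is below (in Python); the field names and the roughly 0.15 m sphere radius are illustrative assumptions, not Lytro's published format.

from dataclasses import dataclass
from typing import List, Tuple

# Illustrative only: Lytro has not published its internal data format.
# The radius is a guess based on "somewhat larger than a basketball."
CAPTURE_SPHERE_RADIUS_M = 0.15

@dataclass
class RaySample:
    origin: Tuple[float, float, float]     # point on the capture sphere (meters)
    direction: Tuple[float, float, float]  # unit vector pointing from the sphere out into the scene
    rgb: Tuple[float, float, float]        # color/radiance carried by the ray
    timestamp: float                       # seconds, since this is video rather than stills

# A single frame of the light field is then an enormous collection of such
# samples: one per sensor pixel, per camera in the rig.
frame: List[RaySample] = []

Stored this way, per-frame data volume scales with the pixel count of every camera in the rig, which is why a dedicated capture server has to travel with it.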
The camera rig itself is designed to be controlled remotely
while mounted on a tripod or dolly. Because it captures light from many
positions, and every angle, Immerge can simulate what a viewer would see
as they look around, and even move around, within a volume of about one cubic meter, giving the viewer six degrees of freedom. The design provides a more realistic viewing experience than other VR capture rigs by allowing
for visual parallax, maintaining stereo perception even after head
movement, and eliminating stitch lines. The system can even adapt
to each user’s personal inter-ocular distance.
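That re-synthesis step is the heart of the claim: because rays are stored rather than two fixed, stitched panoramas, each eye's view can be regenerated for wherever the head actually is. A rough sketch of the idea follows; sample_light_field is a hypothetical lookup standing in for whatever interpolation Lytro's engine performs, and the 64 mm default inter-ocular distance is simply a common average.

import numpy as np

def sample_light_field(position, direction):
    """Hypothetical lookup: return the RGB radiance of the captured ray
    passing through `position` along `direction`. In a real system this
    would interpolate between the rig's many camera images."""
    raise NotImplementedError

def render_eye(eye_position, view_directions):
    # One light-field query per display pixel for this eye.
    return np.array([sample_light_field(eye_position, d) for d in view_directions])

def render_stereo(head_position, head_right, view_directions, interocular_m=0.064):
    """Re-render both eyes for the viewer's current head pose. Each eye gets
    its own query position, so parallax and stereo survive head translation,
    and the inter-ocular distance is just a parameter."""
    half = 0.5 * interocular_m
    left = render_eye(head_position - half * head_right, view_directions)
    right = render_eye(head_position + half * head_right, view_directions)
    return left, right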
Immerge is not for the faint of heart
At
an expected six-figure sales price, Immerge is intended for high-end
production companies, like new Lytro partners Vrse, WEVR, and Felix and
Paul. Vrse CTO Aaron Koblin sees Immerge as a way to mix “CG content with captured content in a convincingly real way that doesn’t cost five million dollars.” WEVR’s Anthony Batt echoes the sentiment.
Clearly both companies believe that adding the ability to
move around a bit in live-action video will make it much easier to
create composite VR experiences. Specifically, Lytro’s server tools are
being designed to allow integration with existing 3D content creation
tools like Nuke by projecting the captured Light Field Volume into
virtual space — where it can be combined with computer-generated
elements.
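Lytro hasn't said exactly how that projection works, but the general idea of placing captured rays into the same coordinate space as CG elements can be sketched as follows; the per-ray depth estimate and the rig-to-world transform are assumptions for illustration, carried over from the RaySample sketch above.

import numpy as np

def ray_to_world_point(origin, direction, depth, rig_to_world):
    """Place one captured ray sample into the shared virtual scene.

    origin, direction : the ray in the rig's local frame (see RaySample above)
    depth             : estimated distance to whatever surface the ray hit (meters)
    rig_to_world      : 4x4 transform locating the capture rig in the CG scene
    """
    p_local = np.asarray(origin) + depth * np.asarray(direction)
    p_world = rig_to_world @ np.append(p_local, 1.0)
    return p_world[:3]

# Once every sample has a world-space position and color, computer-generated
# elements can be rendered into the same space and occlusions resolved there,
# which is what makes compositing in a tool like Nuke tractable.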
Like previous Lytro efforts,
Immerge content will require a run-time component for full-fidelity
rendering. Lytro describes this as its Light Field video playback
engine. As content is increasingly streamed from the cloud, this may not
be as big a stumbling block as it was for Lytro’s still image cameras,
but it does limit the applications of the new system. The rendered
content will play on any of the major VR headsets, including the Oculus Rift, Sony PlayStation VR, Microsoft HoloLens, HTC Vive, and others. It can, of course, be rendered into a traditional stereo VR experience for offline viewing.
Lytro
expects Immerge to be available for purchase or rent in the first
quarter of 2016. Components of the system will include the camera rig,
capture server, extra storage units, portable rendering farm, VFX
plug-ins, an operator, and streaming services. Without a doubt, like
Lytro’s original camera and newer Illum, Immerge breaks new ground and
will help early adopters create some amazing content. Whether the market
is big enough to help justify the additional $50 million Lytro raised
to fund this effort remains to be seen.