Google to Present End-to-End System for Immersive Light Field Video

MOUNTAIN VIEW, Calif., June 22, 2020 — Google will demonstrate a system for capturing, reconstructing, compressing, and rendering high-quality immersive light field video at the SIGGRAPH 2020 virtual conference, at bandwidths low enough to be streamed over regular Wi-Fi. The system records wide field-of-view scenes that can be played back with the freedom to move around within the video after it has been captured, and it handles challenging content such as reflective surfaces.

Google will demonstrate a practical system for light field video for immersive reality at SIGGRAPH 2020. Courtesy of Sara Diamond.

To record immersive light fields, the researchers use a custom, low-cost array of 46 time-synchronized cameras mounted to a lightweight acrylic dome. From this data they produce six-degree-of-freedom (6DOF) volumetric videos with a wide 80-cm viewing baseline, an angular resolution of 10 pixels per degree, and a wide field of view, at 30-fps video frame rates.

Even though the cameras are placed 18 cm apart on average, the system can reconstruct objects as close as 20 cm to the camera rig. DeepView, a machine learning algorithm developed by the researchers, is used to combine the video streams from each camera into a single 3D representation of the scene being recorded.

For this system, the researchers replaced DeepView’s underlying multiplane image (MPI) scene representation with a collection of spherical shells that are better suited for representing panoramic light field content. The data is further processed to reduce the large number of shell layers to a small, fixed number of RGBA+depth layers without significant loss in visual quality.
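The idea of collapsing many shell layers into a small, fixed number of RGBA+depth layers can be illustrated with a simple alpha-compositing merge. This is a generic sketch, not the paper's actual reduction procedure; `reduce_layers`, the adjacent-grouping scheme, and the depth averaging are all assumptions made for illustration:

```python
import numpy as np

def reduce_layers(shells, depths, num_out):
    """Illustrative sketch: collapse many RGBA shell layers into `num_out`
    RGBA+depth layers by grouping adjacent shells and alpha-compositing
    each group back to front (not the paper's exact procedure)."""
    groups = np.array_split(np.arange(len(shells)), num_out)
    merged = []
    for idx in groups:
        h, w, _ = shells[0].shape
        rgb = np.zeros((h, w, 3))
        alpha = np.zeros((h, w, 1))
        for i in idx:                       # back to front within the group
            a = shells[i][..., 3:4]
            rgb = shells[i][..., :3] * a + rgb * (1.0 - a)
            alpha = a + alpha * (1.0 - a)   # accumulated opacity stays in [0, 1]
        depth = float(np.mean([depths[i] for i in idx]))  # representative depth
        merged.append((np.concatenate([rgb, alpha], axis=-1), depth))
    return merged
```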

The system’s layered mesh representation — a series of concentric layers with semitransparent textures that are rendered from back to front — brings the scene vividly and realistically to life and resolves the issue of synthesizing viewpoints that were never captured by the cameras in the first place. This will enable users to experience a natural range of head movement as they explore light field video content.
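Blending semitransparent layers from back to front is, at its core, the standard "over" compositing operator. A minimal sketch of that operation follows (a generic CPU illustration with straight alpha, not Google's renderer, which draws the layers as textured meshes on the GPU):

```python
import numpy as np

def composite_back_to_front(layers):
    """Blend RGBA layers ordered farthest-to-nearest using the
    standard 'over' operator (straight, non-premultiplied alpha)."""
    h, w, _ = layers[0].shape
    out = np.zeros((h, w, 3))              # accumulated RGB
    for layer in layers:                   # back (far) to front (near)
        rgb, a = layer[..., :3], layer[..., 3:4]
        out = rgb * a + out * (1.0 - a)    # nearer layers occlude farther ones
    return out

# Two 1x1-pixel layers: opaque red behind, half-transparent blue in front.
far  = np.array([[[1.0, 0.0, 0.0, 1.0]]])
near = np.array([[[0.0, 0.0, 1.0, 0.5]]])
print(composite_back_to_front([far, near])[0, 0])  # half blue over red: (0.5, 0, 0.5)
```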

The resulting RGB, alpha, and depth channels in these layers are then compressed using conventional texture atlasing and video compression techniques. The final, compressed representation is lightweight and can be rendered on mobile VR/AR platforms or in a web browser. “Users will be able to stream this light field video content over a typical, fast-speed internet connection,” research scientist Michael Broxton said. “Overcoming this problem opens up this technology to a much wider audience.” 
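Texture atlasing itself is a generic technique: the per-layer images are tiled into one large frame so an ordinary video codec can compress the whole set together. A minimal sketch is below (the simple grid layout and `pack_atlas` are illustrative assumptions; production atlases pack shaped charts far more tightly):

```python
import numpy as np

def pack_atlas(layers):
    """Illustrative: tile equally sized per-layer images into a single
    atlas frame suitable for feeding to a standard video encoder."""
    n = len(layers)
    cols = int(np.ceil(np.sqrt(n)))        # near-square grid of tiles
    rows = int(np.ceil(n / cols))
    h, w, c = layers[0].shape
    atlas = np.zeros((rows * h, cols * w, c), dtype=layers[0].dtype)
    for i, img in enumerate(layers):
        r, col = divmod(i, cols)           # row-major tile placement
        atlas[r * h:(r + 1) * h, col * w:(col + 1) * w] = img
    return atlas
```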

The emerging field of immersive AR/VR promises to give people an authentic experience in a simulated environment. Light field videos give users a more dynamic virtual environment with panoramic views of scenes that span over 180°. They allow users to look around corners and enjoy a greater sense of depth while in the virtual world. 

“We’re making this technology practical, bringing us closer to delivering a truly immersive experience to more consumer devices,” Broxton said. “Photos and videos play a huge role in our day-to-day experience on mobile devices, and we are hoping that someday immersive light field images and videos will play an equally important role in future AR and VR platforms.”

The research will be presented at the SIGGRAPH 2020 virtual conference. A copy of the paper, “Immersive Light Field Video with a Layered Mesh Representation,” and a video of the technology are available for download.

 


Photonics.com
Jun 2020

©2020 Photonics Media, 100 West St., Pittsfield, MA, 01201 USA, [email protected]
