
VRSUN

Hot Virtual Reality News


This Pioneering Fractal Artist is Returning to VR with a New Album Soon

January 19, 2022 From roadtovr

It’s been a while since we last wrote about Julius Horsthuis, a visual effects artist who authored a series of fractal 360 videos for VR which are simply mind-blowing—both now and back in the good ol’ days of the Oculus Rift DK2 when we first experienced them. Now the Dutch artist has announced that a new album is coming to VR headsets sometime this year, one slated to throw us head-first back into his contemplative fractal worlds.

Called ‘Recombination’, the VR album is set to feature fractal visuals created in Mandelbulb3D—par for the course with Horsthuis’ pareidolia-inducing creations that recall alien worlds, life, the universe, and everything.

The experience isn’t real-time rendered—that simply wouldn’t be possible given the level of detail. Instead, it’s been pre-rendered and captured in 180-degree, stereoscopic 3D video, which features 4,096 × 4,096 pixels per eye at 60fps.

Back when Horsthuis announced the project in December, he said it was taking “a hell of a long time to render,” which is no surprise considering the level of detail and overall length. Recombination is set to be his longest VR experience to date, clocking in at “well over 30 minutes.”
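For a rough sense of the data involved, here’s a back-of-envelope calculation based on the figures above. The uncompressed 24-bit RGB assumption is ours purely for illustration; the delivered video will of course be heavily compressed.

# Rough math for Recombination's stated format: 4,096 x 4,096 per eye,
# stereo, 60fps, at least 30 minutes long. Uncompressed 24-bit RGB is
# assumed only to illustrate scale.
width, height = 4096, 4096      # pixels per eye
eyes = 2
fps = 60
bytes_per_pixel = 3             # 24-bit RGB
duration_s = 30 * 60            # lower bound: "well over 30 minutes"

bytes_per_frame = width * height * eyes * bytes_per_pixel
bytes_per_second = bytes_per_frame * fps
total_bytes = bytes_per_second * duration_s

print(f"Raw throughput: {bytes_per_second / 1e9:.1f} GB/s "
      f"({bytes_per_second * 8 / 1e9:.0f} Gbps)")
print(f"Raw size for 30 minutes: {total_bytes / 1e12:.1f} TB")
# Roughly 6.0 GB/s (48 Gbps) and about 10.9 TB before compression.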

But what does it all mean? Horsthuis says Recombination explores “various themes in Physics, Math and Biology.” We’ll just have to wait to find out more than that, it seems.

Horsthuis published his first fractal video back in December 2013, though we’re more familiar with Foreign Nature from 2015, which was his first 360 fractal video made especially for VR headsets. Since then, Horsthuis has published nine VR shorts, ranging from five to ten minutes.

There’s no firm release date yet for Recombination—just “2022”. In the meantime we’ll be keeping our eyes glued to the artist’s YouTube channel and Twitter.

Filed Under: fractal art, Julius Horsthuis, mandelbulb3d, News, recombination vr, VR Art, vr fractal art, vr video

Canon Introduces 180° Stereoscopic Lens to Support a “bright future for VR content creation”

October 6, 2021 From roadtovr

Canon, one of the world’s leading camera makers, today introduced a new dual-optic lens which captures 180° stereoscopic views through a single sensor on the company’s high-end EOS R5 camera.

Canon today announced what it calls the EOS VR System which includes its new dual-optic camera lens, new firmware for its EOS R5 camera to support immersive capture, and new software for handling post-processing.

The new RF5.2mm F2.8 L Dual Fisheye lens is interesting because it captures both views onto the single image sensor in the Canon EOS R5 camera. Although this divides the resolution (because both views are captured in the same frame), it also stands to simplify the process of capturing 180° imagery because both views will necessarily have matching time sync, alignment, color, calibration, and focus.


If any of these factors aren’t matched they can have a negative impact on the viewing experience because it’s uncomfortable for the eyes to reconcile the discrepancies between each view. Capturing this way also means that the output is a single file for both eyes, which can streamline post-production compared to cameras which capture each eye’s view in a separate file (or many views which need to be stitched together).
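As a rough illustration of the resolution trade-off mentioned above, here’s a sketch estimating per-eye angular resolution when two 180° fisheye image circles share a single frame. The 8,192 × 5,464 frame size is our assumption based on the EOS R5’s published still-image resolution, not a figure from Canon’s VR documentation.

# Estimate of per-eye detail when two fisheye image circles sit side by
# side on one sensor. The frame size below is an assumption based on the
# EOS R5's still-image resolution; Canon hasn't published the exact
# image-circle size.
sensor_width_px = 8192
sensor_height_px = 5464
fov_deg = 180                    # each fisheye covers a hemisphere

# Side by side, each image circle can be at most half the sensor width
# across (and no taller than the sensor height).
circle_diameter_px = min(sensor_width_px // 2, sensor_height_px)

pixels_per_degree = circle_diameter_px / fov_deg
print(f"Image circle: ~{circle_diameter_px}px across")
print(f"Angular resolution: ~{pixels_per_degree:.1f} px/degree per eye")
# About a 4,096px circle and ~22.8 px/degree per eye, versus ~45.5
# px/degree if a full sensor were dedicated to each eye.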


The lens has an aperture range of f/2.8 to f/16 and can be focused as close as 8 inches. The distance between the lenses is fixed at 60mm to be close to the typical human IPD. The company plans to update its Canon Connect and EOS Utility programs to offer a remote live-view through the lens for monitoring and shooting at a distance. Canon says the lens will be available in late December and priced at $2,000.

Around that time the company will also release two pieces of subscription-based software, an EOS VR Utility and EOS VR plug-in for Adobe Premiere Pro.

The EOS VR Utility will be able to convert the captured files from dual-fisheye to an equirectangular projection (which is supported by most immersive video players), as well as make “quick edits” and choose the resolution and file format before exporting.
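Canon hasn’t published the details of that conversion, but the sketch below shows the kind of remapping involved, assuming an equidistant fisheye model and using OpenCV for the pixel lookup. It illustrates the general dual-fisheye-to-equirectangular process, not the EOS VR Utility’s actual implementation.

import numpy as np
import cv2  # OpenCV, used here only for the final pixel remapping

def fisheye_to_equirect(frame, eye="left", out_w=2048, out_h=2048):
    """Remap one eye of a side-by-side dual-fisheye frame to a VR180
    equirectangular image, assuming an equidistant fisheye model."""
    h, w = frame.shape[:2]
    half_w = w // 2
    x0 = 0 if eye == "left" else half_w
    src = np.ascontiguousarray(frame[:, x0:x0 + half_w])
    cx, cy = half_w / 2.0, h / 2.0
    radius = min(half_w, h) / 2.0              # 180-degree image circle

    # Output grid: longitude and latitude both span -90 to +90 degrees.
    xs, ys = np.meshgrid(np.arange(out_w), np.arange(out_h))
    lon = (xs / out_w - 0.5) * np.pi
    lat = (0.5 - ys / out_h) * np.pi

    # Ray direction for each output pixel (camera looks along +z).
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = np.cos(lat) * np.cos(lon)

    theta = np.arccos(np.clip(dz, -1.0, 1.0))  # angle from optical axis
    phi = np.arctan2(dy, dx)
    r = radius * theta / (np.pi / 2)           # equidistant projection

    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy - r * np.sin(phi)).astype(np.float32)
    return cv2.remap(src, map_x, map_y, interpolation=cv2.INTER_LINEAR)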

The EOS VR plug-in for Premiere Pro will enable equirectangular conversion right inside of Premiere and allow the footage to be easily managed within other Adobe Creative Cloud apps.

The company has yet to announce pricing for either utility.


Canon calls the new lens “an important milestone in our company’s rich history as a lens manufacturer,” and says it “welcomes a bright future for VR content creation.”

“This new RF lens produces a stunning 8K virtual reality image and sets itself apart through its simplified workflow. Our goal is to make immersive storytelling more accessible for all,” says Tatsuro “Tony” Kano, EVP and GM of Canon Imaging Technologies & Communications Group.

Live-action immersive video was thought by many to be the next generation of filmmaking in the early days of modern VR, but it hasn’t seen nearly as much traction as pre-rendered CGI or real-time rendered content. Complicated immersive camera systems surely didn’t help, and to that end, Canon hopes its new lens and software tools can make a difference.

However, most live-action immersive video also lacks volumetric capture, which means the view can rotate (3DOF) but can’t also move through 3D space (6DOF); that tends to be less comfortable and immersive than VR content which allows full positional movement. Several companies have been working toward volumetric live-action capture, but some key players—like Lytro and NextVR—ultimately didn’t survive and were sold off before finding a market fit.

Whether or not simplified capture and production pipelines are enough to reboot 3DOF live-action immersive content remains to be seen.

In addition to its new lens, Canon has also experimented with XR headsets, most recently the MREAL S1 which it showed off earlier this year.

Filed Under: 180 Video, 360 Video, Canon, canon 180 lens, canon dual fisheye, canon dual optic, canon vr lens, News, VR Camera, vr video

Stunning View Synthesis Algorithm Could Have Huge Implications for VR Capture

August 19, 2021 From roadtovr

As far as live-action VR video is concerned, volumetric video is the gold standard for immersion. And for static scene capture, the same holds true for photogrammetry. But both methods have limitations that detract from realism, especially when it comes to ‘view-dependent’ effects like specular highlights and lensing through translucent objects. Research from Thailand’s Vidyasirimedhi Institute of Science and Technology shows a stunning view synthesis algorithm that significantly boosts realism by handling such lighting effects accurately.

Researchers from the Vidyasirimedhi Institute of Science and Technology in Rayong, Thailand, published work earlier this year on a real-time view synthesis algorithm called NeX. Its goal is to use just a handful of input images from a scene to synthesize new frames that realistically portray the scene from arbitrary points between the real images.

Researchers Suttisak Wizadwongsa, Pakkapon Phongthawee, Jiraphon Yenphraphai, and Supasorn Suwajanakorn write that the work builds on top of a technique called multiplane image (MPI). Compared to prior methods, they say their approach better models view-dependent effects (like specular highlights) and creates sharper synthesized imagery.
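For those unfamiliar with MPI, the idea is to store the scene as a stack of semi-transparent RGBA planes at fixed depths; a novel view is rendered by warping each plane into the target camera and alpha-compositing the stack from back to front. Below is a minimal sketch of that compositing step, meant as a generic illustration of MPI rather than the NeX implementation itself.

import numpy as np

def composite_mpi(planes_rgb, planes_alpha):
    """planes_rgb: (D, H, W, 3) and planes_alpha: (D, H, W, 1), ordered
    from the nearest plane (index 0) to the farthest (index D-1)."""
    out = np.zeros(planes_rgb.shape[1:], dtype=np.float32)
    # Walk the stack far-to-near, applying the standard "over" operator.
    for rgb, alpha in zip(planes_rgb[::-1], planes_alpha[::-1]):
        out = rgb * alpha + out * (1.0 - alpha)
    return out

# In a full renderer each plane would first be reprojected into the
# novel viewpoint (a per-plane homography determined by its depth)
# before this compositing step.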

On top of those improvements, the team has highly optimized the system, allowing it to run easily at 60Hz—a claimed 1000x improvement over the previous state of the art. And I have to say, the results are stunning.

Though not yet highly optimized for the use-case, the researchers have already tested the system using a VR headset with stereo-depth and full 6DOF movement.

The researchers conclude:

Our representation is effective in capturing and reproducing complex view-dependent effects and efficient to compute on standard graphics hardware, thus allowing real-time rendering. Extensive studies on public datasets and our more challenging dataset demonstrate state-of-art quality of our approach. We believe neural basis expansion can be applied to the general problem of light-field factorization and enable efficient rendering for other scene representations not limited to MPI. Our insight that some reflectance parameters and high-frequency texture can be optimized explicitly can also help recovering fine detail, a challenge faced by existing implicit neural representations.
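The “neural basis expansion” mentioned there refers to modeling each pixel’s view-dependent color as a base color plus a weighted sum of global basis functions of the viewing direction. The sketch below shows the idea with toy data; the shapes and random values are ours for illustration, whereas in NeX the basis functions come from a small neural network and the per-pixel coefficients are optimized from the input photos.

import numpy as np

def view_dependent_color(k0, k, basis_values):
    """k0: (H, W, 3) base color; k: (N, H, W, 3) per-pixel coefficients;
    basis_values: (N,) basis functions evaluated for one view direction."""
    color = k0.copy()
    for coeff, h_n in zip(k, basis_values):
        color += coeff * h_n
    return np.clip(color, 0.0, 1.0)

# Toy usage, just to show the shapes involved.
H, W, N = 4, 4, 8
k0 = np.random.rand(H, W, 3)
k = np.random.rand(N, H, W, 3) * 0.1
basis = np.random.rand(N)        # would be a function of view direction
pixel_colors = view_dependent_color(k0, k, basis)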

You can find the full paper at the NeX project website, which includes demos you can try for yourself right in the browser. There are also WebVR-based demos that work with PC VR headsets if you’re using Firefox, though unfortunately they don’t work with Quest’s browser.

Notice the reflections in the wood and the complex highlights in the pitcher’s handle! View-dependent details like these are very difficult for existing volumetric and photogrammetric capture methods.

Volumetric video capture that I’ve seen in VR usually gets very confused about this sort of view-dependent effect, often having trouble determining the appropriate stereo depth for specular highlights.

Photogrammetry, or ‘scene scanning’ approaches, typically ‘bake’ the scene’s lighting into textures, which often makes translucent objects look like cardboard (since the lighting highlights don’t move correctly as you view the object at different angles).

The NeX view synthesis research could significantly improve the realism of volumetric capture and playback in VR going forward.

Filed Under: Jiraphon Yenphraphai, light field, News, Pakkapon Phongthawee, Supasorn Suwajanakorn, Suttisak Wizadwongsa, view synthesis, vistec, Volumetric Video, VR Research, vr video, VR Video Capture

Copyright © 2022 GenVR, Inc.