A decade ago, I followed the emergence of affordable 3D televisions at the Consumer Electronics Show, hoping to experience 3D movies at home. One obstacle to adoption of the technology was the lack of media that mainstream audiences could view on the devices. Even if you owned one of these advanced TVs, very little viewable content was streamed or sold to households like yours. It was simply too much work to pair the content with the device. Filming in stereoscope is a complex process, and the commercial media channels into the home don't support it nearly as well as the cinema pipeline does.
While stereoscopic headsets are now shipping in significant volumes on the wave of VR as a mainstream consumer experience, the content availability challenge still looms. (IDC projects 30 million headsets a year shipping by 2026 across multiple vendors; Meta claims 15 million sold to date.) This time the gaming sector is leading the charge in new media creation, simulating 3D virtual environments with the world-building software platforms distributed by Unity and Epic Games. The video game industry dwarfs cinema in terms of media spend, with the US gaming sector alone totaling over $60 billion annually in contrast to cinema at $37 billion. So this time around the 3D media story may be different. With the lower production cost of software-based media creation and the higher per-customer revenue stream of game sales, there will be more options than I had with my 3D Vizio TV.
I recently discovered the artistry of Luke Ross, an engineer who is bringing realistic 3D depth perception to legacy video games originally rendered in 2D. His technique adds three-dimensional "parallax" depth to a 2D scene by having the computer render parallel views of the scene and presenting them to each eye of a head-mounted display sequentially. Leveraging the way our brains perceive depth in the real world, his technique persuades us that typically flat-perspective scenes are actually deep landscapes receding into the distance. The recent Disney series The Mandalorian was filmed using the same world-building programs used to make video game simulations of spacious environments. Jon Favreau, the show's director, chose to film in studio using Unreal Engine instead of George Lucas-style on-location filming because it drastically extended the world landscapes he could reproduce on his limited budget. Converting The Mandalorian into Avatar-like 3D rendering for Vizio TVs or VR head-mounted displays would still be a huge leap for a studio to make because of the complexity of fusing simulated and real sets. But when live action goes a step further and captures the actors' movements directly into 3D models, as in the approach of Peter Jackson's Lord of the Rings series, rapid rollouts to 2D and 3D markets simultaneously become far more feasible using Luke Ross's "alternate-eye rendering" (abbreviated AER).
Stereoscopic cameras have been around for a long time. Capturing parallax perspective and routing the two camera inputs to two display outputs is the relatively straightforward way to achieve 3D media. What is so compelling about AER is that it achieves depth perception through a kind of illusion that occurs in the brain's perception of synthesized frames. Producing a stereoscopic play-through of every perspective a player or actor might navigate in a game or movie is exceedingly complex. So instead, Luke moves a single perspective through the trajectory, then has the display output jitter the camera slightly to the right and left in sequence. When the right-eye glimpse is shown, input to the left eye pauses. Then the alternate glimpse is shown to the left eye while the right eye's output pauses. You can envision this by blinking your right, then left eye while looking at your finger in front of your face. Each eye sees more behind the close object's edges than the other eye does in that instant. So near objects appear to hover close to you against the background, which barely moves at all.
Vast landscapes of Final Fantasy VII appear more realistic with parallax depth rendering. https://www.theverge.com/2022/8/10/23300463/ffvii-remake-intergrade-pc-vr-luke-ross-mod
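To make the alternation described above concrete, here is a minimal Python sketch of the idea as I understand it. This is not Luke's actual implementation (his mods hook the game's renderer and the VR compositor directly); the Camera class, render_scene, submit_to_eye, and the 0.064 m interpupillary distance are all stand-ins I made up for illustration. The only point is the frame-by-frame left/right alternation of a single camera.

```python
from dataclasses import dataclass

IPD = 0.064  # assumed interpupillary distance, in meters


@dataclass
class Camera:
    x: float = 0.0  # sideways offset of the game's single camera, in meters


def render_scene(camera: Camera) -> str:
    # Stand-in for the game's ordinary 2D render of one frame.
    return f"frame from x = {camera.x:+.3f} m"


def submit_to_eye(eye: str, image: str) -> None:
    # Stand-in for handing the frame to the headset compositor for one eye only;
    # the other eye simply keeps showing the frame it already has.
    print(f"{eye:>5} eye <- {image}")


def aer_loop(num_frames: int = 6) -> None:
    """Alternate the camera between left- and right-eye offsets on successive frames."""
    camera = Camera()
    for frame in range(num_frames):
        eye = "left" if frame % 2 == 0 else "right"
        # Jitter the single in-game camera half the IPD toward whichever eye
        # receives this frame; near objects shift noticeably between the two
        # positions while the distant background barely moves.
        camera.x = -IPD / 2 if eye == "left" else +IPD / 2
        submit_to_eye(eye, render_scene(camera))


if __name__ == "__main__":
    aer_loop()
```

One consequence of this design is that each eye only receives a fresh view on every other frame, so on a 90 Hz headset each eye effectively updates around 45 times a second, which hints at why high refresh rates help the alternation read as solid depth.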
The effect, when you perceive it for the first time, astounds you with how realistic the portrayed landscape becomes. With a VR headset, it's like having a 3D IMAX in your home. The exciting thing is that game designers and directors don't have to rework their entire product to make this possible: AER can be applied entirely in post-production. It is still a fair bit of work, but far more feasible to achieve at grand scale than re-rendering all legacy media in stereoscopic 3D VR. This makes me believe it will be only a short time before this is commonly available to most readers of my blog. (Especially if I have anything to do with the process.)
You may not yet have a consumer VR headset at your disposal. But the HP Reverb, Pico, Meta Quest, and HTC Vive are all now cheaper than my 3D Vizio TV was. And the rendered experience of a 65-inch TV in your living room typically fills less of your field of view than a wide-field VR headset does. So over the coming years, many more people may opt for the nearer screen over the larger one. When they do, more people will start seeking access to 3D content, which now, thanks to Luke, has a more scalable way to reach this emerging audience.