Putting VR, Volumetric Video in Refresh Mode
When you attend film festivals like Sundance, Toronto, Tribeca and, most recently, Cannes, all of the attention is on:
- The important (and self-important) people
- The budget-buster screeners
- The budget-scrimping shows – short/long
- Tomorrow's opportunities
It's easy to see that everyone will have her/his private entertainment channel to watch content in 8K, then 16K HDR.
The roadmaps are pretty clear, and the mass merchants are ready to push aggressively, even though folks are just slowly beginning to appreciate 4K ... almost everywhere.
But what's around the corner, just over the hill?
What will filmmaking look like in 5–10 years?
At every film festival, there's a corner of the lot roped off for the immersive future, VR (virtual reality).
The mass entertainment market prophets keep telling us every year it's going to be huge ... in two years.
Friends like Ted, a gaming industry expert, keep hollering, "It isn't hot or sexy; it won't be, and in fact it was stillborn."
Others, like Jon and Mark, who play endless hours of VR games (they're testing the stuff), tell us it's just getting better and better.
Virtual pros don't have much time to chat at film festivals, but at the IBC Future Zone, the cream of the crop will be able to focus on detailing what they have been doing to push the frontiers of creativity and technology to give us a dose of real reality.
We're gonna listen!
Naysayers like to point out that today's VR content is usually only 1–10 minutes long because ... you get sick!
They don't realize that in the late 1800s, when British photographer Eadweard Muybridge developed the zoopraxiscope, a primitive motion-picture device, projects were only 1–10 minutes long because the work was tough to do and people probably ... got sick.
It wasn't until the 1920s that films ran an hour or more, and the industry released about eight films a year that were shown in tough-to-find cinemas.
Today, we have thousands of short and long films a year produced and shown everywhere on the planet.
Not long ago, we sat through the three-hour-plus Avengers: Endgame ... twice!
The goal for creatives has been to help people identify/connect with the main character(s) and make the film/show their own.
VR does that, is doing that, will do that ... one segment, one viewer at a time.
Despite all of the hype, negativism and excitement, it's still very new as storytellers leverage technology to help people be a part of and control their content.
At Tribeca, producer/screenwriter Andrew Cochrane and cinematographer Andrew Shulkind captured the Tribeca X Award for the 12-minute The 100%, the true story of Maggie Kudirka, a ballet dancer and rising star at the Joffrey Concert Group who was diagnosed with incurable stage four metastatic breast cancer.
Both are recognized volumetric/VR filmmakers and Shulkind has become widely known for his "painterly use of lighting."
Like many in the field, he began in the real/unreal world of visual-effects-heavy workflows, with a solid tech foundation.
To deliver what isn't there, Shulkind designed his own 360-degree camera array system: the uncompressed, 23K-resolution, 12-bit Headcase Cinema Camera.
Seemingly never satisfied, Shulkind finds the present VR lighting constraints to be the hardest part of the art form.
Since the viewer can see in all directions, he's currently experimenting with new ways to "naturalize" the medium rather than rely on the top-lit experience common in present VR films.
In Cannes, Diego Prilusky, Intel Studios' general manager, said his group has been intent on not just having viewers connect with the character but actually creating and intensifying the experience for them, whether it's with a brand, a sporting event or a film.
With the world's largest volumetric studio at his team's disposal and in cooperation with Paramount Pictures, they produced Grease 40th Anniversary – Immersive Cinema Experience, one of several immersive projects they unveiled at Cannes.
Prilusky joined Intel through its acquisition of capture specialist Replay Technologies and has since focused on developing and refining what he calls "Data Powered Entertainment."
"Today's digital films are still frontal," said Prilusky, who has done visual effects work on projects including Gravity, War Horse and Prince of Persia. "We're still dealing with a flat palette; and even when you're viewing a 360-degree video, you look around but it's still frontal.
"With volumetric video, all these things are broken," he explained. "There's no background or a number of arranged cameras, but a large number of different kinds of cameras that digitize reality, which is where data powered entertainment comes in."
For example, for the Grease 40th anniversary experience, 96 high-definition 5K cameras captured the dance scene. Data from each camera was shaped into voxels (3D pixels) and rendered in multi-perspective 3D, so the viewer can experience the content from any given point of view, even in the middle of the action.
To deliver the experience, the raw video from the many cameras was processed on Intel-powered servers at the rate of 6TB/min.
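To give a feel for the core idea, here is a minimal sketch of voxelization: 3D points observed by multiple cameras are quantized into a shared sparse voxel grid, which a virtual camera could then render from any viewpoint. This is purely illustrative; Intel's actual pipeline is vastly more complex, and the function name, grid resolution and coordinates below are invented for the sketch.

```python
import math
from collections import defaultdict

def voxelize(points, voxel_size=0.5):
    """Quantize 3D points into a sparse voxel grid:
    maps voxel index (i, j, k) -> points that fell inside that voxel."""
    grid = defaultdict(list)
    for x, y, z in points:
        key = tuple(math.floor(c / voxel_size) for c in (x, y, z))
        grid[key].append((x, y, z))
    return grid

# Two hypothetical cameras observing the same two scene features
# from different angles (coordinates made up for the sketch):
cam_a = [(0.1, 0.2, 0.1), (1.3, 1.2, 1.4)]
cam_b = [(0.2, 0.1, 0.3), (1.4, 1.2, 1.3)]
grid = voxelize(cam_a + cam_b)
# Overlapping observations fuse into the same voxels, producing one
# shared 3D model rather than one flat image per camera.
```

The point of the sketch is the shift Prilusky describes: instead of many separate "frontal" video frames, the cameras contribute to a single spatial data structure, and the viewpoint is chosen at render time.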
Power and performance are going to become as important as the light touch of creativity for volumetric/VR work, and it will cross-fertilize into the entertainment we experience on any and every screen.
Industrial Light & Magic's Experience Lab (ILMxLAB) is an excellent example of what this crossbreeding will deliver to more and more viewers.
Their artists, engineers, game designers, sound designers and storytellers leverage traditional filmmaking graphics technology to create interactive, immersive cinema experiences around such tentpoles as Jurassic Park, Star Wars, and VR projects like Trials of Tatooine.
It sounds, feels and looks perfectly natural, but they developed hardware and techniques to render massive 3D models in milliseconds rather than days; refined realistic surround sound for the rumbles and whooshes of starships; and put lightsabers in the hands of Jedi-in-training viewers.
An even bigger challenge they are working on today, for folks who just want to get away from it all, is turning movies like Star Wars into portals to virtual/augmented worlds.
Telco service provider Verizon is also planning for a virtual tomorrow, launching a 5G full-motion and volumetric capture and production studio in LA as well as new or upgraded studios in New York, London, Paris, Singapore, Hong Kong, Brazil and Australia.
In addition to rolling out their superfast, reliable 5G broadband and wireless service, the company is also intent on becoming a valuable resource for producing and delivering streaming content to people everywhere.
Jonathan Montaos of RYOT, Verizon Media's content studio/innovation lab, said the facilities will help revolutionize creative production and break down the production pipeline to enable real-time creative capture and production; the lab has already produced several AR and VR Emmy award nominees.
While Verizon is slowly building out the 5G network infrastructure, Montaos noted RYOT wants to be ahead of the immersive experience curve by fostering and refining new content formats for the future. "We see a time when the audience will directly interact with all types of content as creativity and technology merge," he stated.
"We are nowhere near where it's going to be," acknowledged Wallace. This also means that no one really knows which apps and services will ultimately be successful on the service, which she argued was another reason for experimenting with the technology early on. "I don't think anyone predicted Facebook when 4G came around," Wallace added, "and the future isn't that far away."
A few years ago, VR was largely divided into two groups:
- Animated content produced with game engines that gave high-end VR headset users the ability to lean into scenes and walk around computer-generated characters
- 360-degree videos with real actors, essentially films projected on a sphere around the viewer, without the ability to lean in
It's slowly becoming a natural extension of the viewer's activities and experiences. Given a few more years, thanks to the creative/technical expertise of hundreds of people like those we've mentioned, that deeper immersion will emerge.
Their goal is to make the belief structure so real you can't tell whether something is really happening. That will require new generations of hardware, tools and workflows in the hands of creatives who know how to deliver an open-ended story platform, plus time ... actually, a lot of time.
At IBC's Future Zone, we'll see a lot of improvements in these areas this year because people can already envision what their content experience will be like in the years ahead.
VR and volumetric video have made a lot of progress in recent years in capturing moving images of the real world: people and objects that can be viewed and interacted with. And as 5G completes its global roll-out around 2025, the experience will be better, more fun and more fulfilling, and it will become another option in our total educational, informational and entertainment library.
What will it look like, feel like, be like five years from now?
Not even the technical creatives who work with VR and volumetric video every day really know.
But they do know it's a helluva lot of fun improving, doing, being on the leading edge of the wave.
As folks told us at Sundance, Toronto, Tribeca and Cannes, "If anybody tells you they know, they're probably trying to sell you something."
Of course, that will be tough at IBC ... they can't sell on the show floor!