Claustrophobia and barely contained panic were my two overriding emotions as I experienced Apple’s first-ever immersive narrative short film on its Vision Pro mixed reality headset.
The aptly titled Submerged, which arrives on Vision Pro headsets around the world today, tells the harrowing 17-minute tale of a World War II submarine tasked with tracking ships in enemy waters.
If you’re even remotely a World War II history buff or film fan, you’ve heard of or seen tales like this before but I doubt you’ve seen it in this fashion.
Written and directed by award-winning director Edward Berger, the film tells a tight tale of a submarine crew under siege. It’s remarkable not necessarily because the story is new or nuanced but because of how it employs immersive video techniques to put you in the middle of the action.
When I watched the film, I could look around the tight cabin – an expertly built, all-metal set that, thanks to the Vision Pro’s high-resolution displays and the close confines, made the recreation remarkably realistic.
The story starts slowly, perhaps to introduce you to the main protagonist, crewman James Dyson (expertly played by Jordan Barton), his shipmates, and the mundanity of their existence some 400 ft below the surface of the sea.
One thing I noticed was how Berger switched back and forth between using the full 180-degree immersive view of a VR headset like the Vision Pro (the film was shot entirely on Apple’s proprietary camera) and pulling the focus in tightly to direct your gaze. The closeups of Barton were tight enough that I could make out the pores on his face.
I often find the experience of watching a movie in Vision Pro calming almost to the point of sleepiness. As the early minutes of the short film plodded along, I found my attention waning – that is, until the sub was struck by something and shook wildly. I was so startled that I almost jumped off the couch I was sitting on.
From that moment forward, the film seemed to press in on me and its characters. I enjoyed Berger’s varied positioning of the camera. At one point a 20-ft torpedo was essentially loaded into my chest – at least from my point of view. At another, the main character was staring warily right into my eyes.
When one of the torpedo tubes burst open and flames and sparks formed overhead, all hell broke loose. Within seconds there were geysers of water shooting into my and the main character’s faces.
As the water poured in and rose around us – yes, I soon felt like a part of this – my unease grew. It was clear the water was just below my nose. I’m a little claustrophobic in real life, so I was soon repeating a mantra of “no no no no” in my head. To achieve some of this effect, the filmmakers shot the film in a giant custom-built tank and in open water, slowly submerging parts of the set.
There’s virtually no exposition so I never entirely understood how [spoiler alert] the enemy found them, attacked, and eventually destroyed their sub. Miraculously, all the crewmen survived. This being an Apple immersive film, perhaps I shouldn’t be surprised.
Apple is still mainly interested in delivering relatively brief immersive experiences. It’s queued up a new NBA All-Star weekend movie that, while just four minutes or so long, truly puts you in the middle of the three-day event. I don’t even follow basketball but found it entertaining (the basketball to my face was a nice touch).
Apple is also lining up more Adventure and Elevated episodes, including one for Maine. There are more concert experiences on the way, like one from The Weeknd and another from R.A.E.
I’ve enjoyed most of my immersive experiences and, if you have a Vision Pro, they’re one of the coolest ways to use your mixed-reality headset. Are they, along with this entertaining and somewhat anxiety-inducing film, reason enough to spend $3,500 on a Vision Pro? I’ll leave that up to you.
I naturally had a lot of questions about the creation of this immersive short film, so I turned to its director, Edward Berger, who also directed the Oscar-winning All Quiet on the Western Front, for a deeper look at how he created Submerged and the choices he made to bring it to Vision Pro. Our conversation has been edited and shortened for clarity.
A conversation with the director
I saw the behind-the-scenes video short, and I noticed that during the making of the film, you appeared to be wearing the Vision Pro headset. So I was curious whether that was designed so you would have a feed and know exactly how the scene would play out for people wearing headsets themselves.
Berger: In our shooting experience, we designed the pipeline so that we were able to watch the take through the Vision Pro and experience life as the actors were doing it.
I started changing my habits a little bit during production, because you learn that your brain rewires very quickly to the Apple Vision Pro. It learns the tools, it learns the visuals, it learns the techniques – very quickly, how it feels, what you can use. And so maybe halfway through the shoot, I sort of alternated between just watching the actors on my monitor – [we had] two monitors. One was just the field of vision and one was the entire 180 degrees, sort of whatever was in the frame everywhere.
So I watched those just to – sometimes it felt a little bit more direct to me, and I could imagine the effect it would have in the Apple Vision Pro, because I had learned it within the previous three weeks of using this tool.
Were there any technical surprises as you were preparing to do this or even as you were doing it?
Absolutely. I mean, not surprises so much, because we tested it extensively, but certain things that we knew – we’re gonna have to push the limit here in terms of camera movement. How much can you shake it? What kind of dolly moves, crane moves can you make? Can you do a three-axis move, or is it better just to move on one axis? Just do push-ins or trackbacks or things like that.
So there are all these things you need to test to know how it’s gonna feel when you put on the glasses. But also, you know, very quickly it became clear to us, okay, well, where are we gonna hide the equipment? Where are we gonna put the lights? Where are we gonna put the microphones? Very quickly we realized, okay, we’re gonna have to integrate everything into the set.
The set looked great, by the way. It looked realistic. Had you ever shot a VR film before?
No. First experience, first stereoscopic experience. I’d never even put on any VR glasses – I mean, except for in a museum, maybe, for a few minutes, when there’s a long line behind you. So it was a wonderful way of getting to know the technology, diving into it, exploring ways of telling a story in a different way.
Do you now own a Vision Pro?
Absolutely, and I use it a lot. It’s a great way to watch movies.
How long was the entire shoot?
I think we shot for 10 days. Tested maybe for a week and then shot for 10 days.
It sounds like it took more planning maybe than a traditional movie because of things like having to hide equipment.
Yeah, it did take a lot of planning. It took quite a while – but you know, we also had a tight timeline. So in a way, we had to work around the clock to make this film, to be able to premiere it right now for you. I mean, we shot it in April and it had a long post-production process. Yeah, a lot of planning, but a lot of around-the-clock work as well.
Did you get to show it to Tim Cook?
I mean, I know he watched it. I wasn’t present in the room when he watched it.
With the amount of preparation and the way you had to manage things, was there any room for improvisation?
No. In general, I’m not a big fan of improvisation. I like movies where I can see that the filmmaker has put a lot of thought into the design of it, into the making of it. I don’t call it improvisation when, on the day, I say, okay, let’s make it better – we have this shot planned, but actually it would be much better from here, with the actors doing this, let’s react to that. I like when movies are well thought through and precise, so I’m not the right person to ask that question.
I’m sure there’s somebody else, an improvisational filmmaker who will put the Apple Vision Pro to great use.
I noticed in the movie that sometimes I could look around and see a lot of stuff, and other times my view was directed – narrowed, focused in a way. So I was curious how you decided to make that choice, where sometimes we were experiencing it fully and sometimes the main character was right in our face and that was all you saw. If you looked to the edges, they were kind of darkened.
It is, in a way, like the use of a closeup in a traditional movie. You say, okay, I really want this to have an emotional impact on the viewer, to have them be very close to our character, to experience what they are experiencing. And so you direct the eyes, the gaze, like that. But usually – I mean, the great thing about this device is that the audience can design their own experience.
It’s almost like theater in a way. You have very wide shots – not your field of vision necessarily, but to the left, right, up and down there’s a lot beyond the frame that you can see and hear. To then use that entire frame, the entire 180 degrees, to fill it with sound or other action, like a bursting pipe or steam coming from somewhere, was super interesting to us. And therefore you also need more time to explore these things. You can let the shots linger so that you, Lance, can look around, so that we can give you the time and space to do that.