I just walked out of a long demo session with Apple’s new $3,499 Vision Pro headset, which the company announced at WWDC 2023 as “the world’s most advanced consumer electronics device.” It’s… a really really nice VR headset with impressive displays and video passthrough. And I mean incredibly impressive displays and video passthrough: I was happily using my phone to take notes while wearing the Vision Pro, something no other headset can realistically allow.
That said, while Apple would obviously prefer that people think of the Vision Pro as a “powerful spatial computer” or an augmented reality device, there’s really no getting around the essential VR headset nature of the thing, down to the adjustable head straps, which definitely messed up my hair. It looks, feels, and behaves like a VR headset. If you’ve used a Meta Quest, just imagine the best possible Meta Quest running something very much like iPadOS, and you’ll get it.
Apple held Vision Pro demos in a large white cube-shaped building it built for WWDC called the Fieldhouse. Upon entry, I was handed an iPhone for a quick setup process: a turn-your-face-in-a-circle scan (very much like the Face ID setup) that determined what size face mask to use, and then another side-to-side face scan that looked at my ears to calibrate spatial audio. After that, Apple had me visit a “vision specialist” who asked if I wore glasses — I was wearing my contacts, but glasses-wearers had a quick prescription check so Apple could fit the Vision Pros with the appropriate lenses. (The lenses are made by Zeiss; Apple needed a partner that could legally sell prescription lenses. They snap in magnetically and will be sold separately at launch.)
The headset itself weighs a little less than a pound — it’s connected by a braided white power cable to a silver battery pack that offers about two hours of use. The cable detaches from the headset with a mechanical latch, but it’s permanently connected to the battery pack. If you want to plug into the wall, you plug a USB-C adapter into the battery pack.
The design language is all brushed aluminum, shiny glass, and soft fabrics; the vibe is closer to iPhone 6 than iPhone 14. That glass on the front is an obviously complex piece of optical engineering: it is perfectly curved but still serves as an appropriate lens for the cameras and the OLED screen that shows your eyes when you’re looking at people. (This feature is called EyeSight; I didn’t get to try it in any way.)
Around the headset itself you’ll count 12 cameras, a LIDAR sensor, and a TrueDepth camera, as well as IR flood illuminators to make sure the cameras can see your hands in dark environments for control purposes. The whole thing runs on a combination of Apple’s M2 and new R1 processors, which unsurprisingly generate a fair amount of heat. The Vision Pro vents that heat by pulling air in through the bottom of the device and venting it out the top.
The top of the Vision Pro has a button on the left that serves as a shutter button to take 3D videos and photos, which I didn’t get to try. The Digital Crown is on the right; clicking it brings up the home screen of app icons, while turning it changes the level of VR immersion in certain modes. I asked why anyone would want to set the immersion level anywhere other than all-on or all-off, and it appears Apple is thinking of the middle immersion setting as a sort of adjustable desktop workspace for apps while leaving the sides open for you to talk to your colleagues.
When you put on the headset, there’s a quick automatic eye adjustment that’s much quicker and more seamless than on something like the Quest Pro — there are no manual dials or sliders for eye settings at all. Apple wouldn’t say anything specific about its field of view this long before launch, but I definitely saw black in my peripheral vision. The Vision Pro is not as totally immersive as the marketing videos would have you believe.
The display itself is absolutely bonkers: a 4K display for each eye, with pixels just 23 microns in size. In the short time I tried it, it was totally workable for reading text in Safari (I loaded The Verge, of course), looking at photos, and watching movies. It is easily the highest-resolution VR display I have ever seen. There was some green and purple fringing around the edges of the lenses, but I can’t say for certain whether that was down to the quick fitting, the early demo nature of the device, or something else entirely. We’ll have to see when it actually ships.
The video passthrough was similarly impressive. It appeared with zero latency and was sharp, crisp, and clear. I happily talked to others, walked around the room, and even took notes on my phone while wearing the headset — something I would never be able to do with something like the Meta Quest Pro. That said, it’s still video passthrough. I could see pretty intense compression at times, and loss of detail when people’s faces moved into shadows. I could see the IR light on the front of my iPhone blink futilely as it tried and failed to unlock with Face ID. And the display was dimmer than the room itself, so when I took the headset off my eyes had to adjust to how much brighter the room was in reality.
Similarly, Apple’s ability to do mixed reality is seriously impressive. At one point in a full VR Avatar demo I raised my hands to gesture at something, and the headset automatically detected my hands and overlaid them on the screen, then noticed I was talking to someone and had them appear as well. Reader, I gasped. Apple’s also gotten a lot further with eye tracking and gesture control: eye tracking was pretty solid, and those IR illuminators and side cameras mean you can tap your thumb and index finger together to select things while they’re down in your lap or at your sides. You don’t need to be pointing at anything. It’s pretty cool.
Apple has clearly solved a bunch of big hardware interaction problems with VR headsets, mostly by out-engineering and out-spending everyone else that’s tried. But it has emphatically not answered the question of what these things are actually for: the main interface is very much a grid of icons, and most of the demos were basically projections of giant screens with very familiar apps on them. Safari. Photos. Movies. The Freeform collaboration app. FaceTime video calls. There was one demo with 3D dinosaurs where a butterfly landed on my outstretched hand, but that was as much “augmented reality” as I really experienced. (Yes, mapping the room and projecting the displays is very complex AR work, but there wasn’t so much as a measuring-things app after years of ARKit demos at WWDC. It was odd.)
I did get to see a quick FaceTime call with someone else in a Vision Pro using an AI-generated 3D “persona” (Apple does not like it when you call them “avatars”), which was both impressive and deeply odd. It was immediately obvious that I was talking to a persona in an uncanny-valley sort of way, especially as most of the person’s face was frozen apart from their mouth and eyes. But even that much was convincing after a while, and certainly much nicer than your average Zoom call. You set up a persona by holding the headset in front of you and letting it scan your face, but I wasn’t able to set one up myself and there’s clearly a lot of refinement yet to come, so I’ll withhold judgment until later.
All of this was basically a greatest hits reel of VR demos, including some old standbys: Apple showed off 180-degree 3D videos with spatial audio in something called the Apple Immersive Video Format, which the company apparently shot with proprietary cameras it may or may not release. (They looked like the 3D videos we’ve been seeing in VR demos forever.) I looked at a 3D photo of some cute kids shot by the headset’s cameras and watched a 3D video of those kids blowing out a birthday candle. (Same.) I did a one-minute Mindfulness meditation in which a voice commanded me to be grateful while the room darkened and a sphere of colorful triangles expanded all around me. (This looked great, but Supernatural exists, has millions of users on the Quest, and has offered guided meditation since 2020.) And I watched Avatar in what looked like a movie theater, which, well, that’s one of the oldest VR demos ever.
Was all this made better by the wildly superior Vision Pro hardware? Without question. But was it made more compelling? I don’t know, and I’m not sure I can know with just a short time wearing the headset. I do know that wearing this thing felt oddly lonely. How do you watch a movie with other people in a Vision Pro? What if you want to collaborate with people in the room with you and people on FaceTime? What does it mean that Apple wants you to wear a headset at your child’s birthday party? There are just more questions than answers here, and some of those questions get at the very nature of what it means for our lives to be literally mediated by screens.
I also know that Apple still has a long list of things it wants to refine between now and next year when the Vision Pro ships. That’s part of the reason it’s being announced at WWDC: to let developers react to it, figure out what kinds of apps they might build, and get started on them. But that’s the same promise we’ve been hearing about VR headsets for years now, from Meta and others. Apple can clearly outpace everyone in the industry when it comes to hardware, especially when cost is apparently no object. But the most perfect headset demo reel of all time is still just a headset demo reel — whether Apple’s famed developer community can generate a killer app for the Vision Pro is still up in the air.