Apple's Vision Pro tells us why extended reality really matters

Clovis McEvoy
June 6, 2023

Apple’s long-awaited headset has arrived with unrivalled hardware and an unrivalled price. But the Vision Pro aims to do more than wow. Clovis McEvoy analyses the tech, the experience, and what else is needed to create a computing revolution.

First came personal computing, then mobile computing, and now, the company which dominated the latter era hopes to pioneer the next: spatial computing. After seven long years of rumours, and a reportedly contentious development process, Apple’s Vision Pro headset finally made its debut at the company’s annual developer conference on Monday.

It would take months to unpack everything announced during the 50-minute introduction for the Vision Pro, but one thing is clear — mixed, or augmented, reality never looked so good. Sporting a single piece of laminated glass curved into an aluminium frame, the headset has a striking design, though it is most set apart by what lies under the hood.

The viewing experience is unparalleled: each eye gets a video feed with more pixels than a 4K TV and what the company calls a “wrap around” field of view, powered partly by a custom-designed computer chip and an admittedly ungainly external battery pack. One tech journalist after another walked away stunned by Apple's latest technical and experiential marvel.

Apple's Vision Pro headset is much sleeker than competing offerings, and comes with an external iPhone-sized battery pack.

With a $3,499 price tag (£2,800), top-notch hardware is to be expected. More interesting than the specs is the vision for a truly ‘mixed’ reality experience, underpinned by Apple’s brand new operating system, visionOS, which the Cupertino company says is designed from the ground up for spatial computing.

It’s an indicator of how Apple hopes to make the hardware category a defining feature of its product ecosystem. When the device starts shipping in 2024 (initially only in the US), it will come with usability features and experiences built around your eyes, hands, and voice. You can look at a search field and talk to begin entering text, for example, or use your hands to swipe from one window to another.

A core feature of the Vision Pro is the extent to which the device uses data to anticipate and respond to its users, far beyond any other computing platform. “AI models [in the headset] are trying to predict if you are feeling curious, paying attention, remembering a past experience, or some other cognitive state,” tweeted Sterling Crispin, a former Apple researcher who worked on the Vision Pro.

The device accesses eye tracking, heart beats, blood density, and more “to predict how focused or relaxed you are [for example], and then update[s] the virtual environment to enhance those stats,” Crispin wrote. “Imagine an adaptive immersive environment that helps you learn, or work, or relax by changing what you’re seeing and hearing in the background.”

Apple's Vision Pro demos focused on work and the office.

That paradigm-shifting vision illustrates where the company wants to take the Vision series in the future. visionOS will include its own App Store, which will doubtless prompt developers to rush to build apps tailored specifically for the headset, though the device will also run “hundreds of thousands of familiar iPhone and iPad apps.”

In particular, Apple demoed use cases in productivity, entertainment, and media, with users able to dial between a fully-immersive experience and one with floating icons and windows layered atop real-world spaces. Disney Plus, for example, will be available from day one, Disney CEO Bob Iger revealed during the show.

Apple oriented the Vision Pro around office life.

In contrast to other headsets, the Vision Pro emphasises the ability to fine-tune your immersion to suit your needs. Whether you are watching a film or re-living a memory, if a family member or colleague needs your attention, visionOS can bring them into focus without disrupting your experience. Central to this is not only the physical dial on top of the headset that lets you transition from an immersive experience to an augmented overlay, but also the Vision Pro’s front-facing screen, which lets other people see a real-time 3D render of the user behind the headset.

Labelled ‘EyeSight’ by Apple, it is a well-intentioned effort to enable interpersonal eye contact and reduce the antisocial nature of fully enclosed headsets, though in practice it stumbles into the ‘uncanny valley’, looking decidedly weird and eerily human for now. It’s a feature that will divide opinion until the kinks are fully ironed out.

Apple's EyeSight feature makes XR headsets more personable and less antisocial.

What was most intriguing about the Vision Pro’s debut was its user experience — mainly due to what was missing, rather than what was present. Certainly, the ability to navigate menus using only your body, whilst not unique to Apple’s design, appears to have been refined. And sure, there were beautifying touches like floating windows responding to ambient light, casting shadows as if they had a real physical form.

What was surprising was how little of the demo footage emphasised the creative applications for which Apple is widely admired. Primarily, Apple oriented the Vision Pro around office life, with family life a close second. Yet there are plenty of high-quality extended reality painting, sculpting, and even music-making apps available on other headsets like Meta’s Quest series. The lack of similar examples for the Vision Pro was notable, though perhaps an indication of who the target market is for a product at this price point.

Indeed, fully immersive, 3D content was conspicuous by its absence, with Apple keen to frame its extended reality vision as a layer on top of the real world, rather than a substitute for it. The company demoed a handful of 3D examples, but the vast majority revolved around 2D screens with the real world behind them; screens that could be grabbed, resized, and repositioned, certainly, but traditional screens nonetheless.

Much was made of the device, for example, as a home cinema, or for experiencing photos and re-living past memories. Gaming also made a brief appearance, but notably the games shown were designed for Apple devices or existing consoles — Apple showed a gamer using the device with a PlayStation 5 controller, but with the screen simply a floating 2D window. It’s unclear how much this advances beyond a large TV screen.

The Vision Pro’s appeal comes from wrap-around video and integration with the rest of the Apple ecosystem, but note the battery cord.

There has always been a tension in mixed reality hardware development: power versus functionality. With an external battery that lasts only two hours, Apple has emphasised the latter, perhaps recognising that right now there is little reason to use the device for more than a few hours at a time anyway. Nonetheless, the company will be expected to push down the price and address the power issues as it refines the product over time.

The other major future investment will come in third-party software. Alongside a full suite of tools to grow and support its developer ecosystem, the company plans to open Vision Pro developer labs across North America, Europe, and Asia. As well as giving developers a place to demo their apps live, the labs address a pressing need: Apple requires a range of experiential apps to let customers appreciate what extended reality (XR) can truly offer.

To a remarkable extent, extended reality is a ‘word of mouth’ technology. Images, videos, specs, and yes, written articles full of flowery language, all fail to communicate what XR really offers. The company said it was working with game development platform Unity to make it easier for developers to create content for the Vision Pro, and that some had already created Vision Pro apps which would let users view a “personal planetarium” of deep space imagery and help designers collaborate on Formula 1 vehicles.

Those kinds of high quality experiences and use cases will be critical for making the Vision Pro a category-defining series. Much like how the iPhone took off in parallel with the diverse apps that people accessed from the App Store, the Vision Pro will be what developers make of it. 

“Imagine an immersive environment that helps you learn, work, or relax by changing what you’re seeing in the background.”

— Sterling Crispin, former Apple researcher

There are two ways people will be convinced to invest in this technology, the first of which is experience. When XR apps hit their mark, they can go far beyond what any other medium offers: to a place of visceral and emotive experiences. Cultivating and supporting creators to build these very experiences is a foundation which any extended reality ecosystem needs. One category noticeably absent from the Vision Pro demos, for example, was fitness, a popular segment on Meta’s Quest headsets.

The second is something Apple already has: trust. Convincing people to strap a computer to their face is no small task, and more often than not, what convinces people to take that leap is a person or organisation that they respect. Long before Apple’s announcement, a 2021 survey found that three times as many Americans would prefer to buy an XR device from Apple as from the second-place choice, Google; Meta came a distant sixth, behind Samsung, Amazon, and Microsoft. When Apple says that spatial computing is the future, it carries a lot more authority than when anyone else says it.

Whether the Vision Pro evolves into a home entertainment hub, a productivity device, or a creative powerhouse will largely depend on the developer ecosystem that grows up around it. But despite many unanswered questions, Apple just gave the XR market a much-needed shot of adrenaline — and this is just the start.

Written by
Clovis McEvoy

Clovis is a New Zealand-born writer, journalist, and educator working at the meeting point between music and technological innovation. He is also an active composer and sound artist, and his virtual reality and live-electronic works have been shown in over fifteen countries around the world.
