Apple’s New Vision Pro Has Big Ambitions

A woman wearing the Apple Vision Pro. (Image: via Apple)

Apple Vision Pro is a mixed-reality headset — which the company hopes is a “revolutionary spatial computer that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment” — that begins shipping to the public (in the United States) later this week.

Critics have doubted the appeal of the face-worn computer, which “seamlessly blends digital content with the physical world,” but Apple has pre-sold as many as 180,000 of the US$3,500 gizmos.

What does the company think people will do with these pricey peripherals? While uses will evolve, the company is focusing attention on watching TV and movies, editing and reliving “memories,” and — perhaps most importantly for the product’s success — having its customers not look like total weirdos.

The company hopes the new device will redefine personal computing, as the iPhone did 16 years ago and the Macintosh did 40 years ago. But if it succeeds, it will also redefine privacy concerns: the device captures enormous amounts of data about users and their environments, creating an unprecedented kind of “biospatial surveillance.”

Spatial computing

Apple is careful about its brand and how it packages and describes its products. In an extensive set of rules for developers, the company insists the new headset is not to be referred to as a “headset.” What’s more, the Apple Vision Pro does not do “augmented reality (AR), virtual reality (VR), extended reality (XR), or mixed reality (MR)” — it is a gateway to “spatial computing.”

Spatial computing, as sketched out in the 2003 PhD thesis of U.S. software engineer Simon Greenwold, is: “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” In other words, the computer can interact with things in the user’s surroundings in real time to provide new experiences.

The Vision Pro comes with an app that lets users get up close and personal with dinosaurs. (Image: via Apple)

The Vision Pro has big shoes to fill in delivering new user experiences. The iPhone’s initial “killer apps” were clear: the Internet in your pocket (including portable access to Google Maps), all your music on a touch screen, and “visual voicemail.”

Sixteen years later, all three of these seem unremarkable. Apple has sold billions of iPhones, and 80 percent of humans now use smartphones. Their success has all but killed off earlier tools like paper maps and music CDs (and the ubiquity of text, image, and video messaging has largely done away with voicemail itself).

Killer apps for Vision Pro

We don’t yet know what the killer apps of spatial computing might be — if any — but Apple is pointing our attention in three directions.

The first is entertainment: the Vision Pro promises “the ultimate personal theatre.”

The second is an attempt to solve the social problem of walking around with a weird headset covering half your face. An external goggle screen shows a constantly updated representation of your eyes to offer critical social cues about your gaze to those around you. Admittedly, this looks weird. But Apple hopes it is less creepy and more practical than trying to interact with humans wearing blank aluminum ski goggles.

Reliving ‘memories’ with the Apple Vision Pro. (Image: via Apple)

The third is the ability to capture and relive “memories”: 3D visual and audio recordings of actual events, played back on demand. Reviewers have found it striking:

This was stuff from my own life, my own memories. I was playing back experiences I had already lived.

Apple has patented tools to select, store, and annotate digital “memories.” These memories are files, and potentially products, to be shared in “spatial videos” recorded on the latest iPhones.

Biospatial surveillance

There is already a large infrastructure devoted to helping tech companies track our behavior to sell us things. Recent research found that Facebook, for example, receives data on each individual user from an average of around 2,300 companies.

Spatial computing offers a step change in this tracking: to function, it records and uses vast amounts of intimate data about our bodies and surroundings.

One study on headset design noted 64 different streams of biometric and physiological data, from eye tracking and pupil response to subtle changes in the body’s electromagnetic field.

Your face tomorrow

This is not “consumer” data like the brand of toothpaste you buy. It is more akin to medical data.

For instance, analyzing a person’s unconscious movements can reveal their emotional state or even predict neurodegenerative disease. This is called “biometrically inferred data,” because users are unaware their bodies are giving it up.

Apple says it won’t share this type of data with anyone, and it has a better privacy record than most companies. But biospatial surveillance puts ever more of ourselves to use for spatial computing, in ways that continue to expand.

It starts simply enough in the pre-order process, where you need to scan your facial features with your iPhone (to ensure a snug fit). But that’s not the end of it.

Apple’s patent about memories is also about how to “guide and direct a user with attention, memory, and cognition” through feedback loops that monitor “facial recognition, eye tracking, user mood detection, user emotion detection, voice detection, etc. [from a] bio-sensor for tracking biometric characteristics, such as health and activity metrics […] and other health-related information.”

Social questions

Biospatial surveillance is also the key to Apple’s attempt to solve the social problems created by wearing a headset in public. The external screen showing a simulated approximation of the user’s gaze relies on constant measurement of the user’s expression and eye movement with multiple sensors.

An external screen shows a representation of the user’s eyes. (Image: via Apple)

Your face is constantly mapped so others can see it — or rather, see Apple’s vision. Likewise, as passersby come into range of the Apple Vision Pro’s sensors, the company’s vision of them is automagically rendered into your experience, whether they like it or not.

Apple’s new vision of us — and those surrounding us — shows how the requirements and benefits of spatial computing will pose new privacy concerns and social questions. The extensive spatial surveillance that captures intimate biometric and environmental data redefines which personal data and social interactions can be exploited.

Luke Heemsbergen, Senior Lecturer, Digital, Political, Media, Deakin University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


