You’re reading the Apple Newsroom

Vision Pro tidbits: Setup process, Pro apps, eye-tracking, and more

Omar Moharram - Senior Editor
8 Min Read

Apple’s Senior Vice President of Worldwide Marketing, Greg Joswiak, and Vice President of the Technology Development Group, Mike Rockwell, sat down with Daring Fireball‘s John Gruber as part of his annual The Talk Show Live, which took place last Wednesday, June 7.

Rockwell joined Joswiak for a 40-minute segment of the two-hour show. During this segment, the executive duo revealed some interesting tidbits surrounding Apple Vision Pro and its development, which spanned approximately seven years. The full interview also includes Senior Vice President of Software Engineering, Craig Federighi, and Senior Vice President of Hardware Engineering, John Ternus.

We’ve rounded up the most interesting and notable Vision Pro tidbits from the show below. The full show, nearly two hours long, is available to watch on YouTube.

Initial Setup Process

Apple has yet to disclose details on Vision Pro’s initial setup process, and until the headset launches early next year, we’ll have to wait to see what the experience is actually like. Thankfully, some details on that process were discussed during the interview. According to Rockwell and Gruber, to calibrate the headset’s hand-tracking system, the wearer will be asked to hold up their hands in the air for a few seconds in front of Vision Pro while wearing it. Their hands will be aligned against a virtual outline to set up the machine-learning algorithms behind visionOS’s hand-gesture navigation.

A similar task fine-tunes Vision Pro’s eye tracking. A series of white dots against a completely dark background appears when wearing Vision Pro for the first time. Users will be asked to look at each white dot by moving only their eyes while keeping their heads completely still. Once these tasks are done, users are greeted with a simple passthrough of their environment with no other software elements.

Navigating visionOS

The passthrough video is described as extremely lifelike, with its perspective exactly matching what someone would typically see with their eyes. Rockwell states that Apple’s cornerstone goal with Vision Pro is to “build an incredibly powerful tool, not a toy.” He goes on to describe some of the advanced engineering that went into Vision Pro and visionOS.

For example, Apple had to develop an “entirely new glyph rendering mechanism” for 3D rendering as part of visionOS, allowing users to easily view and read undistorted text from any angle. visionOS also incorporates a new real-time subsystem that constantly monitors all input data from Vision Pro’s dozen cameras and sensors. This captured data is fed through the subsystem into Vision Pro’s new R1 chip, which executes a highly specialized set of machine-learning algorithms. These algorithms power features such as accurate object occlusion when users place their hands in front of displayed content, and completely stable app window placement even with aggressive head movement.

Image: an inside look at Vision Pro’s M2 and R1 chips. The new R1 chip powers the eye- and hand-tracking gesture system – APPLE

Rockwell adds that the spatial stability of app windows, and of visionOS in general, is not compromised by unsteady environments such as an airplane cabin. Vision Pro’s Inertial Measurement Unit (IMU) is advanced enough to counteract a plane’s movements in flight.

When asked about Vision Pro’s lack of external controllers, Rockwell responded that Apple had always wanted to create a controller-free gesture system precise enough that users could still reach for other productivity-first input devices, like keyboards, if their use case requires it. The freedom a gesture-based system grants also lets users interact naturally with their phones or other devices without putting down the headset or fumbling with handheld controllers.

EyeSight

Rockwell describes EyeSight as an essential part of the Vision Pro experience, adding that Apple sought to make it easy for wearers to stay connected with others nearby while wearing the headset. The highly intricate system – powered by the first-ever curved lenticular display in a consumer electronics device – does more than merely display a simple video stream of the wearer’s eyes.

Vision Pro generates multiple views of the wearer’s eyes, one for each person looking on from the outside, even at an angle. Those rendered views are continuously updated as onlookers move, ensuring a constant and realistic video stream of the wearer’s eyes that makes Vision Pro appear as if it’s “transparent,” adds Rockwell.

Software Features

When asked about the availability of professional apps on Vision Pro, Rockwell pointed to the level of hybrid collaboration the headset enables. Vision Pro’s expansive canvas will allow levels of collaboration that are not possible on traditional computers with fixed screen sizes. Follow Along, for example, a feature in Apple’s Freeform app that lets users track which area of the whiteboard other collaborators are currently working on, will be coming to Vision Pro.

Rockwell also touts Vision Pro’s ability to serve as an external monitor for Macs, even acting as the sole monitor for display-less Macs like the Mac Studio. In addition to running apps specifically made for visionOS, the headset will also be able to run iPhone and iPad apps with little to no extra work from developers. This means that Final Cut Pro and Logic Pro for iPad should be able to run on Vision Pro natively, without the need to run them from a Mac via external monitor mode.

Price and Availability

While Vision Pro was anticipated to go on sale before the year’s end, Apple instead announced that the headset is set to be released in early 2024. Joswiak and Rockwell declined to give a more concrete release date when asked during the interview, jokingly promising only that Vision Pro would go on sale no later than the announced early-2024 window.

Vision Pro will start at $3,499 when it goes on sale in the United States. It’s still unclear whether Apple will sell multiple storage configurations of the headset, or whether the optional ZEISS Optical Inserts (prescription lenses that magnetically attach inside Vision Pro for those who wear glasses) will be bundled or sold at an additional price.
