Apple’s WWDC25 developer sessions are filled with intriguing insights that didn’t make it into the keynote or the State of the Union presentation. One detail, briefly touched upon in the “What’s new in SwiftUI” session, may provide the first significant clues about the future direction of visionOS.
A few months back, Bloomberg‘s Mark Gurman reported that Apple is preparing to launch two new Vision Pro headsets. One is intended to be lighter and more budget-friendly than the existing model, while the other is described as a tethered device:
The other headset under development could be even more compelling. In January, I disclosed that Apple had halted work on augmented reality glasses that would connect to a Mac. Instead, the company is developing a Vision Pro that connects directly to a Mac. The distinction lies in the immersion—while the canceled device featured transparent lenses, the ongoing project will follow the same immersive design as the Vision Pro.
There’s no official timeline for a launch, but Apple may already be setting the stage for the tethered version.
For the first time, macOS Tahoe 26 apps will be capable of rendering 3D immersive content directly on the Apple Vision Pro, using a new scene type called RemoteImmersiveSpace.
Direct Transition from macOS to visionOS
This new functionality, highlighted in SwiftUI’s expanded support for spatial computing, builds on Apple’s introduction of the CompositorServices framework to macOS Tahoe 26. The framework allows Mac applications to project stereo 3D visuals directly into the Vision Pro’s environment, eliminating the need to build a separate visionOS version of the app.
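To give a rough idea of how this fits together, here is a sketch of a Mac app declaring the new scene type alongside its regular window and handing rendering over to CompositorServices. It is only a sketch modeled on the existing visionOS CompositorLayer API: the names SpatialMacApp, ContentView, startRenderLoop, and the space identifier are placeholders, and the exact macOS signatures may differ.

```swift
import SwiftUI
import CompositorServices

@main
struct SpatialMacApp: App {
    var body: some Scene {
        // Ordinary Mac window UI.
        WindowGroup {
            ContentView()
        }

        // New in macOS Tahoe 26: an immersive scene rendered on a nearby
        // Apple Vision Pro instead of the Mac's own display.
        // "MacImmersiveSpace" is a placeholder identifier for this sketch.
        RemoteImmersiveSpace(id: "MacImmersiveSpace") {
            // CompositorServices drives the stereo Metal rendering.
            // The closure shape mirrors the visionOS CompositorLayer API;
            // the macOS variant may differ in detail.
            CompositorLayer { layerRenderer in
                // Hypothetical helper that runs the app's Metal render loop
                // against the layer renderer.
                startRenderLoop(layerRenderer)
            }
        }
    }
}

// Minimal stand-ins so the sketch is self-contained.
struct ContentView: View {
    var body: some View {
        Text("Immersive content renders on Vision Pro")
    }
}

func startRenderLoop(_ renderer: LayerRenderer) {
    // Metal frame loop would go here.
}
```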
With RemoteImmersiveSpace, developers can now generate immersive graphics that respond to input events, such as taps or gestures, along with hover effects for spatial interaction, effectively letting their desktop applications expand into an immersive realm. This can be done with SwiftUI, with deeper integration available through Metal for those who want complete rendering control.
Additionally, the SwiftUI team introduced robust spatial layout and interaction APIs, empowering developers to craft volumetric user interfaces, support object manipulation (such as picking up a virtual water bottle), and implement scene snapping behaviors for more dynamic user experiences.
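To make the water-bottle example concrete, here is a hedged sketch of a volumetric window hosting a 3D model the wearer can grab and reposition. The asset name “WaterBottle”, the scene type BottleVolume, and the manipulable() modifier are assumptions drawn from the session’s description of the new object-manipulation support, not verified API.

```swift
import SwiftUI
import RealityKit

// A volumetric window containing a 3D model the user can pick up and move.
struct BottleVolume: Scene {
    var body: some Scene {
        WindowGroup(id: "bottle") {
            // "WaterBottle" is a placeholder USDZ asset in the app bundle.
            Model3D(named: "WaterBottle") { model in
                model
                    .resizable()
                    .aspectRatio(contentMode: .fit)
                    // Assumed modifier for the new object-manipulation support:
                    // lets the wearer grab, rotate, and reposition the model.
                    .manipulable()
            } placeholder: {
                ProgressView()
            }
        }
        .windowStyle(.volumetric)
    }
}
```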
This means a macOS application could accurately simulate complete 3D experiences, from architectural tours to scientific visualizations, and run them in real time on the Vision Pro, powered by the Mac.
The outcome? A significantly lower barrier for macOS developers eager to explore Vision Pro or to begin crafting for a future where spatial computing is poised to become ubiquitous.
For further technical insights, refer to Apple’s “What’s new in SwiftUI” session and the additional resources on the Apple Developer website.