Apple empowers developers to build new apps for Vision Pro
Apple has announced the availability of new software tools and technologies that enable developers to create groundbreaking app experiences for Apple Vision Pro, Apple's first spatial computer.
Featuring visionOS, the world's first spatial operating system, Vision Pro enables users to interact with digital content in their physical space using the most natural and intuitive inputs: their eyes, hands and voice.
Starting now, Apple's global community of developers will be able to create a new class of spatial computing apps that take full advantage of Vision Pro and blend digital content with the physical world to enable new experiences.
With the visionOS SDK, developers can utilise the unique capabilities of Vision Pro and visionOS to design brand-new app experiences across a variety of categories including productivity, design, gaming and more, the company states.
Next month, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore and Tokyo to provide developers with hands-on experience to test their apps on Apple Vision Pro hardware and get support from Apple engineers. Development teams will also be able to apply for developer kits to help them quickly build, iterate and test right on Apple Vision Pro.
Susan Prescott, Apple's Vice President of Worldwide Developer Relations, comments, "Apple Vision Pro redefines what's possible on a computing platform. Developers can get started building visionOS apps using the powerful frameworks they already know, and take their development even further with innovative new tools and technologies like Reality Composer Pro, to design all-new experiences for their users.
"By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive and enjoy new types of entertainment. We can't wait to see what our developer community dreams up."
Developers can build new experiences that take advantage of the features of Apple Vision Pro by using the same foundational frameworks they already know from other Apple platforms, including technologies like Xcode, SwiftUI, RealityKit, ARKit and TestFlight, the company states.
These tools enable developers to create new types of apps that span a spectrum of immersion, including windows, which have depth and can showcase 3D content; volumes, which create experiences that are viewable from any angle; and spaces, which can fully immerse a user in an environment with unbounded 3D content.
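As an illustration of this spectrum of immersion, a hedged sketch of what a minimal visionOS app declaring all three scene types might look like in SwiftUI (the app name, scene identifiers and model name here are hypothetical, not from Apple's documentation):

```swift
import SwiftUI
import RealityKit

@main
struct SampleSpatialApp: App {
    var body: some Scene {
        // A window: a familiar 2D surface that gains depth
        // and can showcase 3D content.
        WindowGroup {
            Text("Hello, Vision Pro")
        }

        // A volume: bounded 3D content viewable from any angle,
        // here loading a hypothetical "Globe" model bundled with the app.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)

        // A space: fully immerses the user in an environment
        // with unbounded 3D content.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Add RealityKit entities to the immersive scene here.
            }
        }
    }
}
```

The appeal of this model is that each level of immersion is just another SwiftUI scene, so developers move between windows, volumes and spaces with the same declarative patterns they already use on other Apple platforms.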
To help developers optimise 3D content for their visionOS apps and games, an all-new tool available with Xcode called Reality Composer Pro lets them preview and prepare 3D models, animations, images and sounds, so they look good on Vision Pro, according to Apple.
Developers can also interact with their apps in the new visionOS simulator to explore and test various room layouts and lighting conditions. In addition, every developer framework comes with built-in support for Apple's accessibility features to ensure spatial computing and visionOS apps are accessible to everyone.
Starting next month, developers who have been building 3D apps and games with Unity's robust authoring tools can port their Unity apps to Apple Vision Pro and take full advantage of its powerful capabilities.
The visionOS SDK, updated Xcode, Simulator and Reality Composer Pro are available for Apple Developer Program members at developer.apple.com. Registered Apple developers have access to a variety of resources to help them design, develop and test apps for Apple Vision Pro, including extensive technical documentation, new design kits and updated human interface guidelines for visionOS.