Apple has announced new software and tools that will enable developers to build apps for the upcoming Apple Vision Pro.
According to its blog, Apple released the software development kit (SDK) on June 21, following its announcement of the Vision Pro, which will go on sale early next year.
The SDK also includes the visionOS simulator, which developers can use to test different layouts and lighting conditions while developing.
Hyping the product
Developers will also get access to the company’s accessibility tools to ensure their visionOS applications are accessible to all users.
According to Apple, the visionOS SDK “enables Apple’s developer community to bring their apps to life in ways never before possible.”
According to TechCrunch, Apple’s release of the SDK ahead of the product’s availability early next year is a bid to build excitement for the Vision Pro, which did not attract as much attention as the company might have wanted at its WWDC launch earlier this month. The move could also ensure there is a wide selection of applications for users to explore when the product comes to market.
“Developers can get started building VisionOS apps using the powerful frameworks they already know and take their development even further with new innovative tools and technologies like Reality Composer Pro, to design all-new experiences for their users,” said Susan Prescott, Apple’s vice president of worldwide developer relations.
“By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment,” added Prescott.
“We can’t wait to see what our developer community dreams up.”
Creating new experiences
Built on the same foundational frameworks and tools developers already know, such as Xcode, SwiftUI, RealityKit, ARKit, and TestFlight, the SDK will let developers create new experiences that work well on the Vision Pro.
“These tools enable developers to create new types of apps that span a spectrum of immersion, including windows, which have depth and can showcase 3D content; volumes, which create experiences that are viewable from any angle; and spaces, which can fully immerse a user in an environment with unbounded 3D content,” said Apple in its blog post.
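That spectrum maps onto the SwiftUI scene types available on visionOS. The sketch below is a minimal, illustrative example of how an app might declare one scene of each kind; the app name, window identifiers, and the “Globe” asset are assumptions made for illustration, not taken from Apple’s sample code.

```swift
import SwiftUI
import RealityKit

// A minimal sketch of the three visionOS scene types: a window,
// a volume, and a fully immersive space. Names and assets are illustrative.
@main
struct SpectrumOfImmersionApp: App {
    var body: some Scene {
        // Window: familiar 2D-style UI that can also present depth.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
                .padding()
        }

        // Volume: bounded 3D content viewable from any angle.
        // Assumes a USDZ asset named "Globe" is bundled with the app.
        WindowGroup(id: "volume") {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)

        // Space: unbounded content that can fully immerse the user.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Add a simple generated sphere so the space is not empty.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.2))
                content.add(sphere)
            }
        }
    }
}
```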
Among the additions to Xcode is Reality Composer Pro, which will allow developers to create and preview 3D models, animations, images, and sounds before using them in their Vision Pro apps.
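In practice, content authored in Reality Composer Pro ships as a Swift package that a visionOS app loads through RealityKit at runtime. The sketch below assumes the default names produced by Xcode’s visionOS app template, a “RealityKitContent” package and a scene called “Scene”; actual names vary by project.

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // Swift package generated for Reality Composer Pro assets (name varies per project)

// A hedged sketch of displaying a Reality Composer Pro scene in an app view.
struct ComposedSceneView: View {
    var body: some View {
        RealityView { content in
            // Asynchronously load the authored scene from its content bundle.
            // "Scene" is the default scene name in the visionOS app template.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```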
According to MacRumors, there are a dozen environments to choose from, including Joshua Tree, Mount Hood, and even the Moon.
Apple has also created a Travel Mode for use on airplanes, while a Visual Search feature will be able to recognize surrounding objects, copy printed text from the real world, translate languages in real time, and more.
Access limited to select cities
Apple will open developer labs in select cities, namely Cupertino, London, Munich, Shanghai, Singapore, and Tokyo, giving developers hands-on experience testing their apps as well as support from Apple engineers.
The company added that developers will also be able to apply for developer kits that enable them to build, iterate, and test directly on Apple Vision Pro.
“Starting next month, developers who have been building 3D apps and games with Unity’s robust authoring tools can port their Unity apps to Apple Vision Pro and take full advantage of its powerful capabilities,” said Apple.
Developers who have already previewed the visionOS SDK and its APIs are enthusiastic about the product, according to the company.