At its global developer conference, Apple announced a significant update to RealityKit, its suite of technologies for building augmented reality (AR) experiences. With RealityKit 2, Apple says developers will gain more visual, audio and animation control over their AR experiences. But the most notable part of the update is the new Object Capture API, which lets developers create 3D models in minutes using only an iPhone.
During its presentation to developers, Apple noted that one of the most difficult parts of building great AR apps is creating 3D models, a process that can take hours and cost thousands of dollars.
With the new Object Capture API on macOS Monterey, by contrast, developers first capture a series of photos of an object with an iPhone, then need only a few lines of code to generate the 3D model, Apple explained.
To get started, developers create a new photogrammetry session in RealityKit that points to the folder of captured images, then call the process function to generate the 3D model at the desired level of detail. Object Capture produces USDZ files optimized for AR Quick Look, the system that lets developers add virtual 3D objects to iPhone or iPad apps or to websites. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
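A minimal sketch of that workflow, assuming macOS Monterey and RealityKit's PhotogrammetrySession API (the folder and output paths here are illustrative, and the snippet is meant to run inside an async context):

```swift
import Foundation
import RealityKit

// Hypothetical paths: point these at your own capture folder and output file.
let imagesFolder = URL(fileURLWithPath: "/Users/me/Captures/chair", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/Users/me/Models/chair.usdz")

do {
    // Start a photogrammetry session over the folder of captured photos.
    let session = try PhotogrammetrySession(input: imagesFolder)

    // Request a USDZ model at a chosen level of detail
    // (.reduced is a good fit for AR Quick Look).
    try session.process(requests: [
        .modelFile(url: outputURL, detail: .reduced)
    ])

    // Listen for progress and completion messages from the session.
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fractionComplete):
            print("Progress: \(fractionComplete)")
        case .requestComplete(_, .modelFile(url: let url)):
            print("Model written to \(url)")
        case .processingComplete:
            print("Reconstruction finished")
        default:
            break
        }
    }
} catch {
    print("Reconstruction failed: \(error)")
}
```

The detail parameter trades file size against fidelity; lower levels of detail keep USDZ files small enough to stream into AR Quick Look on the web.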
Apple said that companies such as Wayfair and Etsy are already using Object Capture to create 3D models of real-world objects, an indication that online shopping is about to get a major AR upgrade.
Wayfair, for example, is using Object Capture to develop tools that let its suppliers create virtual representations of their products, so Wayfair customers will be able to preview more products in AR than they can today.
In addition, Apple noted that developers including Maxon and Unity are using Object Capture to produce 3D content in 3D content creation apps such as Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders that give developers more control over the rendering pipeline to fine-tune the look and feel of AR objects; dynamic loading of assets; the ability to build your own system of components to organize the assets in your AR scene; and the ability to create player-controlled characters so users can jump, scale, and explore AR worlds in RealityKit-based games.
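As a sketch of the new component-and-system capability, here is a minimal custom system that spins any entity carrying a custom component. The names (SpinComponent, SpinSystem) and the spin behavior are illustrative, not part of the API; only Component, System, EntityQuery, and SceneUpdateContext come from RealityKit 2:

```swift
import RealityKit

// Custom component: per-entity spin speed in radians per second.
struct SpinComponent: Component {
    var radiansPerSecond: Float = 1.0
}

// Custom system: each frame, rotate every entity that has a SpinComponent.
class SpinSystem: System {
    // Query matching only entities that carry our component.
    private static let query = EntityQuery(where: .has(SpinComponent.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        let dt = Float(context.deltaTime)
        for entity in context.scene.performQuery(Self.query) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            // Rotate around the vertical axis by speed * elapsed time.
            entity.transform.rotation *= simd_quatf(
                angle: spin.radiansPerSecond * dt,
                axis: [0, 1, 0])
        }
    }
}

// Registration, typically done once at app launch:
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()
```

Organizing behavior this way keeps per-frame logic out of individual entity subclasses: the system queries the scene each frame and acts only on entities that opted in via the component.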
One developer, Mikko Haapoja of Shopify, tried out the new technology and shared via Twitter some real-world tests in which he captured objects using the iPhone 12 Max.
Developers who want to try it themselves can use Apple’s sample app after installing the macOS Monterey beta on their Mac. To take the photos Object Capture needs, they can use the Qlone camera app or any other capture app from the App Store, Apple says. Qlone’s companion Mac app will also adopt the Object Capture API this fall.
Apple says there are now more than 14,000 ARKit apps in the App Store, built by more than 9,000 different developers. With over 1 billion AR-enabled iPhones and iPads in use worldwide, the company notes that it offers the world’s largest AR platform.