Alongside previewing the new iOS 11.3 features and releasing the forthcoming new software for early developer testing yesterday, Apple also announced that app makers can now take advantage of ARKit 1.5, its framework for building augmented reality apps for iOS.
ARKit 1.5 supports these features:
- Vertical surfaces—In addition to horizontal surfaces like tables and chairs, ARKit can now recognize and place virtual objects onto vertical surfaces, like walls and doors.
- Improved scene understanding—ARKit can now more accurately map irregularly shaped surfaces, like circular tables.
- 2D images—ARKit apps can now detect 2D images in the real world, such as signs, posters, and artwork, which can be integrated into the augmented reality experience.
- Higher resolution—The pass-through camera view of the real world has 50% higher resolution, going from 720p to 1080p full HD for a more believable experience.
- Sharper view—The view of the real world through your iPhone’s camera is now sharper as well because ARKit 1.5 brings support for the auto-focus feature.
- Tidbits—ARKit now supports line detection while the overall tracking has been improved in speed and accuracy.
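The first two list items, along with auto-focus, are opt-in through `ARWorldTrackingConfiguration`. A minimal sketch of how an app would enable them (the `sceneView` property is assumed to come from a standard `ARSCNView` project template):

```swift
import ARKit

// Opting in to the new ARKit 1.5 options (iOS 11.3+).
let configuration = ARWorldTrackingConfiguration()

// Plane detection now accepts .vertical in addition to .horizontal.
configuration.planeDetection = [.horizontal, .vertical]

// Auto-focus for the camera feed is exposed as a flag (on by default).
configuration.isAutoFocusEnabled = true

// Assumed to be an ARSCNView set up elsewhere in the app:
// sceneView.session.run(configuration)
```

Detected planes are then delivered to the session delegate as `ARPlaneAnchor` objects, just as horizontal planes were in ARKit 1.0.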
The most interesting new feature is ARKit 1.5’s ability to turn posters, signs and artwork into interactive augmented reality experiences.
Using advanced computer vision techniques to find and recognize the position of 2D images such as signs, posters, and artwork, ARKit can integrate these real-world images into AR experiences, such as filling a museum with interactive exhibits or bringing a movie poster to life.
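Image detection hangs off the same configuration object: the app supplies a set of `ARReferenceImage`s, and each recognized image arrives as an `ARImageAnchor`. A sketch, assuming an asset-catalog group named "AR Resources" that the developer would create in Xcode:

```swift
import ARKit

// Sketch of 2D image detection (iOS 11.3+). The group name
// "AR Resources" is an assumption; add your reference images
// to such a group in the asset catalog.
final class ImageDetectionDelegate: NSObject, ARSCNViewDelegate {

    static func makeConfiguration() -> ARWorldTrackingConfiguration {
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil)
        return configuration
    }

    // Each recognized image is delivered as an ARImageAnchor.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        // Overlay a plane matching the image's physical size.
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // lay the plane flat on the image
        node.addChildNode(planeNode)
    }
}
```

From there, the overlay node could carry any content — video, a 3D scene, or a tap target that launches a richer experience like the museum example above.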
TechCrunch imagines a game that lets you throw darts at a wall, with the target mounted to an actual wall rather than floating in space. “Apple’s version of wall detection will detect planes that are vertical or just off vertical but not heavily angled initially,” the publication added.
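A wall-mounted target like the one TechCrunch describes maps naturally onto the new `ARPlaneAnchor.alignment` property, which distinguishes vertical planes from horizontal ones. A hypothetical sketch (the target geometry and size are placeholders):

```swift
import ARKit

// Hypothetical sketch: mount a dart target on each detected wall.
// ARPlaneAnchor.alignment (new in iOS 11.3) tells horizontal from vertical.
final class WallTargetDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor,
              plane.alignment == .vertical else { return }
        // A flat disc standing in for the dartboard; size is arbitrary.
        // The anchor's y axis is the plane's normal, so the cylinder's
        // axis already points straight out of the wall.
        let target = SCNNode(geometry: SCNCylinder(radius: 0.2, height: 0.01))
        node.addChildNode(target)
    }
}
```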
TechCrunch enthused about the new possibilities:
The implications down the road are even more exciting.
What happens, for instance, when you can slap a sticker on a wall which can act as a marker that ARKit can recognize without external libraries? It can then project out objects or scenes based on that marker.
And some of the back end systems that I know other developers are working on rely on computer vision to create a persistent spatial map that can be used to ‘re-place’ objects or scenes very precisely between augmented reality sessions or between different people at different times.
This will help with those.
The Loop’s Jim Dalrymple described an ARKit 1.5 demo Apple gave him:
In one demo Apple showed me, a developer can make a game that requires the player to bounce a ball on the floor and hit a virtual target hung on the wall. ARKit 1.5 makes the target on the wall possible. If you miss the target, ARKit will recognize that and bounce the ball off the wall.
You walk into a museum and see an image of the Apollo moon landing. When you hold up your iPhone, it recognizes the image and allows you to tap on it, which instantly takes you to the surface of the moon with the Apollo vehicle landing beside you.
ARKit was released just six months ago. According to Apple, the App Store already has more than 2,000 ARKit-enabled apps.