Recording spatial sound with ARKit & moving through virtual 3D waveforms

Code artist Zach Lieberman has leveraged Apple's new ARKit framework in iOS 11 to build a trippy proof-of-concept spatial audio app that records sound in augmented reality and plays it back as you literally move through virtual 3D waveforms.

Here's a quick video showing this in action (be sure to turn on sound):

https://twitter.com/zachlieberman/status/905414787550642176

He built it using ARKit and openFrameworks, an open-source C++ toolkit for creative coding.
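Zach hasn't published his source, but the core idea is easy to sketch: while recording, tag short chunks of microphone audio with the camera's world-space position on every ARKit frame; during playback, find and replay whichever chunk was recorded nearest the device's current position. Here's a rough, hypothetical Swift version of that loop (the class and property names are made up for illustration and aren't taken from Zach's openFrameworks app):

```swift
import ARKit
import AVFoundation
import simd

// Hypothetical sketch: pin chunks of microphone audio to the device's position in
// space, then replay the chunk closest to wherever the device moves afterwards.
final class SpatialAudioRecorder: NSObject, ARSessionDelegate {

    struct Sample {
        let position: simd_float3      // camera position in world space (metres)
        let buffer: AVAudioPCMBuffer   // short chunk of microphone audio
    }

    private let session = ARSession()
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private var samples: [Sample] = []
    private var latestBuffer: AVAudioPCMBuffer?
    var isRecording = true

    func start() throws {
        // World tracking gives a full 6-DoF camera transform every frame.
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())

        // Tap the microphone so each ARKit frame can grab the latest audio chunk
        // (the app must have microphone permission for this to work).
        let format = engine.inputNode.outputFormat(forBus: 0)
        engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            self?.latestBuffer = buffer
        }
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    // Called roughly 60 times per second with the current camera pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let t = frame.camera.transform.columns.3
        let position = simd_float3(t.x, t.y, t.z)

        if isRecording {
            // Record: pin the most recent audio chunk to this point in space.
            if let buffer = latestBuffer {
                samples.append(Sample(position: position, buffer: buffer))
            }
        } else {
            // Playback: replay the stored chunk nearest to where the device is now.
            guard let nearest = samples.min(by: {
                simd_distance($0.position, position) < simd_distance($1.position, position)
            }) else { return }
            player.scheduleBuffer(nearest.buffer, completionHandler: nil)
        }
    }
}
```

A real app would throttle the playback scheduling so buffers don't pile up sixty times a second, and would draw the waveform ribbons in 3D, which is where a toolkit like openFrameworks (or SceneKit) comes in.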

I can see uses for this in museums as an interactive art installation of sorts: imagine recording musical compositions this way, with notes and rhythms occupying spatial patterns that visitors could play back by moving through the 3D waveforms.

One could even set up an installation or a dedicated fitness track where people jog through a course to play back pre-recorded music. Or how about a scavenger hunt in augmented reality, where you must find audio clues hidden all over the house?

The possibilities are endless: if you can do spatial audio with ARKit, it's not a stretch to imagine augmented-reality words, tweets, emoji, floating sheet music and so forth.

Here are more demos from Zach showing how ARKit detects feature points on objects in real time.
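Those glowing point clouds aren't secret sauce, by the way: ARKit exposes the raw feature points it tracks to any app, frame by frame. A minimal sketch (the logging is just for illustration):

```swift
import ARKit

// Minimal sketch: read ARKit's raw feature points (the dots you see in the demos)
// from each frame via the session delegate.
final class FeaturePointLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let cloud = frame.rawFeaturePoints else { return }
        // `points` holds one 3D world-space position per detected feature.
        print("Tracking \(cloud.points.count) feature points this frame")
    }
}
```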

Zach's demo brings to mind the Light Barrier series by studio Kimchi and Chips, a project that creates volumetric drawings in the air using hundreds of calibrated video projections. As demonstrated in the video embedded below, these light projections merge in a field of fog to create graphic objects that animate through physical space as well as time.

Earlier today, we showed you MeasureKit, developer Rinat Khanov's ultimate measuring app that lets you measure just about anything (a person's height, the distance to an object, the angle between two points and more) simply by pointing your iPhone at it.
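Under the hood, an ARKit tape measure mostly boils down to hit-testing two screen points against the geometry ARKit has detected and taking the distance between the resulting world positions. Here's a hypothetical sketch of that calculation; it isn't MeasureKit's actual code:

```swift
import ARKit
import UIKit
import simd

// Hypothetical sketch of an ARKit "tape measure": hit-test two screen taps
// against detected planes/feature points and measure the gap in metres.
func distanceBetween(_ pointA: CGPoint, _ pointB: CGPoint, in view: ARSCNView) -> Float? {
    guard
        let hitA = view.hitTest(pointA, types: [.featurePoint, .existingPlaneUsingExtent]).first,
        let hitB = view.hitTest(pointB, types: [.featurePoint, .existingPlaneUsingExtent]).first
    else { return nil }

    // The last column of each world transform is the hit location in world space.
    let a = simd_float3(hitA.worldTransform.columns.3.x,
                        hitA.worldTransform.columns.3.y,
                        hitA.worldTransform.columns.3.z)
    let b = simd_float3(hitB.worldTransform.columns.3.x,
                        hitB.worldTransform.columns.3.y,
                        hitB.worldTransform.columns.3.z)
    return simd_distance(a, b)
}
```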

How do you like these new ARKit demos?