A few tinkering developers have actually used the facial data collected from the iPhone X’s TrueDepth camera system to create 3D models of their face. One was even able to 3D print his face, though with questionable results.
In a short video, visual effects artist Elisha Hung used the data collected by the TrueDepth camera to create a floating 3D head that mimics his movements. He captured the depth data with the iPhone X's camera, pulled it out using ARKit and Xcode, and then transformed it into full 3D renders.
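Hung hasn't published his pipeline, but the first step he describes, pulling face geometry out of ARKit, can be sketched roughly as follows. This is a minimal, illustrative sketch assuming an iOS app with a running `ARSession`; everything beyond ARKit's own types (the class name, the export step) is hypothetical:

```swift
import ARKit

// Sketch: track the user's face with the TrueDepth camera and read out
// the mesh ARKit reconstructs each frame. Actually exporting those
// vertices to a 3D package for rendering is left out.
class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    func startTracking(in session: ARSession) {
        // Face tracking requires TrueDepth hardware (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let geometry = faceAnchor.geometry
            // ARKit's canonical face mesh has roughly a thousand vertices --
            // enough to animate expressions, but coarse for a photoreal model.
            let vertices: [simd_float3] = geometry.vertices
            let indices: [Int16] = geometry.triangleIndices
            // From here one could write out an OBJ file, feed SceneKit, etc.
            _ = (vertices, indices)
        }
    }
}
```

The relatively low vertex count of the mesh is why, as noted below, heavy smoothing and artist effort are needed to get from the raw data to something photorealistic.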
While that actually looks pretty great, it does take the effort of a visual effects artist to achieve that outcome. The actual imagery collected by the iPhone’s camera is far less precise.
It doesn’t collect quite enough points to make such a realistic model, so it takes a lot of effort to smooth the data into something that looks more like a face. Brad Dwyer, founder of game company Hatchlings, showed what an actual frame looks like as collected by the TrueDepth camera.
Here’s another scan. Cleaned up a little. Probably a better representation of what it “sees”
— Brad Dwyer (@braddwyer) November 15, 2017
In its raw form, the scan is a whole lot blockier.
Still, these experiments show what impressive feats can be achieved with the latest technology built into Apple’s phones. Since the TrueDepth APIs are open to developers, it will be interesting to see what they build with them in the future.