iOS 13 solves the eye contact problem on FaceTime video calls for improved intimacy

iOS 13 will make it appear as if you’re staring directly at the front-facing camera on a FaceTime video call even when you’re actually looking at the person on your screen.

As Dave Schukin explained on Twitter, a new FaceTime Attention Autocorrection setting that appeared in the third beta of iOS 13 and iPadOS solves the annoying issue that makes both FaceTime participants appear to be peering off to one side of the screen.

The issue is so jarring because human beings inherently crave eye contact.

When you’re on a FaceTime video call, you’re looking at the person on the screen rather than at the selfie camera above it, so your gaze appears off-center to the other party. iOS 13 seems to have addressed that problem by using sophisticated image manipulation to fake eye contact between the FaceTime users.

According to Dave, iOS 13 takes advantage of the TrueDepth camera and ARKit, Apple’s framework for augmented reality apps, to grab a depth map of your face and track its position. Using that information, the system adjusts your eyes accordingly, in real time.
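For a sense of what data ARKit actually exposes here, the sketch below shows how a third-party app can read per-eye transforms and an estimated gaze point from TrueDepth face tracking. This is purely illustrative: the `EyeContactCorrector` class name is invented, and Apple has not published how FaceTime performs the actual warp.

```swift
import ARKit

// Illustrative sketch only: reads the kind of eye/gaze data ARKit provides
// from the TrueDepth camera. The class name is hypothetical; Apple's actual
// FaceTime implementation is not public.
final class EyeContactCorrector: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (e.g. iPhone XS/XS Max).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // ARKit supplies a transform for each eye plus an estimated
            // point the user is looking at, in face-anchor space.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gaze = face.lookAtPoint
            // A correction pass could, speculatively, use these values to
            // warp the eye/nose region of each frame so the gaze appears
            // centered on the camera.
            _ = (leftEye, rightEye, gaze)
        }
    }
}
```

The key point is that the depth and gaze data needed for such an effect is already available to apps via `ARFaceAnchor`; the real-time video warp itself is the part Apple appears to be doing privately in FaceTime.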

The feature can be toggled on and off from within FaceTime’s settings.

As The Verge pointed out, designer Mike Rundle theorized back in 2017 that Apple would one day do this, although he didn’t expect it so soon. To achieve the seamless effect, your iPhone appears to subtly warp the image across both the eyes and nose, Dave found out.

The feature appears to be limited to the iPhone XS and iPhone XS Max, Dave added, and I concur – I couldn’t find the FaceTime Attention Autocorrection switch in the FaceTime settings on either my iPhone X running the latest iOS 13 beta 3 software or my previous-generation iPad Pro.

It’s unclear if subsequent betas might bring FaceTime Attention Autocorrection to older devices or if the feature could support group calling.

How do you like FaceTime Attention Autocorrection?

Let us know by leaving a comment below.