Facebook for iOS will soon introduce AI-powered live effects for photos and videos

According to Facebook’s chief technology officer, Mike Schroepfer, the company’s mobile apps for the iPhone, iPad, iPod touch and Android smartphones and tablets will soon gain advanced Prisma-like live filters and effects for your photos and videos.

Using artificial intelligence algorithms and highly efficient neural networks running directly on the device, these filters will let users turn their photos and videos into works of art, Schroepfer said.

The social network has revealed that the upcoming live photo and video effects make use of Caffe2Go, a new deep learning platform from Facebook which can capture, analyze and process pixels in real time on a mobile device.

The engineer says the team was able to run AI inference on some mobile phones in under 1/20th of a second, or less than 50 ms. By comparison, a human eye blink takes about 1/3rd of a second, roughly 330 ms.
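To put those figures in perspective, here is a hedged Swift sketch that times a single dummy inference pass against the roughly 50 ms per-frame budget mentioned above. The runStyleTransfer function is a hypothetical placeholder, not Caffe2Go’s actual API.

```swift
import Foundation

// Hypothetical stand-in for one on-device inference pass; Caffe2Go's real
// API is not exposed like this, so the "work" here is just a pixel tweak.
func runStyleTransfer(on frame: [Float]) -> [Float] {
    return frame.map { $0 * 0.5 + 0.1 }
}

let frameBudget = 1.0 / 20.0  // the ~50 ms figure cited above
let dummyFrame = [Float](repeating: 0.5, count: 224 * 224 * 3)

let start = Date()
_ = runStyleTransfer(on: dummyFrame)
let elapsed = Date().timeIntervalSince(start)

print(String(format: "Inference took %.1f ms against a %.1f ms budget",
             elapsed * 1000, frameBudget * 1000))
print(elapsed < frameBudget ? "Fast enough for live video" : "Too slow, frames would drop")
```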

As per Schroepfer:

Just three months ago we set out to do something nobody else had done before: ship AI-based style transfer running live, in real time, on mobile devices.

This was a major engineering challenge, as we needed to design software that could run high-powered computing operations on a device with unique resource constraints in areas like power, memory and compute capability.

The result is Caffe2Go, a new deep learning platform that can capture, analyze and process pixels in real time on a mobile device.

The engineer explains that the team had to shrink the AI model used to process media by a factor of 100 in order to run deep neural networks efficiently on iOS and Android.
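Facebook hasn’t spelled out exactly how that hundredfold reduction was achieved, but the hedged Swift sketch below shows one common way such savings add up: swapping a standard convolution layer for a depthwise-separable one with trimmed channel counts. The numbers are purely illustrative, not Facebook’s actual architecture.

```swift
import Foundation

// Illustrative only: weight counts for a standard 3x3 convolution layer
// versus a slimmed-down depthwise-separable layer with fewer channels.
func standardConvParams(kernel: Int, inChannels: Int, outChannels: Int) -> Int {
    kernel * kernel * inChannels * outChannels
}

func separableConvParams(kernel: Int, inChannels: Int, outChannels: Int) -> Int {
    // a depthwise pass over each input channel, then a 1x1 pointwise pass
    kernel * kernel * inChannels + inChannels * outChannels
}

let full = standardConvParams(kernel: 3, inChannels: 128, outChannels: 128)  // 147,456 weights
let slim = separableConvParams(kernel: 3, inChannels: 32, outChannels: 32)   // 1,312 weights

print("Standard layer: \(full) weights, slimmed separable layer: \(slim) weights")
print(String(format: "Roughly a %.0fx reduction for this one layer", Double(full) / Double(slim)))
```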

With Caffe2Go, applying effects to your photos, videos and even live video happens in real time, right in the palm of your hand, similar to the real-time photo and video effects built into the stock Camera app on the iPhone.
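Conceptually, that boils down to a capture, analyze and render loop that never leaves the phone. The Swift sketch below is a minimal stand-in for such a loop; the function names are hypothetical and the “style” is just a brightness tweak rather than an actual neural network.

```swift
// Minimal sketch of an on-device capture -> style -> display loop.
// All names are hypothetical placeholders, not Facebook's code.
struct Frame { var pixels: [Float] }

func captureFrame() -> Frame {
    Frame(pixels: [Float](repeating: 0.5, count: 128 * 128 * 3))
}

func applyStyle(_ frame: Frame) -> Frame {
    // Stand-in for the neural style transfer step: a simple brightness tweak.
    Frame(pixels: frame.pixels.map { min(1, $0 * 1.2) })
}

func display(_ frame: Frame) {
    // In a real app this would hand the styled frame to the camera preview layer.
}

// Process a short burst of "live" frames, one at a time, with no server round trip.
for _ in 0..<5 {
    display(applyStyle(captureFrame()))
}
print("Processed 5 frames entirely on device")
```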

The Caffe2Go framework is now fully embedded into Facebook’s apps.

“We can recognize facial expressions and perform related actions, like putting a ‘yay’ filter over your selfie when you smile,” noted Facebook.
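That presumably comes down to gating an overlay on an expression score from the on-device model. The hedged Swift sketch below shows only that gating step; smileScore is a hypothetical stand-in, since Facebook hasn’t published the expression model it uses.

```swift
// Hypothetical stand-in for an on-device facial-expression model; a real
// implementation would return a confidence derived from face analysis.
func smileScore(for frame: [Float]) -> Double {
    return 0.92
}

let selfieFrame = [Float](repeating: 0.5, count: 64 * 64 * 3)
let smileThreshold = 0.8

if smileScore(for: selfieFrame) > smileThreshold {
    print("Smile detected: overlaying the 'yay' filter on the selfie")
} else {
    print("No smile detected: leaving the frame untouched")
}
```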

To get a better sense of how these real-time filters work, and how they can be controlled with simple hand gestures, check out the videos embedded in the source article linked below, or read this blog post to learn how the team pulled it all off.

Source: Facebook