Apple snaps up machine learning startup focused on dark data

Apple has acquired artificial intelligence and machine learning startup Lattice Data for a reported $200 million. The company has built an inference engine that turns so-called dark data into structured data sets that can be analyzed easily. Dark data is data stored in computer networks that cannot be analyzed directly because it is not in a usable format.

The deal could bolster Apple’s AI efforts and help its software turn things like text and images into structured items that can then be analyzed in traditional ways to derive insights. Apple confirmed the acquisition with its standard boilerplate statement issued to TechCrunch, saying that it buys smaller technology companies from time to time.

Neither Apple nor Lattice immediately returned a request for further comment.

About 20 engineers from Lattice have now joined Apple. A source said that Lattice had been “talking to other tech companies about enhancing their AI assistants,” including Amazon’s Alexa and Samsung’s Bixby.

According to the report, which cited an anonymous source, the deal closed a few weeks ago.

The Menlo Park, California-based startup was co-founded in 2015 by Christopher Ré, Michael Cafarella, Raphael Hoffmann and Feng Niu as the commercialization of DeepDive, a system created at Stanford to extract value from dark data.

The company’s CEO is Andy Jacques, a seasoned enterprise executive who joined last year.

“Lattice turns dark data into structured data with human-caliber quality at machine-caliber scale,” according to the official Lattice website. “We model the known as features and the unknown as random variables connected in a factor graph.”
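To make the quoted description concrete, here is a minimal sketch of factor-graph inference in plain Python. Nothing below is Lattice’s actual code: the variables, factors, and weights are invented for illustration, and inference is done by brute-force enumeration rather than the large-scale methods a production system would use.

```python
from itertools import product

# Hypothetical example: two binary random variables describing a snippet of
# text, coupled by factors that assign a non-negative weight to each
# joint configuration.
variables = ["is_mention", "is_price_nearby"]

# Each factor is (variable names, weight function over those variables).
factors = [
    (("is_mention",), lambda m: 2.0 if m else 1.0),   # prior: mentions are likelier
    (("is_mention", "is_price_nearby"),               # correlation between the two
     lambda m, p: 3.0 if (m and p) else 1.0),
]

def joint(assignment):
    """Unnormalized joint probability: the product of all factor weights."""
    score = 1.0
    for vars_, f in factors:
        score *= f(*(assignment[v] for v in vars_))
    return score

def marginal(var):
    """Brute-force marginal P(var=True): sum weights over all assignments."""
    total = true_mass = 0.0
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        weight = joint(assignment)
        total += weight
        if assignment[var]:
            true_mass += weight
    return true_mass / total

print(round(marginal("is_mention"), 3))  # → 0.8
```

The “known as features, unknown as random variables” framing maps onto this directly: the factor weights encode what the system already knows, and inference fills in probabilities for the unknowns.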

Lattice’s DeepDive framework has been used successfully in a diverse set of projects, ranging from a DARPA-funded human trafficking program to geology and paleontology to medical genetics, pharmacogenomics and more.

According to the website:

Data quality is in the DNA of Lattice. Our goal is not just to match human-level quality, but also to do so at unprecedented speed and scale. We build systems that win competitions and outperform expert readers.

We continuously push the envelope on machine learning speed and scale with our bleeding-edge systems research. For years, we have been building systems and applications that involve billions of webpages, thousands of machines and terabytes of data.

We can only speculate as to how Apple plans to apply Lattice’s technology to its products.

It’s probably safe to assume that Apple could improve object and scene recognition across its Photos service and the accompanying apps. More importantly, Lattice’s technology could be used to power the iPhone 8’s rumored augmented reality camera features, while giving Siri the ability to analyze text and images in Messages.

A recent patent application suggested potential Siri integrations with the iMessage platform. Aside from Messenger-like chatbot functionality for Siri in Messages, Apple’s invention could let users, say, ask Siri to send an image of a Volkswagen Beetle to a contact.

Lattice’s framework could also help enhance Apple’s neural networks and machine learning.

That’s because, unlike traditional machine learning, Lattice does not require laborious manual annotations. By taking advantage of domain knowledge and existing structured data to bootstrap learning via distant supervision, Lattice solves data problems with data.
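Distant supervision can be sketched in a few lines. The toy knowledge base, sentences, and relation below are invented for illustration; the point is that known (company, founder) pairs generate noisy training labels automatically, with no hand annotation.

```python
# Hypothetical sketch of distant supervision (not Lattice's implementation).
# A tiny "knowledge base" of known (company, founder) pairs.
kb = {("Lattice Data", "Christopher Ré"), ("Apple", "Steve Jobs")}

sentences = [
    "Christopher Ré co-founded Lattice Data in 2015.",
    "Apple was started by Steve Jobs in a garage.",
    "Christopher Ré gave a talk at Stanford.",
]

def distant_labels(sentences, kb):
    """Mark a sentence as a positive 'founded' example when it mentions
    both members of a known pair; these noisy labels can then train a
    relation extractor without manual annotation."""
    labeled = []
    for s in sentences:
        positive = any(company in s and founder in s for company, founder in kb)
        labeled.append((s, positive))
    return labeled

for sentence, label in distant_labels(sentences, kb):
    print(label, "-", sentence)
```

The first two sentences come out positive and the third negative, since it mentions a founder but no paired company. In practice such labels are noisy, which is why the approach is paired with probabilistic inference rather than used directly.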

Apple’s HealthKit, ResearchKit and CareKit frameworks may benefit from Lattice tech, too.