Sony, the biggest maker of camera chips used in phones, appears to be boosting production of next-generation 3D sensors after getting interest from phone makers including Apple.
BloombergQuint, an Indian business and financial news organization and a joint venture of Bloomberg News and Quintillion Media, has the story:
The chips will power front- and rear-facing 3D cameras of models from several smartphone makers in 2019, with Sony kicking off mass production in late summer to meet demand, according to Satoshi Yoshihara, head of Sony’s sensor division.
He declined to provide sales or production targets, but said the 3D business is operating profitably and will make an impact on earnings from the fiscal year starting in April.
The boom in sophisticated camera sensors was pioneered by Apple’s successful commercialization of in-house designed, depth-based infrared sensing in the TrueDepth camera found on the iPhone X, iPhone XS and iPhone XS Max models.
Sony isn’t the only maker of 3D camera chips, and Apple isn’t its only prospective client: Sony’s customers also include Google and Samsung, while Apple currently sources its TrueDepth parts from supplier Lumentum.
The TrueDepth camera system takes advantage of the structured light approach: an infrared emitter projects a pattern of 30,000 invisible infrared dots onto your face, then an infrared receiver is employed to measure dot distortions in order to calculate a disparity map.
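The depth recovery behind structured light comes down to triangulation: the farther a projected dot appears shifted (its disparity) from where it would land on a flat reference plane, the closer or farther the surface is. A minimal sketch of that relationship, using hypothetical focal-length and baseline values (not Apple's actual calibration):

```python
# Illustrative sketch of structured-light triangulation, NOT Apple's
# actual TrueDepth pipeline. The focal length (in pixels) and the
# emitter-to-receiver baseline (in meters) are assumed example values.

def depth_from_disparity(disparity_px, focal_px=600.0, baseline_m=0.03):
    """Triangulate depth (meters) from the measured shift, in pixels,
    of a projected infrared dot relative to its reference position."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A dot shifted by 45 px maps to 600 * 0.03 / 45 = 0.4 m.
print(depth_from_disparity(45))
```

Applying this per dot across the 30,000-point pattern yields the sparse depth map that face-matching algorithms then consume.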
Among other things, Sony’s new 3D sensor allows for hand gesture control
By comparison, Sony’s upcoming 3D sensor uses a technique called time-of-flight (ToF), which calculates the time pulses of light take to travel to and from a target.
Sony’s technology differs from the structured light approach of existing chips, which has limits in terms of accuracy and distance. Sony’s method, called time of flight, sends out invisible laser pulses and measures how long they take to bounce back, which creates more detailed 3D models and works at distances of up to five meters. Other uses include mobile games, which could involve creating virtual characters that interact with and navigate real-world environments, or ones that use hand gestures for control.
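The arithmetic behind time of flight is simple: light travels to the target and back, so the one-way distance is the speed of light times the round-trip time, divided by two. A quick sketch (the nanosecond figure is just an illustrative input, not a Sony spec):

```python
# Sketch of the time-of-flight principle: distance = c * round_trip_time / 2.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_s):
    """Convert a measured round-trip pulse time (seconds) into the
    one-way distance to the target (meters)."""
    return C * round_trip_s / 2.0

# A pulse returning after roughly 33.4 nanoseconds corresponds to about
# 5 meters, the working range the article cites for Sony's sensor.
print(tof_distance_m(33.356e-9))
```

The timing scales involved show why ToF needs dedicated silicon: resolving depth to a few millimeters means timing light pulses to tens of picoseconds.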
While Apple analyst Ming-Chi Kuo doesn’t believe the next iPhone will adopt a ToF sensor, using one would permit Apple’s smartphone to recognize hand gestures.
Sony showed several examples using a custom phone with a 3D camera on its rear. In one app, users made specific hand gestures to cast magic spells inside a virtual game. In another, the phone calculated the depth of the room and accurately displayed a virtual goldfish swimming in front of and behind real-life objects.
Aside from phone unlocking, accurate depth sensing is used in portrait photography to separate the subject from the background, in augmented reality applications, to improve focus when shooting at night, and so forth.
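The portrait-mode use case boils down to segmenting by depth: pixels nearer than some cutoff belong to the subject, everything beyond it gets blurred. A toy sketch of that idea, with a hypothetical threshold and a hand-written depth map (real pipelines combine depth with machine-learned matting):

```python
# Toy depth-based subject separation, assuming depth values in meters
# and a hypothetical 1.5 m foreground cutoff.

def subject_mask(depth_map, threshold_m=1.5):
    """Return True for pixels closer than the threshold (the subject),
    False for pixels beyond it (background to be blurred)."""
    return [[d < threshold_m for d in row] for row in depth_map]

depth = [[0.8, 0.9, 3.2],
         [0.7, 1.0, 4.1]]
print(subject_mask(depth))
# The near pixels in the first two columns come out True; the far
# right-hand column comes out False.
```

A longer-range ToF sensor widens the same trick to whole-room scenes, which is what enables the goldfish-style occlusion demo described above.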
Apple could also use this new sensor from Sony in a high-end 8K augmented reality headset that it’s rumored to be working on for an introduction sometime in 2019 or 2020.