Apple patents stylus that scans objects in 3D and simulates textures via haptic feedback

Apple stylus concept Martin Hajek 006

KGI Securities analyst Ming-Chi Kuo is adamant that Apple is working on an advanced stylus accessory for a rumored 12-inch ‘iPad Pro,’ an assumption based in no small part on patent applications Apple has filed in years past.

The latest indication that the iPhone maker is indeed researching stylus technology comes in the form of a new patent application published Thursday by the United States Patent and Trademark Office (USPTO), which proposes an advanced texture-sensing stylus design with a built-in camera.

Titled “Touch implement with haptic feedback for simulating surface texture,” the invention describes a stylus input device with a built-in camera and accompanying electronics.

Not only would the solution allow the accessory to sense contact with a touchscreen, but also to detect the on-screen textures over which it passes. It would then simulate the underlying texture through Taptic Engine-like vibratory feedback.

Apple texture sensing stylus patent drawing 001

An array of sensors, such as contact sensors, capacitive sensors, pressure sensors, photodiodes and onboard cameras, could be used to analyze on-screen textures like paper, wood and glass.

Captured data would be sent wirelessly to a host device via Wi-Fi or Bluetooth. The companion software on a host device would analyze and interpret the data and tell the stylus to produce corresponding vibratory feedback.

Different feedback profiles could be used dynamically, depending on the type of surface texture detected. The system could even produce auditory cues, and haptic feedback would respond to parameters such as writing pressure, angle or orientation.
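To make the described feedback loop concrete, here is a minimal sketch of how a host device might map a detected texture, plus stylus pressure and tilt, to a vibration command. All names, profiles and numbers are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch of the feedback loop the patent describes: the host
# classifies the on-screen texture and maps it, together with stylus
# pressure and tilt, to a vibration profile. Values are illustrative only.

from dataclasses import dataclass

# Illustrative vibration profiles: (frequency in Hz, base amplitude 0..1)
TEXTURE_PROFILES = {
    "paper": (180, 0.2),
    "wood":  (90, 0.5),
    "glass": (250, 0.05),
}

@dataclass
class StylusState:
    texture: str      # texture detected by the host software
    pressure: float   # normalized writing pressure, 0..1
    tilt_deg: float   # stylus angle from vertical, in degrees

def feedback_command(state: StylusState) -> dict:
    """Map detected texture and stylus parameters to a haptic command."""
    freq, base_amp = TEXTURE_PROFILES.get(state.texture, (120, 0.1))
    # Pressing harder strengthens the vibration; a steep tilt softens it.
    amplitude = min(1.0, base_amp * (0.5 + state.pressure))
    amplitude *= max(0.2, 1.0 - state.tilt_deg / 90)
    return {"frequency_hz": freq, "amplitude": round(amplitude, 3)}

cmd = feedback_command(StylusState("wood", pressure=0.8, tilt_deg=30))
print(cmd)  # → {'frequency_hz': 90, 'amplitude': 0.433}
```

In the patent's scheme this command would travel back over Wi-Fi or Bluetooth to the stylus, which would render it through its vibratory actuator.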

Such a texture-sensing stylus would be suitable for a wide range of applications, including gaming, painting and image editing, CAD and 3D modeling, and more. It might even be used to enhance iOS devices’ Accessibility features.

In one embodiment, for instance, Apple explains that visually impaired users might be able to actually feel on-screen images as they move the stylus across the screen.

Apple texture sensing stylus patent drawing 002

Another related patent, titled “Texture capture stylus and method,” details a stylus with a camera embedded in its tip for detecting the physical characteristics of any surface, not just images shown on computer screens. The Livescribe 3 Smartpen, for example, also uses an onboard camera but works only on specialized Livescribe Dot Paper.

Apple’s solution employs an embedded photo sensor to capture light bounced off an object, allowing the device to reproduce a three-dimensional representation of that object, complete with image textures, shapes and colors.

Apple texture sensing stylus patent drawing 003

“The stylus includes an image sensing and capture device to permit a surface to be scanned using the stylus and an image may be stored and displayed on an electronic device to represent the texture of the scanned surface,” reads the document.
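As a rough illustration of how reflected-light samples from the stylus tip could be turned into a stored surface image, the sketch below averages intensity readings into a coarse 2D grid. The sample format, grid resolution and function names are assumptions for illustration, not the patent's actual method:

```python
# Minimal sketch (assumed, not from the patent) of accumulating
# reflected-light samples from a stylus tip into a 2D texture grid
# on the host device. Each sample is (x_mm, y_mm, intensity 0..255).

GRID_MM = 1.0  # hypothetical grid resolution: one cell per millimetre

def build_texture_grid(samples, width_mm, height_mm):
    """Average intensity samples into a coarse grid of the scanned surface."""
    w, h = int(width_mm / GRID_MM), int(height_mm / GRID_MM)
    sums = [[0.0] * w for _ in range(h)]
    counts = [[0] * w for _ in range(h)]
    for x, y, intensity in samples:
        col = min(w - 1, int(x / GRID_MM))
        row = min(h - 1, int(y / GRID_MM))
        sums[row][col] += intensity
        counts[row][col] += 1
    # Cells the stylus never passed over stay at 0 (unscanned).
    return [[sums[r][c] / counts[r][c] if counts[r][c] else 0.0
             for c in range(w)] for r in range(h)]

grid = build_texture_grid([(0.2, 0.1, 200), (0.4, 0.3, 100), (1.5, 0.5, 50)], 2.0, 1.0)
print(grid)  # → [[150.0, 50.0]]
```

Such a grid could then be rendered on the paired iPhone or iPad as a grayscale image representing the scanned surface.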

A similar patent application, filed in December 2014, describes a stylus accessory that would let users write on any surface and have their writing translated onto an iPhone or iPad.

What do you think about the aforementioned inventions? Would pairing haptic feedback with a stylus make sense at all?

Apple stylus concept by 3D artist Martin Hajek.

Source: USPTO