At the end of every year, IBM Research publishes its annual “5 in 5”, basically a list of five technologies that computer scientists believe will make the most impact in the next five years. Some of the past predictions that came true include a 2006 notion that we’d be able to access healthcare remotely and use real-time speech translation. More often than not, however, these technology picks are merely entertaining guidelines and food for thought.
In the 2012 edition, IBM’s research arm calls for smartphones and computers with a sense of touch unlike any you’ve experienced before. Our gizmos will understand images and be more aware of their surroundings while providing a wide range of output via sensory and cognitive technologies, enabling them, for example, to convey the smell or taste of food.
“With all due respect to current technology, our computers today are just large calculators,” IBM’s CTO of Telecom Research Paul Bloom says in the above clip. So, 2017 should be all about cognitive computing. Included after the break: five videos highlighting these interesting predictions…
Here’s how touch will come to life on smartphones, letting you feel the texture of a fabric as devices reproduce physical sensations via vibrations and other technologies.
In the 1970s, when a telephone company encouraged us to “reach out and touch someone,” it had no idea that a few decades later that could be more than a metaphor. Infrared and haptic technologies will enable a smartphone’s touchscreen and vibration capabilities to simulate the physical sensation of touching something. So you could experience the silkiness of that catalog’s Egyptian cotton sheets instead of just relying on some copywriter to convince you.
Computer vision will get a significant boost as we train our devices to turn pictures and videos into features, identifying things like color distribution, texture patterns, and edge and motion information.
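To make the idea of turning images into features concrete, here’s a minimal sketch in plain Python: an intensity histogram (a stand-in for color distribution) and a simple gradient-based edge measure. The tiny synthetic image and all function names are illustrative assumptions, not anything from IBM’s actual pipeline.

```python
def intensity_histogram(image, bins=4, max_val=256):
    """Count how many pixels fall into each intensity bin."""
    hist = [0] * bins
    for row in image:
        for px in row:
            hist[px * bins // max_val] += 1
    return hist

def edge_energy(image):
    """Sum of absolute horizontal and vertical intensity differences.

    A sharp boundary in the image produces a large value, which is the
    crude essence of 'edge information' as a feature.
    """
    energy = 0
    for y in range(len(image)):
        for x in range(len(image[0])):
            if x + 1 < len(image[0]):
                energy += abs(image[y][x + 1] - image[y][x])
            if y + 1 < len(image):
                energy += abs(image[y + 1][x] - image[y][x])
    return energy

# A 4x4 grayscale image with a sharp dark/bright vertical boundary.
img = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]

print(intensity_histogram(img))  # half the pixels dark, half bright
print(edge_energy(img))          # large value: strong vertical edge
```

Real systems would of course use richer descriptors and learned models, but even these two numbers already distinguish a flat image from one with structure.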
In five years, computers will hear what matters, scientists bet. Think devices that not only pick up your voice but also the surrounding noise to better understand context. By detecting patterns and building models to decompose sounds, machines “will be used to predict when a tree might fall or to translate ‘baby talk’ so parents understand if a baby’s fussing indicates hunger, tiredness or pain”.
In five years, your tablet will know what you’d like to eat better than you do. This will require digital taste buds of sorts, and IBM is even calling for dedicated machines capable of experiencing flavor in order to determine the precise chemical structure of food and why people like it.
Smartphones are bound to gain this ability by way of new sensors. Even today, we have dongles that add sensors to your iPhone, like weather sensors or ones that take electrical-conductivity measurements to determine how organic your food is.
The last one should be easy. We already have sensors that can measure the smell of any object. As the technology advances, devices five years from now will be able to “smell potential diseases that feed back into a cognitive system to tell us if they suspect a possible health issue”, IBM predicts.
For more details, check out IBM’s 5 in 5 feature.
Feel free to add your own predictions and place your bets regarding IBM’s five picks.