Imagine a world where computers smell, taste, see, hear and touch. Well, that time is coming. If you saw the computer named Watson take top honors on the TV quiz show Jeopardy!, you’ve already seen a computer that can understand and respond to humans.
In the next age of computing — which we at IBM are calling the era of cognitive systems — hardware and software will gain amazing new human brain-like capabilities to learn, adapt and sense that will fundamentally improve the way people live, work and interact with each other.
Every year IBM makes predictions about five technology innovations that stand to change the way we live within the next five years. This year we’re focusing on the ability of machines to — in their own special way — emulate humans’ five senses.
While that may sound implausible, consider that what is de rigueur today (smartphones, tablets) didn’t exist even a decade ago. Scientists and engineers continue to push advances in a variety of fields that are further improving and broadening the capabilities of computers. Those involving cognitive systems and their potential for advancing numerous aspects of the human experience — from agriculture and healthcare to utility management and weather forecasting — are quite impressive.
Very soon, computers will be able to see … and extract minute data from MRI, CT and X-ray images, or even assess photos of sun spots on the skin. A “seeing” computer’s assessment, analysis and recommendation will be one of the tools a doctor uses to prescribe timely and effective treatment.
Because cognitive systems are good at detecting patterns, utility companies will one day employ vision-oriented sensors to scan photos taken in the aftermath of hurricanes and posted to social media sites. Data gathered by this process may help prioritize the deployment of repair crews to areas where safety and security risks are most acute.
Computers will also be able to hear … and detect the movement within a massive neighborhood oak tree that may foreshadow its impending demise. Alerting an arborist that the tree needs to be trimmed or cut down — before it comes crashing down — could save lives and protect property.
If a computer could “hear” or detect how wind is changing direction during a wildfire, firefighters could pinpoint their next steps in containing the blaze before it spreads further. Cognitive computing will help us hear and understand what is important to us, like the difference between a baby’s cry when it’s hungry versus when it’s seeking comfort.
When computers are able to simulate touch, we will have solved one of the most difficult tasks facing cognitive computing researchers. By its very nature, touch is a physical experience. But with infrared and haptic (tactile feedback) technologies, we have already begun to simulate the feeling of touch — usually vibration patterns associated with physical objects — in the gaming industry.
Soon, online merchants will be using haptic technology to let customers “touch” merchandise before purchasing it. The texture of an article of clothing could be simulated when the shopper is prompted to brush his or her finger over the item’s onscreen photo. Is it silky or rough? Soon, we’ll know.
The senses of taste and smell are often related, especially when it comes to food. Computers will one day be able to surprise us with customized pairings of foods that are designed to maximize our favorite tastes, flavors and textures. They will even be able to suggest the best pairings to minimize hunger and optimize the nutritional values of available foods.
Having a powerful sense of smell can also keep people safer. A computer can be put to work at art museums sniffing out gases, undetectable by the human nose, that can damage priceless works of art — the same innovation that can detect unsafe levels of air pollution on the streets of major cities around the world.
In the era of cognitive computing, machines will no longer be limited to deductive reasoning, or drawing conclusions from more generalized data. They will emulate our ability to use inductive reasoning, which is based on specific, contextual observations. We are on the verge of being able to benefit from computers that can understand our experiences and then take action to create an output that improves the way we live, work and play.
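The contrast between the two reasoning modes can be sketched in a few lines of Python. This is a toy illustration only — the sensor readings, threshold, and midpoint heuristic are invented for the example and are not any IBM method:

```python
# Toy contrast between deductive and inductive reasoning.

# Deduction: start from a general rule and apply it to a specific case.
ALERT_THRESHOLD = 30.0  # hypothetical given rule: readings above 30.0 are anomalies

def deduce(reading):
    """Apply the already-known general rule to one specific reading."""
    return reading > ALERT_THRESHOLD

# Induction: start from specific, contextual observations and infer a rule.
def induce(observations):
    """Learn a threshold as the midpoint between the highest normal
    reading and the lowest anomalous reading observed so far."""
    highest_normal = max(r for r, anomalous in observations if not anomalous)
    lowest_anomalous = min(r for r, anomalous in observations if anomalous)
    return (highest_normal + lowest_anomalous) / 2

# Specific labeled observations (invented data).
history = [(18.0, False), (22.0, False), (35.0, True), (40.0, True)]

learned_threshold = induce(history)   # rule inferred bottom-up -> 28.5
print(deduce(31.0))                   # rule applied top-down  -> True
print(31.0 > learned_threshold)       # learned rule agrees    -> True
```

The deductive path works only if someone hands the system the rule; the inductive path, like the cognitive systems described above, derives the rule from observed context.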
Researchers don’t expect computers to replace human functionality altogether. There was a reason that the Jeopardy-winning computer Watson was named after a real human being and competed with actual humans. The true success of cognitive computing will not be judged by its ability to replace the functions of the human brain; it will be judged by the innovations it unleashes — providing people with a better quality of life and the vital information needed to free up creative problem solving as we tackle our toughest challenges.
Image: woodleywonderworks/flickr
Paul Bloom is the CTO for Telecom Research and is responsible for applying the latest IBM technologies and research from its twelve laboratories to emerging telecom solutions.