Sega’s pet robot iDog: as tech gets smarter, what role does human empathy still have? Photograph: Shizuo Kambayashi/AP

What will become of empathy in a world of smart machines?


Artificial intelligence can only go so far. The future of technology will lie in the interplay of software and capable humans

At a recent conference in Malmö, Sweden, in front of a keynote-sized audience, two speakers presented a strange video. It showed a recently developed cybernetic dog that moved, acted and gestured like the real thing. In an odd twist, people in the video shoved and kicked it.

It reacted, predictably, as a real dog would: moving, rebalancing, even flinching perceptibly. The system was simply determining what movements were necessary to stay upright. But the emotional weight fell entirely on us, the humans.

The entire audience gasped. Then they realised that it was, in fact, odd to gasp. Another kick came. The same thing happened. Everyone felt something vivid and disturbing.

It was one of those strange examples of two worlds blurring, playing out in front of us. Empathy flooded towards what was, in fact, an unfeeling, man-made object.

We’re in a very interesting, in-between moment in technology. We can see the seams on the fastball: the connected home, previously idle devices now able to communicate with one another, artificial intelligence (AI) and the ability to sift data for big correlations. And of course, the virtual assistant embedded within your mobile phone.

But as we increasingly rely on software and interfaces for the recurring things we do every day, it is worth thinking about what empathy means in those interactions, and how the software layer we use will gradually start to become empathetic to our needs.

One glitch in the matrix that portends larger things was revealed in a recent incident with Google’s self-driving car. One of Google’s experimental vehicles was at a four-way stop. A cyclist on a fixed-gear bike arrived just behind it at the intersection. While waiting for the car to go, he performed a track stand, balancing the bike in place by rocking it gently rather than putting a foot down.

In a post on roadbikereview.com, user OxTox said:

“It apparently detected my presence (it’s covered in GoPros) and stayed stationary for several seconds. It finally began to proceed, but as it did, I rolled forward an inch while still standing. The car immediately stopped … I continued to stand, it continued to stay stopped. Then as it began to move again, I had to rock the bike to maintain balance. It stopped abruptly. We repeated this little dance for about two full minutes and the car never made it past the middle of the intersection.”

It was a telling short-circuit. The car didn’t know how to process this behaviour.

We’re seeing baby steps towards this idea of empathy in interactions, with the interface itself, not just its designer, exhibiting empathy. Anyone who has used the language-learning application Duolingo knows that it quickly adjusts to whatever area you need practice in, seemingly on the fly. Rusty on the subjuntivo in Spanish? You’re going to get peppered with it.
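To give a rough flavour of that kind of adaptive behaviour, here is a minimal, purely illustrative sketch in Python. It assumes a hypothetical per-topic record of attempts and mistakes and a made-up helper, next_exercises; it is not Duolingo’s actual algorithm, just one simple way software might tilt practice towards a learner’s weak spots.

```python
import random

# Hypothetical per-topic history: topic -> (attempts, mistakes).
# These numbers are invented purely for illustration.
history = {
    "subjuntivo": (20, 12),
    "preterito": (25, 5),
    "vocabulario": (30, 3),
}

def error_rate(attempts, mistakes):
    # Smoothed mistake rate, so rarely seen topics still get some practice.
    return (mistakes + 1) / (attempts + 2)

def next_exercises(history, n=10):
    # Weighted sampling: topics with higher error rates appear more often.
    topics = list(history)
    weights = [error_rate(*history[t]) for t in topics]
    return random.choices(topics, weights=weights, k=n)

print(next_exercises(history))
# A learner shaky on the subjuntivo will see it peppered through the set.
```

Weighting by a smoothed error rate is only one possible design choice; a real system would likely also account for recency, item difficulty and how quickly learners forget.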

But can machines be expected to be fully empathetic? Signs point to no. It is relatively easy to create a learning brain, but we don’t yet know how to create a heart or a soul. In a recent talk at the New Yorker festival, MIT Media Lab director Joi Ito asserted that “humans are really good at things computers are not.”

So perhaps the future lies not in sensational rise-of-the-machines narratives, but in the meaningful interplay between smarter software and capable humans.

We’re already starting to see this in hospitality, with services such as Alfred, where a human being orchestrates a growing number of on-demand services (Uber, Instacart and so on) to get things done for you quickly and in one place. Some airlines are arming their flight attendants with relevant customer data on mobile devices, allowing them to offer better service and recognise frequent fliers in person.

Machine learning may be the thing that defines “the next wave” of apps and services. It might take the form of a virtual assistant or just really good suggestions. Eventually we may not even notice it, as it becomes part of every major platform and mobile operating system.

But it would also appear that the human touch of warmth, empathy and service still has a very welcome and important place in our lives. And when coupled with the software that’s “eating the world”, it will take us to very interesting – and unexpected – places.

Colin Nagy is executive director at The Barbarian Group. Follow him on Twitter @CJN

