Tech That Tracks Your Every Move Can Be Convenient, Not Creepy

Image: Disney

A new stadium is being built right now in California for the San Francisco 49ers, and it'll provide a level of responsiveness and connectivity never before seen in a sports venue. Fans will have a personalized and largely cashless experience. There will be multiple ways to view closeups and replays. Smartphones will direct fans effortlessly to their seats. There will even be ways to know which bathroom has the shortest line.


That’s just the short-term vision – and it isn’t isolated to one tech-savvy stadium in California. A stadium that knows our food and drink preferences is technically possible now, and Citi Field, where the New York Mets play, is already taking steps in this direction. Spaces like this could become common all over the world as the usual suspects of ubiquitous and invisible computing – sensors, low-energy communications, embedded devices – continue to evolve. As it is, half of all Americans already own a smartphone (devices on which, by the way, time spent on the internet just surpassed time spent on desktops). And in the long term, the vision gets even more personal, bordering on wizardry: just think of Disney’s RFID-equipped Magic Band system, which not only allows users to unlock hotel rooms, pay for meals and souvenirs, and track where they’ve been … but also enables Disney characters to greet kids by name and much more.

This kind of personalization works in places like Disney World and, to a lesser degree, sports stadiums, because both are controlled, contained, seemingly safe environments. We enter them knowing we’ve sacrificed a certain amount of privacy in exchange for a unique experience. In a different context, these same experiences could become deeply creepy – imagine a random hot-dog vendor addressing you by name as you walk down a street in Manhattan. Foursquare already pushes unprompted recommendations to users’ smartphones, creating what some have described as a “piece of magic that lives in your pocket.” For others, though, the experience is simply jarring.

This is why the time has come for interaction designers to get very intentional, not just in how data is guarded, but in how it is collected and what it is used for. If we’re not careful, the future could resemble the scarier parts of Minority Report, where roving droids scan our irises and force non-stop customized ads upon us. Avoiding that future – while embracing the benefits of a seamless experience without violating trust – is just as much a design problem as an ethical one.

We may soon see that these challenges are one and the same, but design helps ground the discussions in a new way. While the ethical aspects can become a politicized, polarizing debate about the tradeoff between privacy and personalization, design helps us navigate the many “shades of control” in between. It helps guide us down the two paths that diverge in the woods of location-specific user experience: the “dumb device, smart environment” approach (exemplified by the Disney Magic Band) and the “smart device, dumb environment” approach (exemplified by the technologies in our pockets or on our bodies).

About the author: Sean Madden is Executive Managing Director at Ziba. Specializing in service design and innovation strategy, he leads multi-disciplinary design teams to create products and services for global Fortune 100 companies. Follow him on Twitter @smadden.


In the dumb device/smart environment approach, the user carries a simple identifying device and relies on interactive elements (like in a Disney park) – and well-trained staff – to do amazing things with the information the device gives them. This approach has the advantage of delivering a more coherent experience because it can be planned and controlled centrally, with minimal effort from the user. But it also demands a lot of faith from customers who have little control over how that information gets used. Smart environments can make people uncomfortable; Disney’s tracking efforts, for instance, have already prompted comparisons with the NSA.

In the smart device/dumb environment approach, the technology we carry – most likely a smartphone or a very capable wearable device – reads the information in an environment full of identifiers (“I’m a product”, “I’m a perimeter marker”, “I’m a point-of-purchase”, etc.), and interprets it using whatever software we’ve designated. This approach has the advantage of being entirely under the user’s control. But it lacks the palpable sense of magic that comes from being immersed in a coherently designed experience: Think of it as walking through a museum with a guidebook in hand, as opposed to being guided by a knowledgeable, passionate docent who knows what art you want to see.
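The smart device/dumb environment model can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s actual API: the `Beacon` class, the `interpret` function, and the `user_optins` filter are all invented names, standing in for an environment that merely broadcasts identifiers while the user’s own software decides, on the device, which ones to act on.

```python
# A minimal sketch (hypothetical names) of "smart device, dumb environment":
# the venue broadcasts simple identifiers; software the user has designated
# decides locally which kinds to interpret, and ignores the rest.

from dataclasses import dataclass

@dataclass
class Beacon:
    kind: str     # e.g. "product", "perimeter", "point-of-purchase"
    payload: str  # opaque identifier broadcast by the environment

def interpret(beacons, user_optins):
    """Act only on beacon kinds the user has explicitly opted in to."""
    actions = []
    for b in beacons:
        if b.kind in user_optins:   # the user-controlled filter
            actions.append(f"handle {b.kind}: {b.payload}")
        # anything else is ignored on-device; nothing leaves the phone
    return actions

venue = [Beacon("product", "jersey-42"),
         Beacon("perimeter", "gate-A"),
         Beacon("point-of-purchase", "stand-7")]

# This user has opted in to product and wayfinding beacons only,
# so point-of-purchase identifiers are never interpreted at all.
print(interpret(venue, user_optins={"product", "perimeter"}))
```

The design point is where the filtering happens: the environment never learns which beacons the device chose to interpret, which is exactly the control this approach trades magic for.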


Eventually, these two paths will converge in a “smart device, smart environment” approach, where the environment has the capacity to offer a coherent experience, yet the user decides how deeply he or she wants to engage. While the dangers of getting this wrong are very real, there can be great benefits on both sides: The provider gets brand loyalty and the ability to accurately target offerings and promotions, while the customer gets greater convenience, more customization, and a bit of magic. We also don’t know yet what’s possible when you give patrons a connected device and put them in a responsive environment. What people build on top of platforms always surprises us and can rarely be predicted.

While the convergence of these environments will provide us with the technical tools to do more, it still doesn’t solve the design problem. The question of whether a user wants to have a relationship with a particular stadium, concert hall, shopping mall or store becomes even more crucial than before. This scenario asks a lot of users, because the kind of relationship we’re describing here – where you tell me your preferences and background, and I change the way I act in order to accommodate them – has historically been limited to human-human interactions.

By bringing this kind of intimacy to a brand relationship, we’re essentially asking people to treat brands and their tools as trusted friends. Let me pause to emphasize that I don’t mean this in the social-media jargon sense: I mean that companies, as entities, become more intimate than even family members.

Unfortunately, large organizations haven’t done a great job over the past few years building this kind of trust where personal data is concerned. From identity theft scandals and compromised data to unscrupulous marketing and revelations of government snooping, we’re being steadily conditioned to see information sharing of any sort as a dangerous prospect. If a family of 49ers fans refused to enter the new stadium out of privacy concerns, we’d be less likely to accuse them of wearing tinfoil hats.


What’s needed now is a careful, intentional approach to design on par with the one that designers and entrepreneurs used to transform the web from a set of arcane technologies into a delightful, indispensable part of commerce and life. It’s folly to focus only on the data, and not on the people it measures.

This means we need to examine each new interaction from the perspective of the user, not just the brand: What is the transaction demanding, in terms of action and information, and what does it offer in return? Again and again we’ve seen that users vary widely in their willingness to relinquish privacy, but they all need to see a clear benefit for doing so.

It also means we must apply the practices of human-to-human interaction to human-to-machine – and human-to-brand – interaction in a more emotional, consistent, predictable way. Instead of legal CYA notices, designers will need to provide active reassurance. Companies like Uber struggle here not because they lack a great or logical idea, but because they don’t try to understand the emotional design needs of users or actively push reassuring information to them at every opportunity. It may sound like an unrealistic standard for a company to live up to the expectations of friends, but brands that ask for intimacy have no choice.

Finally, we have to relinquish control. This may sound strange in light of the Disney example – what environment could be more controlled than a theme park or a cruise ship? – but Disney’s control is limited to what it provides … not what it demands. Every millimeter of the resort is scrupulously crafted, but visitors have ample opportunity to specify the experience they want, the degree of access they wish to grant, and how they want to spend their time once inside.

As retail and event venues become smart environments in the near future, designers will have to make opt-in the norm, rather than opt-out. Designing to nudge patrons toward a behavior means demonstrating its value, not stripping away alternatives. It means signaling to users that “You are free to refuse” – not just in words, but in visual elements, user experience, and more.
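What an opt-in norm looks like in practice can be made concrete with a small sketch. The `ConsentSettings` class and its category names are hypothetical, but the principle is the one argued above: every data-sharing category starts disabled, and only an explicit, affirmative action from the user turns one on.

```python
# A minimal sketch (hypothetical fields) of opt-in as the default:
# refusal is the baseline state, and sharing requires explicit action.

from dataclasses import dataclass

@dataclass
class ConsentSettings:
    # Everything defaults to False: a new patron shares nothing.
    share_location: bool = False
    share_purchases: bool = False
    personalized_offers: bool = False

    def grant(self, category: str) -> None:
        """Record an explicit, affirmative opt-in from the user."""
        if not hasattr(self, category):
            raise ValueError(f"unknown category: {category}")
        setattr(self, category, True)

settings = ConsentSettings()           # no action taken, nothing shared
settings.grant("personalized_offers")  # value demonstrated, user opts in
print(settings)
```

The inverse design – defaulting every field to `True` and asking users to hunt down an opt-out – collects more data in the short term, but it is precisely the pattern that erodes the trust these environments depend on.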

Editor: Sonal Chokshi @smc90