‘In much the same way as we look left and look right before crossing the road, we should pause prior to providing our data without a tacit understanding as to how it is to be used’ Photograph: Guardian Design Team

We need to build a new social contract for the digital age

This article is more than 5 years old

We are conditioned to view data as a threat, but it can be the opposite, if all parties understand the deal into which they are entering

“Raise your hands if you trust Facebook, if you trust Google, if you trust government.” It was spring 2017, and I was leading a debate with young people in Canberra. “Has anybody heard of Cambridge Analytica?” Heads shook.

I explained behavioural communication: how Cambridge Analytica built “psychographics” from Facebook data to measure personality and motivation – what you choose and why you choose it – with the intention of influencing how people vote. The room fell silent; people looked alarmed.

It is only when spiders are seen that they scare us.

Despite half the world being online, with three billion people using social media, there has been little conversation in popular culture about the collection, availability and use of personal data. Tick-box-click-accept conditions are no more than packaging torn off to get to the product; informed consent is confined to the bygone days of people on street corners with clipboards.

The extraordinary exposé in the Guardian helps but this issue is bigger than Facebook and Cambridge Analytica. It’s more than the present, as we approach a data-reliant future of biometric identification, artificial intelligence and the internet of things. It is about us, our relationship with technology, and expectations for society.

Firstly, technology is not the problem. It is a means to an end. A tool that enables us to progress or decline, to empower or to exploit. To give due consideration to future social, economic, political and environmental ramifications, or not.

If only due consideration had been applied prior to the mass production of non-biodegradable plastics – or civil liberties prior to enabling the aggregation of vast quantities of personal data. Yet we drive towards our future with eyes fixed firmly on the rear-view mirror.

Some commentators have declared social media a “new phenomenon” with regulators struggling to keep pace. Nonsense. YouTube is 13 years old. Facebook is 14 years old. Google is 20 years old.

In 1895 there were 14 cars in England. By 1910 there were 100,000. Surely such extraordinary disruption would have delayed appropriate legislation? The Motor Car Act was introduced in 1903.

So why the inaction? In part it is because many view data as intangible and esoteric. I struggle to accept this. Stocks and shares close every news bulletin, yet we cannot hold them in our hand. Enough people bought bitcoin to make the digital currency the fastest growing asset in the world last year. Apparently, “a digital currency in which encryption techniques are used to regulate the generation of units of currency and verify the transfer of funds” is somehow less esoteric, less intangible.

The real problem is two-fold.

Despite its importance, there has been no public information campaign on data. Even the former Australian attorney general struggled to explain metadata when making the case for greater access to it. There has been no campaign on the different types of data as defined by the Open Data Institute: closed, shared, and open. Not all data is the same.

Nor is there a notion of data being co-created. The debate on data ownership will only improve when we turn to each other and realise multiple parties are involved in its creation.

Instead we are conditioned to view data as a threat. It is the breach, the cyber-attack, the tool of the thought police. It is something we give to government for a driving licence, which they then use for surveillance, without a pause to ask us if this is OK. So society’s response to data is to put up walls, to control, to close, to #deletefacebook. Yet paradoxically, data can be viewed as the opposite: an asset, an opportunity, co-created with walls removed, datasets connected, analysis undertaken, and lives improved. The starting point for data must first be to try to make it open.

Secondly, our system of governance is slowly evolving with power being distributed from the hands of the few to the hands of the many – vertical to horizontal. The trust that once sustained traditional hierarchies is gradually being transferred to each other, and the tools to publish and promote views are literally in our hands. We have the power to leave the European Union. We have the power to elect Donald Trump. Yet we do not have the governance in place to ensure these tools are fit for purpose and free from manipulation.

As wearables, wristbands and smartphones become more common, it will be individuals who hold health data, which collectively could improve our health system beyond recognition; it will be individuals who hold mobility data, to help plan our cities in real time. Yet without structures and related obligations in place, it is likely this data will be siloed for commercial or other reasons, and not for a broader societal good. This is why there must be complete societal reform, and why we need a new social contract.

Plato, Socrates, Rousseau, Hobbes, Locke – great philosophers understood the moral and political obligations, written and unwritten rules, that form a society in the real world. Yet there are obligations in the digital world too. Hobbes in particular hypothesised about an anarchic life prior to social order, a “state of nature”, where the freedom to take meant others could take what you had, leading to fear and distrust. Today we are living through a “state of data”.

The lack of moral obligations surrounding the use of personal data is emphasised by Cambridge Analytica showcasing its work at conferences, and by world leaders retrospectively scrambling to distance themselves from the firm.

Yet it’s not too late to build a new social contract for the digital age to cover rights, responsibilities and expectations – explicit rules seeking to protect and empower all, like the European Union’s General Data Protection Regulation coming into force in May. Implicit rules based on better understanding that would enable us to respect each other’s digital space in much the same way we tacitly respect personal space. Where all parties fully understand the value of data and the deal into which they are entering.

We need a social contract that enables us to knowingly contribute our data to something bigger than ourselves, where political, industry and civic leaders maintain the debate or support a campaign on data awareness long after the dust has settled on Facebook and Cambridge Analytica.

In much the same way as we look left and look right before crossing the road, we should pause prior to providing our data without a tacit understanding as to how it is to be used.

This is the conversation the world has to have as the value of data is intrinsically linked to the value we place in each other. Only a new social contract that encompasses digital will enable us to fulfil its potential and expand the definition of us, strengthen democracy, and ultimately improve lives.

Kevin Keith is a writer, speaker, urbanist, and company director of GovHack
