The European court of justice has "also decided that search engines don't qualify for a 'journalistic exception'". Photograph: Eduardo Munoz/Reuters

We need to talk about the right to be forgotten

After the European court ruling, we at Google want to encourage debate on where the public interest lies in restricting web searches

When you search online there's an unwritten assumption that you'll get an instant answer, as well as additional information if you need to dig deeper. This is all possible because of two decades' worth of investment and innovation by many different companies. Today, however, search engines across Europe face a new challenge – one we've had just two months to get our heads around. That challenge is figuring out what information we must deliberately omit from our results, following a ruling from the European Union's court of justice.

In the past we've restricted the removals we make from search to a very short list. It includes information deemed illegal by a court (such as defamation), pirated content (once we're notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (such as material that glorifies Nazism in Germany).

We've taken this approach because, as article 19 of the Universal Declaration of Human Rights states: "Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."

But the European court found that people have the right to ask for information to be removed from search results that include their names if it is "inadequate, irrelevant or no longer relevant, or excessive". In deciding what to remove, search engines must also have regard to the public interest. These are, of course, very vague and subjective tests.

The court also decided that search engines don't qualify for a "journalistic exception". This means that the Guardian could have an article on its website about an individual that's perfectly legal, but we might not legally be able to show links to it in our results when you search for that person's name. It's a bit like saying the book can stay in the library but cannot be included in the library's card catalogue.

It's for these reasons that we disagree with the ruling. That said, we obviously respect the court's authority and are doing our very best to comply quickly and responsibly. It's a huge task, as we've had over 70,000 take-down requests covering 250,000 web pages since May. So we now have a team of people reviewing each application individually, in most cases with limited information and almost no context.

The examples we've seen so far highlight the difficult value judgments search engines and European society now face: former politicians wanting posts removed that criticise their policies in office; serious, violent criminals asking for articles about their crimes to be deleted; bad reviews for professionals like architects and teachers; comments that people have written themselves (and now regret). In each case someone wants the information hidden, while others might argue that it should be out in the open.

When it comes to determining what's in the public interest, we're taking into account a number of factors. These include whether the information relates to a politician, celebrity or other public figure; if the material comes from a reputable news source, and how recent it is; whether it involves political speech; questions of professional conduct that might be relevant to consumers; the involvement of criminal convictions that are not yet "spent"; and if the information is being published by a government. But these will always be difficult and debatable judgments.

We're also doing our best to be transparent about removals: for instance, we're informing websites when one of their pages has been removed. But we cannot be specific about why we have removed the information, because that could violate an individual's privacy rights under the court's decision.

Of course, only two months in, our process is still very much a work in progress. It's why we incorrectly removed links to some articles last week (they've since been reinstated). But the good news is that the ongoing, active debate that's happening will inform the development of our principles, policies and practices – in particular about how to balance one person's right to privacy with another's right to know.

That's why we have also set up an advisory council of experts, the final membership of which we are announcing tomorrow. These external experts from the worlds of academia, the media, data protection, civil society and the tech sector are serving as independent advisers to Google. The council will be asking for evidence and recommendations from different groups, and will hold public meetings this autumn across Europe to examine these issues more deeply.

The experts' public report will include recommendations for particularly difficult removal requests (such as criminal convictions); thoughts on the implications of the court's decision for European internet users, news publishers, search engines and others; and procedural steps that could improve accountability and transparency for websites and citizens.

The issues at stake here are important and difficult, but we're committed to complying with the court's decision. Indeed, it's hard not to empathise with some of the requests that we've seen – from the man who asked that we do not show a news article saying that he had been questioned in connection with a crime (he's able to demonstrate that he was never charged) to the mother who requested that we remove news articles for her daughter's name as she had been the victim of abuse.

It's a complex issue, with no easy answers. So a robust debate is both welcome and necessary as, on this issue at least, no search engine has an instant or perfect answer.
