
Here is how Google handles Right To Be Forgotten requests

Software engineer? Lawyer? Not a lawyer, even? Sure, have a go

RTBF trial Google allows software engineers, as well as its dedicated Right To Be Forgotten (RTBF) operatives, to decide which search results ought to be deleted on request – and logs such requests on its internal bug-handling systems.

Behind-the-scenes details of how the "online advertising technology" company handles RTBF requests emerged during the Right To Be Forgotten trials at the High Court in London, England.

RTBF requests are made by people who want search results about them deleted, or "delisted" in the legal jargon. The "right" was created by an EU court in 2014 after an aggrieved Spanish man demanded Google delete search results that highlighted his past misdeeds (PDF).

Once you submit an RTBF form via Google's website, it disappears into a cloudy haze until you get a reply saying "yea" or "nay", as ad-tech-aligned blog Search Engine Land sets out at unnecessary length.

Today The Register can reveal not only the processes Google uses internally, but also what happened in two real-world RTBF requests, as made by pseudonymous businessmen NT1 and NT2, who are both suing Google to demand the deletion of search results that mention their past criminal convictions.

No magical AI-machine learning nonsense here: humans only

Witness statements submitted by Google "legal specialist" Stephanie Caro (who admitted: "I am not by training a lawyer") for both trials explained: "The process of dealing with each delisting request is not automated – it involves individual consideration of each request and involves human judgement. Without such an individual assessment, the procedure put in place by Google would be open to substantial abuse, with the prospect of individuals, or indeed businesses, seeking to suppress search results for illegitimate reasons."

That "individual consideration", according to Caro's statement, involves people within the ad tech company looking at claims made by RTBF-ers and deciding whether or not to delete search results. Things that Google takes into account are "the nature of the offence, the sentence imposed, the time which has passed since the conviction, and the relevance of the information to the requestor's business or professional life".

The Chocolate Factory also takes into account "the type of website at which the information/URL appears". If a government of a European Economic Area member country has named you online as a convicted criminal, hard luck.

But don't fall into the trap of thinking that there are clear rules within Google for carrying out this exercise. According to Caro, who was involved in drawing up Google's processes for handling RTBF requests, "the adoption and use of prescriptive rules would allow no room for the exercise of judgment in the balancing of competing considerations and legal rights".

Google also has lawyers on standby to scrutinise RTBF requests. Just in case those definitely-not-prescriptive rules aren't clear enough, you see.

How long does it take? What does Google actually do?

Who within Google reads right-to-be-forgotten requests and decides what to do with them?

In the cases of both NT1 and NT2, their requests were logged on Google's internal case management and correspondence tool, imaginatively named Cases. Once the request is on Cases, someone from Google's "removals team" starts pondering the info you submitted.

Yet that isn't all that happens. "Certain matters were also considered outside the removals team" for both NT1 and NT2.

NT2 had skipped the web form and sicced his lawyers, notorious London attack dogs Carter-Ruck, straight onto Google via a letter, effectively short-circuiting Google's internal RTBF process. This caused his request to be immediately escalated to "a qualified lawyer and Product Counsel" who then replied directly to Carter-Ruck.

NT1, meanwhile, used the form.

Over a period of three months, various Googlers batted NT1's request back and forth. Members of the Legal Removals team looked at it. They sent the request to software engineers, who logged it as a bug on Google's internal Buganizer system. Caro explained: "For convenience, Google also uses the Buganizer tool as a way of managing workflow in relation to some referrals outside the removals team," noting that "the referrals are therefore not, strictly speaking, 'bugs'."
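If you're wondering what tracking a "not, strictly speaking, a bug" looks like in practice, here's a minimal sketch of a delisting request as a workflow ticket. To be clear, this is purely illustrative: every name in it (DelistingRequest, Status, refer_outside and so on) is invented for the example, and it assumes nothing about how Cases or Buganizer actually work inside Google.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    """Hypothetical ticket states for a delisting request."""
    NEW = auto()
    UNDER_REVIEW = auto()   # removals team weighing the request
    REFERRED = auto()       # escalated outside the removals team
    APPROVED = auto()
    REJECTED = auto()

@dataclass
class DelistingRequest:
    requester: str
    urls: list[str]
    status: Status = Status.NEW
    notes: list[str] = field(default_factory=list)

    def refer_outside(self, note: str) -> None:
        # Loosely mirrors the referrals Google parks on its
        # bug tracker "for convenience" of managing workflow.
        self.status = Status.REFERRED
        self.notes.append(note)

# Usage: log a request, then escalate it for wider review.
req = DelistingRequest("NT1", ["https://example.com/old-news"])
req.refer_outside("needs review outside removals team")
```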

Once two of the URLs that NT1 asked to be deleted from Google search (including one from a government website) had been approved for deletion by someone in the Legal Removals team, the "bug" was passed on to two people in Google's Trust and Safety team. That team's function, Caro wrote, was "ensuring online safety by fighting web abuse and fraud" but they had been drafted in to help because of a high volume of takedown demands, "assisting the engineering team in reviewing [RTBF] referrals outside the removals team".

Not all went to plan, however. The Buganizer record showed that the government site search result that NT1 wanted removed was knocked back to the Legal Removals team by a software engineer. No further information about the URL was recorded on the Buganizer system, according to Caro's statement, suggesting the Legal Removals team "appears to have agreed with" the engineer that it ought not to be deleted.

Moreover, when Legal Removals got round to deciding two more search results should be deleted, another Google software engineer "assigned the bug to himself" before handing it straight back to Legal Removals with a comment on the webpage's contents: "serious professional wrongdoing". Three weeks later the bug was closed without further action – and Google wrote to NT1 telling him it had not delisted that particular result.

NT1 v Google Inc and NT2 v Google Inc have both concluded. Judgment in both cases is expected at the end of this month. ®
