Three years ago, the European Court of Justice handed down its judgment in the Google Spain case, which established the so-called ‘right to be forgotten.’ This right enables individuals to require search engines to remove irrelevant search results for searches on their name.
I have co-authored several articles and a book chapter on the right to be forgotten. One of these articles deals with how the right to be forgotten has been received in the Netherlands. Frederik Zuiderveen Borgesius and I may have to write an update, since some uncertainty has arisen in Dutch case law about how the right to be forgotten applies in cases concerning sensitive personal data. The Dutch Supreme Court has also recently ruled on the right to be forgotten. In its decision, the Supreme Court essentially held that the right to privacy, as a rule, overrides the interests of the search engine and the interests of internet users searching for information. This ruling is in effect a restatement of what the European Court of Justice had already held in its Google Spain decision.
Many Dutch individuals have exercised their right to be forgotten. In the Netherlands, Google has received more than 32,500 requests regarding almost 114,000 URLs, and it has removed about 46% of those URLs. The reason for writing this blog is that the Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has released new data about the cases it has considered. The newly released data concerns cases in which the search engine refused to delist a search result and the DPA acted as a mediator.
If a search engine operator refuses to delist a search result, individuals can ask the Dutch DPA to act as a mediator. Since the Google Spain decision, the Dutch DPA has received requests from 155 individuals. Although that may seem like a high number, these people represent only a fraction of those whose removal requests have been denied by a search engine.
In 70 cases, the Dutch DPA decided not to mediate, because (1) the search engine did not clearly violate Dutch data protection law, (2) the facts of the case were unclear, or (3) there were ongoing legal proceedings.
In 52 cases, the Dutch DPA did act as a mediator between Google and an individual, and in two cases it mediated between Microsoft Bing and an individual. In 37 of those cases, search results were eventually removed. In 14 cases, Google stood by its decision not to remove the search results. One case relating to Bing is still pending. If you crunch the numbers, you will find that in about 24% of the 155 cases, the search engine (Google) reversed its decision not to remove a particular result.
The report also contains a list of categories of cases in which the DPA decided not to mediate. In these cases, the Dutch DPA thus seems to agree with the search engine’s refusal to remove the results. This is an interesting list, because it shows where the Dutch DPA draws the line, but also how some may try to use (or misuse) the right to be forgotten in an attempt to conceal inconvenient information. The DPA did not mediate in cases concerning:
- Controversial statements by politicians, including former politicians, if the information was no more than four years old, or older if the search results related to still-relevant political history.
- The behavior of top executives or people with high managerial responsibilities in companies or organizations, as well as accounts of the wealth of very rich people (provided that the information was not obviously false or factually incorrect).
- (Medical) disciplinary sanctions and convictions for serious criminal offenses.
- Very complex cases in which proceedings are still ongoing and which deal with allegations, convictions, and punishment of criminal conduct or fraud by individuals playing a role in public life, and in which the Dutch DPA was unable to assess whether the information in question was accurate.
- Inconvenient, but not incorrect or outdated, information about people playing a role in public life, especially information that these people have made public themselves.
- Defamatory information, provided that the information was not obviously false or untrue.
- Search results that appear when someone searches for a word, an address, or a telephone number.
What can we tell from this list? Well, a couple of things.
Firstly, opponents of the right to be forgotten have often pointed out that the right can be misused to remove relevant information from the public eye. Based on this list, we can assume that some Dutch politicians or former politicians have tried to use the right to be forgotten to remove inconvenient information about themselves. This is of course not what the right to be forgotten was created for. As search engines bear primary responsibility for removal decisions, we can only hope that they make those decisions with care. The list released by the Dutch DPA underscores this point.
Secondly, the Dutch DPA gives an overview in its report of the body of case law concerning sensitive data, however inconsistent that case law may be. The Dutch DPA does not take a clear stance on whether removal requests concerning sensitive data should be treated differently. However, from the list above it is clear that the Dutch DPA is of the opinion that information about serious criminal convictions, which could be considered sensitive data, should not be delisted when people search for a criminal’s name. This seems to suggest that the Dutch DPA does not treat information about crimes any differently from non-sensitive personal data.
And finally, the Dutch DPA does not want to be involved in assessing whether a particular piece of information is defamatory. Determinations about the defamatory nature of information are essentially left to the courts. Such information should not be removed on the basis of the right to be forgotten unless it is very clear that the information is incorrect.
The Dutch DPA’s report is available here (in Dutch).
This contribution has been crossposted at Stefan Kulk’s blog.