Court rules on search engines and the right to be forgotten
The CJEU today handed down judgment in C-131/12 Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González. The ruling, on the application of the Data Protection Directive (Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data) to search engines, has potentially far-reaching consequences for how individuals’ personal data is treated online.
Are Google’s activities covered by the Data Protection Directive?
The court found the activities that Google carries out to be covered by the wide definition of data processing under Article 2 of the Directive, which includes, “whether by automatic means or not”, such activities as the collection, organisation, storage and dissemination of data. This was so even though Google does not alter the data, does not differentiate between personal data and other data, and does not control the underlying data.
Google was also found to be a “data controller” even though it was not the original publisher of the data. In respect of the processing it carries out, it determines the purposes and means of that processing, and therefore is a controller. Joint data controllers are contemplated by the Directive, so the fact that the originating webpage may also be a data controller has no bearing on Google’s position.
Google argued that, as the processing of data for the search engine was carried on outside the EU and its Spanish subsidiary only sold advertising space, the Directive did not apply to the processing. However, the test in the Directive is whether the processing is carried out “in the context of the activities of an establishment in the territory of a Member State”. The court found that the selling of advertising that appears alongside search results is inextricably linked to the search results, and therefore the test was fulfilled. This is a different test from that applicable in defamation and misuse of private information cases, where the test is whether the search engine is a “publisher”.
Removal of data
Data subjects have a right to request that data which is inaccurate, irrelevant or which has become irrelevant be removed by the data controller. In assessing whether to do so, a balancing test must be carried out between the rights of the data subject (privacy, data protection), those of the data controller (economic interests) and the interests of internet users (being able to have access to the information, including the right to receive information under Art 10 ECHR). The court did not consider these opposing rights in any detail, offering little analysis of what weight should be given to the public’s right to access complete records online. Rather, the court found that in general the rights of the data subject will override the economic interests of the search engine and the interests of the general public. However, this will not be the case where there are particular reasons in play – for example, where the individual in question plays a role in public life such that the right of the general public to access information about him or her outweighs the individual’s rights.
Material may have to be removed by a search engine even where (as in the case before the court) the source webpage had published it legally. Material may also become irrelevant with the passage of time, and therefore fall to be removed on that basis.
Data subjects who want to request the removal of personal data which in their view is inaccurate or irrelevant should make the request of the search engine (and of any other data controller from whom they wish to have the data removed). If the matter is not resolved, the data subject may then complain to their national data protection authority – in the case of the UK, the Information Commissioner.
The issue of what should be removed, and in particular when the public interest instead requires that material be left up, will doubtless be the subject of much argument. How search engines will respond to the ruling in practical terms also remains to be seen. It is clear that data protection is becoming an important tool through which individuals can seek to protect their privacy rights.
The court appeared greatly concerned about the additional impact that search results can have on an individual’s privacy and data protection rights as compared to the webpages themselves. The court took the view that because aggregation by a search engine makes the data much more easily accessible, any interference with a data subject’s privacy rights by the search engine is much more significant than that by the publishing webpage. Whether a domestic court would have taken the same view is open to question. Whether search engines should be fixed with responsibility for the results they display is an ongoing issue not only in data protection, but also in privacy and defamation (amongst other fields).
There is a new data protection regulation currently under negotiation at EU level. No doubt there will now be calls for this to address both the liability of search engines and the so-called “right to be forgotten” directly.
The ruling has received a mixed reaction from commentators. Google has said that it is disappointed with the ruling and regards it as censorship of the internet. Some have welcomed the recognition of the “right to be forgotten”. Others have expressed concerns about the effect the ruling may have on free speech and internet freedom, and whether it will result in different search results being displayed inside and outside the EU.