Court defines its territorial reach and obliges Google to remove “old” unwanted personal data from web search results
In a landmark ruling on 13 May 2014,1 the Court of Justice of the European Union (“CJEU”), the EU’s highest court, sitting as a “grand chamber” of 13 judges, found that Google was obliged to remove from its search engine databases the personal data of an individual even though that information was accurate and not “prejudicial.” The decision will have wide-reaching ramifications not only for search engines but also for other businesses operating on the internet.
In short, the CJEU found that Google Inc. (“Google U.S.”):
Was subject to Spanish data protection law, despite it being a US company with no direct presence in Spain, because it had a sales and marketing affiliate operating in Spain and the activities of those two entities were sufficiently close;
Was “processing” personal data in collecting information about individuals and providing search results in response to searches on names; and
Was therefore subject to EU rules on “erasure” and “cessation” of processing, with the result that individuals have the right to require Google to remove personal data about them which is old (even if true) or is otherwise incompatible with European data protection requirements.
The case must also be considered in the light of proposals to enshrine a to-be-defined “Right to be Forgotten” within wider reform of EU data protection law. In its surprising judgment, the CJEU has in effect stated that that right already exists.
Unusually, the CJEU did not follow the advice of its Advocate General, who in 2013 gave an opinion favorable to Google.
The Spanish data protection law that was at issue here stems from the European data protection directive (the “Directive”) and the judgment will therefore have relevance throughout Europe.
The judgment considered a number of fundamental concepts which appear in the Directive, some for the first time (at this level of court). What is meant by “processing” personal data? Who in the context of a search engine is a “data controller”? When is a data controller “established” within a European member state?
The Factual Background
In 1998, a Spanish newspaper, La Vanguardia, published two articles that mentioned a Mr. Costeja González. The articles concerned proceedings for the recovery of social security debts. A Google search against Mr. González’s name resulted in links to archived copies of these two articles being displayed. In 2010, Mr. González lodged a complaint with the Spanish data protection authority, the Agencia Española de Protección de Datos (AEPD), against both the newspaper and against Google U.S. (the operator of the search engine) and its Spanish subsidiary (“Google Spain”). Mr. González said that the proceedings concerning him had been fully resolved and the references to them were now irrelevant.
He requested that the newspaper be required to remove or alter those pages so that the personal data relating to him no longer appeared. He also said that Google should be required to remove or conceal the personal data relating to him so that they ceased to be included in the search results and no longer appeared in the links to La Vanguardia.
The AEPD rejected the complaint in so far as it related to La Vanguardia, taking the view that the publication by it of the information in question was legally justified.
But the AEPD upheld the complaint against Google and ordered the removal of the data. Google appealed through the Spanish courts, which referred to the CJEU certain questions as to how certain fundamental provisions of the Directive should be interpreted. Now that the CJEU has answered these questions, the Spanish courts can give a final judgment on the particular matter in front of them.
Was Google “processing” personal data and was it a “controller?”
The first question that arose was whether the activity of a search engine (finding information containing personal data published or placed on the internet, indexing it automatically, storing it and then making it available in response to search requests) was properly classified as “processing” of personal data within the meaning of the Directive.
Google argued that since all information is indexed without distinction between “personal data” and other information, there was no “processing.” Moreover, since Google had no knowledge of the data, even if it was processing personal data, it did not have enough “control” to be a “controller.”
The CJEU disagreed. The definition of “processing” in the Directive includes the “collection” of personal data as well as any subsequent “retrieval,” “disclosure” or “making available” of those data. These are acts which, according to the CJEU, Google carries out when operating its search functionality. The facts that it also carries out these activities for non-personal information and that the personal information was already available on the internet were irrelevant.
Moreover, Google was a controller. The Directive defines “controller” as the person “which alone or jointly with others determines the purposes and means of the processing of personal data.” The Court felt that these words were clear and that here it was Google that “controlled” the data by building its search database.
The Google search service is offered through a number of domain names throughout the world. Although the Google group has a Spanish member, even the Spanish language version of the search engine (www.google.es) is operated by Google U.S.
Google is funded through a range of advertising services which advertisers can utilize (sponsored advertising, keyword acquisition and so on). Google Spain promotes and effects the sale of these on-line advertising products and services which fund the search service.
Article 4 of the Directive sets out when a member state’s laws are to apply. The provision under discussion in this case was the requirement that, for the member state’s law to apply to an act of processing, the processing must be:
“carried out in the context of the activities of an establishment of the controller on the territory of the Member State.”
The issue that arose therefore, since it was Google U.S. that carried on the search function, was whether the existence of Google Spain (and its role as described) constituted an “establishment” in the context of which the search function is carried out.
Motivated by a policy desire to placate European privacy concerns, the CJEU was in favor of a broad territorial scope and interpretation of this important provision.
The CJEU found that Google Spain was certainly in itself “established” in Spain. The only real issue was whether that then meant that the “processing” carried out by Google U.S. (in compiling and operating its search engine) was “carried out in the context of the activities” of Google Spain’s establishment in Spain. It was important to note that the jurisdictional test did not require the processing to be carried out “by” the establishment concerned (i.e. Google Spain), but only that it be carried out “in the context of the activities” of that establishment.
Since Google Spain promotes and sells in Spain advertising space offered by Google U.S., the activities of the two were “inextricably linked” (not only does one fund the other but the displays of the search requests served up by Google U.S. are accompanied by advertising sold by Google Spain).
Google U.S.’s processing therefore was subject to the jurisdiction of Spanish data protection law (and by extension of the data protection law in all other member states of the EU in which Google has an affiliate undertaking similar activities to Google Spain).
The Right to be Forgotten
Having established the jurisdictional requirements, the CJEU then went on to consider the most difficult and controversial issue: whether Mr. González could insist upon the removal of the personal data shown up in search results.
There were two relevant articles in the Directive.
First, a right of correction or erasure. Article 12(b) of the Directive provides that the individual has the right to obtain from the data controller “as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data.”
Then, a right to object to processing. Article 14(a) of the Directive provides that the individual has (in relevant circumstances) the right “to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data.”
Google (and on this point, interestingly, the European Commission agreed with Google) argued that the rights set out in Article 12(b) and Article 14(a) of the Directive only applied if the processing was incompatible with the Directive or there were compelling legitimate grounds. These rights did not arise merely because an individual might “consider that that processing may be prejudicial to them or they wish that the data being processed sink into oblivion.”
The right to require rectification arises, noted the CJEU, not only when data are inaccurate but also when they are “inadequate, irrelevant or excessive in relation to the purposes of the processing, that they are not kept up to date, or that they are kept for longer than is necessary” (this principle is sometimes referred to as the “data quality principle”). It followed from those requirements that even initially lawful processing of accurate data may, in the course of time, become incompatible with the individual’s rights. Applying this principle, if a search against a name displays links to web pages lawfully published by third parties and containing true information which, at the time of the search, is no longer compatible with the data quality principle, the information and links concerned must be erased from the list of results.
The data quality principle in play here overrode not only the economic interest of Google but also the interest of the general public in finding that information upon a search against the individual’s name. However, the court noted that that would not always be the case. For example, where the individual is a public figure, and there might be a public interest in wider dissemination, then that individual’s rights might not override the other interests at play.
In this case, the CJEU found that, given the sensitivity of the information contained in the archived press stories for Mr. González’s private life, and the fact that its initial publication had taken place 16 years earlier, Mr. González had established a right that that information should no longer be linked to his name by means of such a list. Since in this case there did not appear to be a particular public interest reason requiring wide access to that information, Google should be required to remove those links from the list of results.
The Right to be Forgotten – The role of the “publisher”
The Court dealt with another point that Google had made. Google had argued that any request seeking the removal of information should be addressed to the operator of the website concerned. This publisher was responsible for making the information public and was best able to assess the lawfulness of publication and to take the information down from public view.
The Court felt strongly that it was not appropriate to require the actual publisher of the linked-to website to remove offending material before Google could be approached. Firstly, the language of the Directive did not support this. Secondly, it is possible that that website (although not in this particular case) would not itself be subject to European jurisdiction. Thirdly, the publishing website might actually be subject to an exemption (e.g. that which applies to journalists). Finally, the relevant Directive articles require an assessment of competing interests and it is possible that such an assessment might be adjudged differently when the data controller is the search engine from when the data controller is the linked-to website; the search engine is likely to constitute a more significant interference with privacy than the publication on the web page (which might not otherwise be easily found).
In short, it is possible for the rights of erasure/correction to be available as against a search engine but not available as against the original publisher. As such there was no requirement to make successful requests against the operator of linked-to pages before making them against Google.
Although this will not be good news for American companies operating in Europe, the confirmation of the territorial reach of the Directive on the basis of an establishment of a local subsidiary of Google is consistent with the approach of regulators in recent times. It is not only search engines that are faced with this issue; any internet business which has an advertising arm within Europe used to fund that business, in a similar manner to Google, would be within this reach (social media services, for example). However, this aspect of the decision may be even more far-reaching. Whilst it is clear that there must be some connection between the activity of the European entity and the activity of the non-European entity (here it was one funding the other), it is not too hard to see the logic being applied by European regulators to other facts and circumstances to grab jurisdiction.
The finding on the “right to be forgotten” will be very far-reaching and the press is already criticizing it. It is hard to disagree with the criticisms which in essence make two points: (i) it is impractical to require Google (or other search engines) to make these types of “corrections” or “erasures” and (ii) little weight has apparently been given to the rights of freedom of expression and the rights of the press. To many commentators, especially in the U.S., this case will be seen as demonstrating a widely held European view that speech and communications are subject to too great a level of control. This can be contrasted with the strongly enshrined American views on freedom of speech. As long ago as 1927, Justice Brandeis in the US Supreme Court said that, when facing the evils speech can cause, “the remedy to be applied is more speech, not enforced silence."2
In effect, this case means that search engines, and other web-based businesses, will be forced to take on the functions of a publisher, analyzing the content of the materials to which they give access in order to respond to requests for erasure/correction.
Although the submissions in the case are not made public, the judgment makes clear that whilst the European Commission sided with Mr. González and the Spanish authority on the issue of jurisdiction (territorial scope and “processing” and “controller”), it sided with Google in relation to this substantive point. The European Commission would not have extended a “right to be forgotten” to these circumstances. Likewise, the earlier Advocate General (AG) opinion, which the court did not follow, weighed in favor of Google on both these grounds.
The legislation did allow some leeway to the CJEU to find otherwise than it did (as the AG opinion makes clear). The Articles of the Directive applied here contain language such as “appropriate,” “justified objection” and an assessment of whether the “legitimate interests of the controller” were “overridden” by the rights of an individual, all of which could have been used to determine in favor of Google. Arguably, and perhaps most importantly, the Court gave no real regard, in considering these fundamental questions, to the freedom of expression.
However, there is no scope for further appeal and Google is likely now to be faced with (literally) thousands of requests to remove information from its search results. This is all the more significant in light of the fact that, as mentioned above, European data protection is in the process of being reformed -- and sanctions for violation of European data rules are likely to be substantially increased.3 The proposed Regulation contained a statutory right to be forgotten. Whilst it appeared at some point as if this might be quietly dropped (certainly, the UK was very critical of it), perhaps now there will be more of a debate (given the CJEU’s views as to its existence anyway) on the detail of the scope and on what exemptions should apply.
Lastly, it is early days, but the judgment already raises many unanswered questions. Here are just a few:
If an entity like Google U.S. is a controller in relation to its European activities, should that entity also be considered a controller in each European jurisdiction in which it has an entity carrying out functions similar to Google Spain's? Can an individual then forum shop for the most assertive regulator or amenable court?
Information that is 16 years old is too old, says the CJEU, but where is the cut-off? The answer is likely to be that this needs to be assessed on a case-by-case basis, but given the immense difficulty this would impose for search engines, whose very model is based on not reviewing content, a default and cautious response to the judgment might be for the search engine always to remove following any request.
If information is too “old,” then arguably it shouldn’t be processed at all (according to the “data quality principle”), but then hasn’t there already been a breach of that principle without even receiving a removal request?
More generally, if a search engine is a data controller, there are many more data protection requirements. For example, it needs to respond to “subject access requests” and it should provide certain information to the individual stating that data is being processed (Article 11 of the Directive). How does a search engine do all this?
Unless superseded by new legislative initiatives in the form of a Regulation or other revision to data protection law, it can be expected that search engines’ obligations under data protection law will reach the CJEU again before too long.