Are the days of unfettered harvesting and processing of personal data over?

Each year the International Bar Association runs an essay competition for junior advocates. Chambers encourages its pupils to have a go at the essay for the ‘Intellectual Property, Communications and Technology Law’ Section.

This year the question to be addressed was “Are the days of unfettered harvesting and processing of personal data over?” This wasn’t an area in which I had any litigation experience, but nonetheless it was an enjoyable challenge researching and writing an essay on what is a bit of a zeitgeist topic.

Sadly my entry wasn’t successful (to this day I don’t think I’ve ever even won a TV phone-in), but I reproduce it below in case it is of interest. If you would like a PDF copy, please do let me know. A lot has happened since this was written just a few months ago, but many of the conclusions about the joint effects of competition law and data protection remain solid.

This essay considers whether the recent introduction of the European General Data Protection Regulation has triggered a worldwide movement away from unfettered harvesting and processing of personal data.

Evidence indicates that consent for harvesting and processing activities is readily obtained and that even companies with poor privacy reputations are able to thrive. However, as our common understanding of the economic importance of personal data has improved, the potentially anti-competitive consequences of data control have been recognised.

Competition law acting in tandem with data protection law will provide tools by which consumers and regulators can control the harvesting and processing of data. This will counterintuitively allow new opportunities for the processing and sharing of personal data between consumers and companies. This new open data economy will be beneficial to consumers, if not to the incumbent technology companies, who may consider their activities fettered.

Each of us spends our entire day creating data. In the sensor-rich environment in which we live, the amount of data we produce is growing exponentially: every contactless payment, every social media post, every web search. Every minute of every day we are producing data, and the number of ways in which it can be harvested and processed continues to grow.

The European General Data Protection Regulation (“GDPR”) came into effect on 25th May 2018, promising a new era of consent-based data control. Other countries have begun to introduce similar regimes, for example the Brazilian General Data Protection Law and the amended Japanese Act on the Protection of Personal Information. New Zealand and Australia are also working towards similar legislation.

In the United States, California has introduced a Consumer Privacy Act that is softer than the GDPR but may help lead the US in a more privacy-focussed direction. Signs indicate a global movement towards the consent-based regulation of personal data[1]. EU ‘data imperialism’ is a short-hand term for this export of the European approach, imposed on other countries by the economic heft of the world’s largest trading bloc.

Alongside these legislative developments there has been an increase in public awareness of personal data. Data breaches and leaks have become front-page news, most notably in connection with the Cambridge Analytica scandal, in which a Facebook app was used to collect data that was misused to manipulate elections. As a result, consumers are more aware than ever of how their data is used.

Are consumers exercising their rights under the new data protection laws? On the very day the GDPR came into effect, a number of lobbying and pressure groups brought complaints to regulators about the conduct of some of the US tech giants.

One of the notable actors in these cases is NOYB: “None Of Your Business”. NOYB filed complaints on forced consent in Austria, Belgium, France, Germany and Ireland. Further complaints are being prepared in relation to misuse of the legitimate interest exception and ambiguous consent, as might be taken to occur by opt-out or mere use of a service.

The first of these complaints has already had profound consequences. In January 2019 Google was fined €50 million by the French National Data Protection Commission (“CNIL”). The fine followed complaints from NOYB and the French rights group La Quadrature du Net (“LQDN”), supported by more than ten thousand French citizens protesting Google’s practices. CNIL held that Google’s practices in advert personalisation failed to provide users with sufficient information to give informed consent, and furthermore that the consent relied upon by Google was neither unambiguous nor specific.

Many other complaints have been filed across Europe. Annual data from the European Data Protection Board[2] indicate that data protection offices across the 31 EEA states handled a total of 206,326 cases in 2018, of which 94,622 were complaints[3]. CNIL’s Mathias Moulin is reported to have said that the introduction of the GDPR seems set to roughly double the number of data breaches reported[4]. The consequences of this vast number of reports have so far been muted: the total fines handed out to date amount to €56 million, including the Google fine discussed above.

What about direct consumer action? A test case brought in the English High Court, Lloyd v Google[5], shows that there are still obstacles to the direct enforcement of rights. The claim brought by Lloyd related to a consumer’s rights under the UK’s Data Protection Act 1998 (“DPA”). However, the decision is relevant to claims under Article 82 of the GDPR, given the similarities in the language and intent of the two pieces of legislation.

In Lloyd v Google the Judge held that the statutory meaning of ‘damage’ in the DPA did not include the consumer’s loss of the ability to license the use of their data, by analogy with what are commonly termed ‘negotiating damages’ in the UK, i.e. the negotiated amount the company would have to pay the consumer not to exercise their right to object. Furthermore, the Judge considered that a representative class action did not work procedurally, due to differences between the possible claims and defences that might be run. It remains to be seen whether the Judge’s reasoning will be upheld on appeal. If it is, there could be practical consequences for privacy litigation, in the UK at least.

To consider fully the fetters on the harvesting and processing of data, we also need to think beyond purely legal limitations. The way companies use data has reputational consequences. What must be kept in mind is that a requirement for a user’s informed and explicit consent only takes data privacy as far as the user wants to take it.

In the wake of the Christchurch shootings in New Zealand the country’s Privacy Commissioner, John Edwards, described Facebook as “morally bankrupt pathological liars who enable genocide (Myanmar), and facilitate foreign undermining of democratic institutions”. How do people feel about Facebook controlling their data?

Recent studies show that the median individual would need to be paid $48 to give up Facebook for a month[6]. Recent news reports also claim that Facebook paid a group of trial app users $20 per month for unlimited access to activity on their smartphones, by installing an application that would let Facebook monitor all traffic[7], including “private messages in social media apps, chats from in instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information”.

It seems therefore that the average consumer is not strongly motivated by data privacy. This is the case despite consumer mistrust of technology companies, and low levels of understanding of how personal data can be used[8].

When consumers benefit from a product or service that they receive for free, or at a subsidised price, they are happy to exchange their personal data for it. A 2018 report by the DMA[9] (the UK industry body for data and marketing companies) indicates that the proportion of people who saw their personal information as an asset that could be used to get a better deal on services rose from 40% to 56% between 2012 and 2017.

There are therefore clear limits to the fetters that can be placed on the harvesting and processing of data, if those fetters depend on user consent. Even when properly informed, users are willing to consent to intensive harvesting and use of their personal data. Perhaps, however, some consumers are voting with their feet and moving to privacy-conscious alternative software providers?

The privacy-conscious messaging app Telegram reported more than 200 million monthly active users in March 2018, but has not updated its statistics since. Alternative privacy-focussed social networking sites such as Diaspora, Path, App.net, Mastodon and Vero have seen only limited take-up, none attracting more than one million users. In contrast, Facebook has more than 2.3 billion monthly active users, WhatsApp has 1.5 billion, Instagram has 1 billion, and Twitter has 320 million. The privacy-conscious search engine DuckDuckGo has handled a record 38.5 million searches in a day; Google processes some 3.5 billion daily. Privacy does not appear to be a key driver of consumer behaviour.

There are, however, alternative options for those with specific privacy needs. The right to be forgotten derived from Google Spain v AEPD[10] is now codified in Article 17 of the GDPR. The pending case of Google v CNIL[11] will determine whether the scope of the right extends beyond European domains to a search engine’s other domain versions worldwide, expanding European regulatory influence on the internet. Since May 2014 Google alone has received more than 795,000 requests for the delisting of URLs from its search results, with 88.7% of those coming from private individuals rather than corporations.

People are therefore understandably interested in their privacy when they have specific needs, but perhaps not before. The vast majority of users are happy to continue providing their data to the tech incumbents, despite reputational issues. The dominance of Google, Facebook et al. seems set to continue, along with the harvesting and processing of personal data, at least to the extent possible with consumer consent. But do the scale advantages held by the big technology companies create their own problems?

In February 2019 the German competition authority (the Bundeskartellamt) released a decision[12] prohibiting Facebook from combining data from different sources. The Bundeskartellamt took issue with Facebook combining data gathered via cookies on third-party web pages, and via its WhatsApp messaging service and Instagram network, with users’ Facebook accounts. Facebook’s terms were held to be an abuse of its dominant position. The Bundeskartellamt considered itself competent to take account of data protection considerations when assessing the possible abuse, stating that Facebook’s “terms and conditions are neither justified under data protection principles nor are they appropriate under competition law standards”.

The ruling required a substantial restriction of Facebook’s data harvesting and processing; what Facebook remained able to do would require significantly more explicit consent from its users. The competition implications of the harvesting and processing of personal data will only increase in importance. As the Bundeskartellamt says: “Social networks are data-driven products. Where access to the personal data of users is essential for the market position of a company, the question of how that company handles the personal data of its users is not only relevant for data protection authorities, but also for competition authorities”.

This should be of serious concern to the major technology companies. In Europe, competition law has a highly proactive regulator with a penchant for levying hefty fines: Google alone has been fined more than €8 billion by the European Commission for anticompetitive practices. If the Commission follows the lead of the Bundeskartellamt and recognises the importance of personal data when considering abusive behaviour under Article 102 of the Treaty on the Functioning of the EU, it may force technology companies to silo data and restrict their processing of personal data. This may be the case even where customers might otherwise be willing to consent to the activity.

The increasing relevance of consumer data to competition has also been recognised in the UK, in a government report titled “Unlocking Digital Competition: Report of the Digital Competition Expert Panel”. In their report, the panel provide strategic recommendations for managing dominant companies and protecting consumer welfare in the provision of digital services, and recognise that “Economies of scale and scope appear to be particularly strong in relation to the accumulation and use of data relating to consumer behaviour.”

The current data ecosystem, revolving around the major US technology players, therefore looks likely to be constrained by the tandem action of data protection and competition law principles, even where consumers remain ambivalent.

The EU’s introduction of the GDPR must also be seen as an important international political statement of the EU’s stance on foreign technology companies. The EU has sought to benefit its citizens, but also EU-based enterprises, which have never been leaders in the technology space. The introduction of the GDPR hugely reduced the value of consent-less personal datasets, i.e. those previously collected without adequate information being given to the data subjects. This will to some extent have eroded the first-mover advantage held by overseas technology companies. A further step in this process may be the introduction of open-standard data portability requirements.

The UK’s Unlocking Digital Competition report recommends the introduction of open standards or formats for data use and portability that would allow consumers to move their data freely between competing systems. Current data portability requirements merely ensure that data can be downloaded, putting the onus on the consumer to find a way to transfer their data to another provider in a compatible format. Open standards may be the key to allowing users to migrate from incumbent providers to more privacy-focussed alternatives as easily as moving money between banks. Open standards would also help break down the network effects which currently benefit incumbent data controllers.

Data portability may also open up new business opportunities in the harvesting and processing of data. The UK report uses the example of Transport for London, the government entity responsible for London public transport, providing open access to anonymised data from its users. This is stated to have created up to £130 million a year in follow-on benefits, for example from third-party applications such as the Citymapper journey planner.

Looking again at the title of this essay, the use of the word ‘fetter’ implies an unwarranted negativity; it implies that the harvesting and processing of personal data is to be restrained needlessly. There is, however, nothing inherently wrong with the harvesting and use of personal data. In medicine, for example, personal data has the potential to fast-track the introduction of new medicines and to underpin effective new diagnostic techniques based on patient data. This is not something that will be fettered, simply controlled.

In many cases the exploitation of personal data has the potential to immeasurably improve consumer welfare. As a simple example, think of clothes fitting. Online clothes sales can be streamlined by combining a consumer’s data on the fit of past items with comparators from other consumers who have worn the same items, in order to predict which items from a new brand will fit that consumer well. This reduces returns and therefore also the environmental impact of online shopping, and it is not possible without the use of personal data. Under the new legal regime, there is no reason such activities need to be fettered.
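To make that concrete, here is a minimal, hypothetical sketch in Python of the kind of pooled fit-prediction described above. The shopper names, the data and the similarity-weighted voting scheme are illustrative assumptions only, not any retailer’s actual system.

```python
# Hypothetical sketch: predicting clothing fit from pooled personal data.
# Fit feedback is encoded as -1 = too small, 0 = good fit, +1 = too large.
from collections import defaultdict

# Each shopper's past fit feedback, keyed by (item, size).
history = {
    "alice": {("brand_a_jeans", "M"): 0, ("brand_b_tee", "M"): 1},
    "bob":   {("brand_a_jeans", "M"): 0, ("brand_b_tee", "M"): 1,
              ("brand_c_coat", "M"): 0},
    "cara":  {("brand_b_tee", "M"): 1, ("brand_c_coat", "M"): -1},
}

def similarity(a, b):
    """Count the items on which two shoppers reported the same fit."""
    shared = set(history[a]) & set(history[b])
    return sum(history[a][k] == history[b][k] for k in shared)

def predict_fit(shopper, item, size):
    """Weight other shoppers' feedback on (item, size) by their similarity."""
    votes = defaultdict(float)
    for other in history:
        if other == shopper or (item, size) not in history[other]:
            continue
        weight = similarity(shopper, other)
        if weight:
            votes[history[other][(item, size)]] += weight
    return max(votes, key=votes.get) if votes else None

# Will a size-M coat from brand_c fit Alice? Bob, the shopper most similar
# to Alice, found it a good fit, so the prediction is 0 (good fit).
print(predict_fit("alice", "brand_c_coat", "M"))
```

Even this toy prediction only works because shoppers’ personal fit histories are pooled: precisely the kind of consensual, controlled processing that this essay argues need not be fettered.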

To conclude, it is a matter of fact that the collection and use of personal data by companies in the technology sector has fallen short of desirable standards. There is no reason this should continue. With proper regulation by data protection and competition law, the harvesting and processing of data can continue to grow exponentially, and the potential benefit is immense. All that is required is adherence to standards. The GDPR may prove to have been one of the vital early steps in the creation of a controlled and beneficial data economy in Europe, one that may yet be exported around the globe.

The days of unfettered harvesting and processing of personal data may be over, but the data age is only just beginning.


[1] There are notable outliers. The situation in China of course differs greatly from that in western democracies. From a privacy perspective the social ranking algorithms and state harvesting and processing of personal data in China are deeply concerning.

[2] European Data Protection Board Annual Report 2018 (link)

[3] Some of these will however relate to failure to adequately comply with subject access requests.

[4] Theregister.co.uk, “Year 1 of GDPR: Over 200,000 cases reported, firms fined €56 meeelli… Oh, that’s mostly Google”, accessed 14 April 2019

[5] [2018] EWHC 2599 (QB) (bailii)

[6] Using massive online choice experiments to measure changes in well-being, Erik Brynjolfsson et al. PNAS April 9, 2019 116 (15) 7250-7255; first published March 26, 2019 (link)

[7] See the report by TechCrunch titled “Facebook pays teens to install VPN that spies on them”

[8] Marketing Week reports that 73% of respondents to a 2018 survey by OnePoll for the Chartered Institute of Marketing (CIM) distrust social media sites, and only 6% “mostly” understand how their data is used.

[9] “Data privacy: What the consumer really thinks”, DMA, Feb 2018 (link)

[10] C-131/12

[11] C-507/17; Note that Advocate General Szpunar has advised the court to find against global effect.

[12] Under section 19(1) of the German Competition Act
