Research Matters

The UK’s exit from the EU – Data protection, adequacy and divergence


This article examines the UK’s exit from the EU in the context of data protection, highlighting the considerations that will apply as the UK weighs the costs, benefits and complications of divergence from EU legislation, and how these might affect Northern Ireland.

In exiting the EU, the UK gave itself the ability to depart from a vast array of legislative and policy structures which underpin much of our daily lives. In doing so, it raised the question of how such divergence would be managed, and what the implications would be for trade, devolution, and a host of other areas which have yet to be fully explored. These structures range from the essential to the esoteric and, when it comes to data protection, it manages to be both.

What is data protection?

Data protection is a relatively new legal and regulatory concept, largely because it is only in recent decades that there has been enough data to pose any kind of legal or regulatory issue. Often misunderstood as dealing solely with privacy, data protection is about the management of personal data. Personal data is anything that, taken alone or combined with other information, can identify a living individual. A fingerprint can be personal data, as can an IP address or an account number. Encrypted data, aggregated data, even ‘anonymised’ data can be personal data, so long as the person or organisation holding the data can use it to identify a person. When we talk about data protection, we are talking about the rules that apply to personal data throughout its lifecycle: creation (or collection), use, storage, archiving and, finally, deletion. Some of these rules are to do with privacy, but really they deal with the use and abuse of data by organisations. However, privacy is a useful shorthand and, especially in the US, has become a catch-all term for data protection issues.

How do the UK and the EU differ on data protection?

As things stand at the moment, they don’t. The UK implemented the Data Protection Directive of 1995 as the Data Protection Act 1998. This framework was updated by the General Data Protection Regulation of 2016, supplemented in the UK by the Data Protection Act 2018. That Act also incorporated into UK law the Law Enforcement Directive (LED), which governs the use of data by the police and other enforcement bodies. Previously, the UK had adopted the Privacy and Electronic Communications Regulations (PECR), popularly known as the “cookie law”. As can be imagined, the General Data Protection Regulation (or GDPR) deals with how personal data is managed generally, whereas legislation like PECR or the LED deals with how personal data is managed in specific situations.

How has the UK’s exit from the EU affected data protection?

So far, things have continued as they were. In February 2021, the European Commission published a draft adequacy decision in respect of the UK, beginning the formal approval process. Once approved, this will mean that the UK’s data protection regime is considered ‘adequate’ by EU standards, in that it affords an equivalent level of protection for people’s data protection rights. This is important because the GDPR has what’s known as extra-territorial effect, which means (broadly) that a business in a third country processing the data of EU data subjects (living people in the EU, not just EU citizens) must comply with EU standards. Once the UK left the EU, it became a third country, which makes it more difficult to process the data of EU data subjects unless an adequacy decision is granted. Adequacy decisions mean “data can flow from the EU to that third country without any further safeguard being necessary”.

It is worth noting that the process of granting adequacy to the UK was undertaken quite quickly. With Japan, it took over two years and 300 hours of negotiation; other than the UK’s, the fastest adequacy decision was Argentina’s, which took 18 months. In the US, adequacy has had a more complex journey, which is discussed below.

Businesses in the UK were concerned that, if adequacy took a long time, they might struggle to process the data of EU customers. This is obviously more of a concern for the services and e-commerce industries than it is for traditional trade in goods, but all businesses process personal data. In Northern Ireland a particularly complex situation could have arisen because of the number of North/South bodies which exchange data, as well as the number of citizens who live and work on either side of the border.

Adequacy and the future

For the moment, the UK and the EU are aligned on data protection standards, which simplifies the exchange of data. Ultimately, the aim of the Brexit project was to allow the UK to set its own rules and, in an emerging market like data, the standard of regulation will be subject to a number of pressures. Data is big business, and regulation of that business costs money. Rules on automated decision-making or the right to be forgotten are likely to become more important for individuals as time goes on, but incursions by big data into fields such as health, politics and education have wider social implications. Notably, the UK’s adequacy decision has to be renewed every four years.

On the other hand, the UK and the EU have agreed to “work together to promote high international standards” and the Political Declaration which accompanied the Withdrawal Agreement mentions both parties’ commitment to ensuring a high level of personal data protection.

Does personal data matter to people?

A 2019 poll by Pew Research showed that Americans feel their data is less secure than in the past, that they are subject to surveillance by organisations, and that it is impossible to go about daily life without data being collected. The vast majority (78%) of those polled did not understand how, or why, this was happening. Unsurprisingly, few read privacy policies as a matter of course, yet 81% of those polled feared more harm than good would come from their data being collected. Similar trends were found in a 2017 poll conducted on behalf of the UK regulator – the Information Commissioner’s Office. This could be summarised as follows: the public know they’re being watched, they don’t know how, they don’t really understand why, and the voluminous privacy notices and cookie walls with which they are presented are either confusing or ignored entirely. Some think this is a deliberate policy, designed to overwhelm individuals so they simply go along with the collection and use of their information, despite regulation designed to give people more control over their personal data.

This raises an important point, namely that the uses to which data can be put, and the ways in which it is collected, develop at a speed which far outstrips that at which laws can be passed. The Data Protection Directive came into force in 1995, in a world of dial-up internet access that hadn’t yet heard of YouTube, Facebook or Google. The GDPR was first proposed in 2012 (a year before Twitter’s IPO) and agreed in 2014. When it came into force in 2016 (the year TikTok was founded), it was supposed to be accompanied by tougher rules around cookies and internet tracking, the ePrivacy Regulation, but this has yet to be agreed. The ePrivacy Regulation has been subject to intensive lobbying by tech firms, but changes to the law have also been prompted by the need to deal with serious crime.

Although the UK was involved in the negotiations around the ePrivacy Regulation, it has not yet come into force and is therefore not part of the acquis of EU law which still forms part of UK law. In the interim, the UK has taken steps to deal with online harms but has not, as yet, confirmed whether it will implement the ePrivacy Regulation when it is eventually passed.

Future-proofing regulation

The GDPR was deliberately written without reference to specific technologies in order to ‘future-proof‘ its provisions: the 1995 Directive dated swiftly, so its successor was drafted in technologically neutral terms to support a principles-based approach. Put simply, legislating for data at this point in time is analogous to legislating for the use and production of oil at the start of the industrial revolution. We don’t yet fully understand its value, and there hasn’t been enough time to really understand its harms, but it is clear that it is very, very important to industry. There will be incentives to liberalise data legislation, especially for commercial use.

Yet data, much more so than oil, is a global resource, and organisations want clarity and consistency. In the US, moves towards a federal privacy law have been prompted, at least in part, by the fact that a number of states have developed their own. The most influential of these was developed in California, and it was broadly modelled on the GDPR.

So there will be two aspects to pressure from industry – the first being the pressure to deregulate, which can be seen in the negotiations leading up to the ePrivacy Regulation’s most recent draft. The second is the pressure to harmonise regulations, smoothing out differences between countries to encourage the free flow of data.

Another pressure will come from data subjects themselves. Here, California provides an instructive example: the pressure to create the law came from individual citizens, increasingly conscious of the ways in which Silicon Valley uses their data. Elsewhere, the advent of the GDPR and the information campaigns which accompanied it have seen a sustained increase in data protection complaints to the ICO.

First a Harbour, then a Shield

Max Schrems is a prominent figure in the disparate community of privacy, information governance and free speech activists. In Schrems I he challenged Facebook’s transfer of data from Ireland to the USA in light of the PRISM mass surveillance programme. The case was heard by the CJEU, which invalidated the Safe Harbour mechanism, an agreement between the EU and the US which allowed data to be transferred from the former to the latter.

As a result of the ruling against the Safe Harbour mechanism, the EU and US developed the Privacy Shield mechanism, which Schrems challenged in Schrems II. This mechanism was also invalidated, and the Irish regulator subsequently moved to require Facebook to stop transferring data from Ireland to the US. The CJEU’s decision was similarly based on concerns about mass surveillance practices by the US intelligence authorities. Some commentators have already drawn parallels between the mass surveillance of PRISM and the ‘bulk powers’ provisions of the Investigatory Powers Act in the UK, although such commentary largely preceded the adequacy decision. It is worth noting that adequacy decisions had been issued in respect of both Safe Harbour and Privacy Shield.

How does all this affect Northern Ireland?

The Protocol deals with physical trade – customs, excise and regulation. It does not deal with data and, because data currently flows freely, it doesn’t have to. Northern Ireland exchanges data with the EU on the same legal basis as the rest of the UK, the adequacy decision having been taken on the basis of the Data Protection Act 2018 and the fact that data protection matters are not devolved. There is some flexibility: although data protection is not devolved to the Northern Ireland Assembly, it is a reserved (rather than excepted) matter, meaning that the Assembly could legislate in this area, subject to appropriate consents.

It is beyond the scope of this article to consider the complexities that might arise for cross-border data processing if the UK decides to depart from EU standards, or if the adequacy arrangements are successfully challenged in court. Nonetheless, given that NI sales to the EU were valued at approximately £6.7 billion in 2018, and services account for approximately a quarter of external trade, there are economic arguments in favour of continued alignment.

Personal data facilitates the work of cross-border bodies, but the concerns apply in both the private and public sectors. The Centre for Cross Border Studies has estimated that 23,000-30,000 people are cross-border workers; if the UK and EU diverged on data protection, this could create difficulties around simple matters like payroll. Similarly, under the GDPR, if a website targets EU data subjects then it must comply with EU standards, meaning that (by way of example) the NI Tourist Board website would likely have to comply with the GDPR, whatever legislation might apply in the UK. The same would be true of any organisation which targeted users in the EU, such as e-commerce or social media sites.

Conclusion

There were always likely to be both costs and benefits to the UK’s exit from the EU, and that can clearly be seen in the context of data protection. For example, a more liberalised approach to health data might attract AI innovation, entice investment from tech giants like Google and help with conditions such as kidney disease, but it might also attract unscrupulous behaviour and risk damaging patient confidence in the medical profession. Similarly, the security services might be more efficient if they have access to a broader range of data, but this might fall foul of people’s expectations of privacy. It is ultimately for the government to find a legislative balance, taking into account its compatibility with the balances struck by other governments elsewhere.

Thus, data protection neatly illustrates one of the known unknowns of the UK’s exit from the EU. In navigating its legislative freedom, the UK will remain subject to many of the pressures it previously experienced from within the EU. How it responds to those pressures, and how it prioritises them, will shape that journey, as well as whatever destination it may have in mind.

 

