GDPR was not the last hurdle. The enormous effort undertaken by organizations three years ago to comply with the framework was part of a never-ending race. GCs and DPOs find themselves in the midst of a battle between the ever-changing world of technology and data collection, and the ever-changing world of data regulation and public opinion.
It’s a delicate balance between fostering a progressive organization that takes advantage of new technological advances and remaining attentive to privacy and customer expectations.
Finding this balance is increasingly important as the world emerges from the pandemic into a virtual future. Meanwhile, the UK’s exit from the European Union has created uncertainty over its continued alignment with the GDPR and its data relationship with Brussels.
This roundtable, moderated by Gareth Oldale, TLT partner and head of data, privacy and cybersecurity, aimed to explain how GCs and DPOs can best help their businesses navigate this complex terrain. Oldale was joined by leading in-house lawyers and data managers from organizations across the UK.
Brexit and international data transfers: clarity at last?
The General Data Protection Regulation (GDPR) was published by the EU in April 2016, two months before the Brexit vote. In the years that followed, the UK’s relationship with the Regulation, which also governs the transfer of data outside the EU, was the subject of much guesswork and change.
The UK was finally granted EU adequacy status in June this year. Adequacy status allows the free flow of data to and from the EU for an external territory that is deemed to provide adequate protection of individuals’ data rights and freedoms.
Oldale said, “The adequacy agreement fills a huge gap in people’s data transfer patterns.”
When the announcement was made, the EU justice commissioner added: “The Commission will closely monitor developments in the UK system going forward, and we have built safeguards into our decisions to allow for this and for intervention if needed.”
As such, the job of the GC or DPO is never really done. The UK’s adequacy status is under continuous review and comes up for renewal every four years, with revocation a real possibility. Revocation would mean UK businesses losing frictionless access to EU data: they would have to rely on the European Commission’s standard contractual clauses (SCCs) or other appropriate safeguards for data transfers from the EU to the UK, posing a logistical nightmare for UK data teams.
An immediate challenge for participants is adapting to the new SCC documents from both the European Commission and the UK. The European Commission released its new version of the documents in early June, while the UK Information Commissioner’s Office (ICO) has yet to release its own.
“The key question is: do you repaper your EU contracts with the new SCCs now, or do you wait for the ICO to release its version and then do it all in one fell swoop?” Oldale said.
Roundtable delegates generally agreed that it is best to sit tight, identify the contracts that need revamping, and wait for the UK SCCs to be released.
Two potential hills to climb in this process were then discussed. The first concerned a clause in the new SCCs which stipulates that data controllers must undertake a data transfer impact assessment before transferring data to a third country, in order to determine whether there is essential equivalence with the GDPR. “This is likely to be difficult in many cases, because in the 25 years since the EU was tasked with determining whether third countries offer equivalent protection (through the adequacy process), only 14 countries have been found adequate, and now that burden is in some ways on you,” Oldale said.
The focus then shifted to using SCCs when using third-party providers, such as cloud service providers.
The new SCCs say that you don’t need to use them if the foreign provider you’re contracting with is already directly subject to the GDPR. Oldale said: “My experience is that clients are really not comfortable with this, and it is unnerving to have nothing in the contract, so you might feel the need to use SCCs anyway.”
AI and regulation
GDPR has been a major milestone in data regulation, but the regulatory landscape continues to develop as new technologies emerge.
A major focus for the UK ICO going forward is technology bias. During last year’s exam-grading fiasco, the UK government sought to determine exam results through an algorithm that factored in a school’s address, postcode and ranking alongside pure academic performance. There was a backlash, as top-performing students in low-income areas sometimes saw their grades downgraded.
One technology at the forefront of this debate is live facial recognition (LFR) software. LFR software reads the dimensions and characteristics of faces and compares them against a watchlist of persons of interest. In a law enforcement context, police can use it to search for suspects in a particular area. In the private sector, retailers could use LFR to monitor for shoplifters.
Bias is a major concern here, as LFR technology has been shown to be less accurate at reading the faces of women and black people. A black woman is far more likely to receive a false positive match than a white man.
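At its core, the matching step works by comparing a numerical “embedding” of a captured face against embeddings of the people on a watchlist, flagging anyone above a similarity threshold. The sketch below is purely illustrative (the function name, threshold and vectors are hypothetical, not drawn from any real LFR product); real systems derive embeddings from a trained face-recognition model.

```python
import numpy as np

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the indices of watchlist entries whose face embeddings
    have at least `threshold` cosine similarity to the probe face.

    Illustrative sketch only: real LFR systems use learned embeddings,
    calibrated thresholds and human review of matches.
    """
    flagged = []
    for i, entry in enumerate(watchlist):
        # Cosine similarity between the probe and a watchlist embedding.
        sim = float(np.dot(probe, entry) /
                    (np.linalg.norm(probe) * np.linalg.norm(entry)))
        if sim >= threshold:
            flagged.append(i)
    return flagged
```

The bias problem lives in this comparison: if the underlying model represents faces from under-represented groups less distinctly, more innocent people in those groups clear the threshold, producing the disparity in false positive rates.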
In recent weeks, the Information Commissioner has issued a statutory opinion on the matter. She called for a statutory code of practice to be implemented by the government, outlining where and when LFR technology can be used.
“The discussion around LFR is relevant to the future of any AI technology that impacts the individual. The key themes are loss of freedom and financial harm resulting from using an algorithm or other AI technology to make decisions about people,” said Oldale.
Regarding algorithms, one delegate explained: “Most of the advice we have received [when learning how to use algorithms compliantly] is to explain and be transparent. By being open with our customers, we can counter bias.”
Roundtable delegates largely agreed that transparency is essential when implementing any machine learning, algorithm, or AI technology.
“In the absence of prescriptive regulation, doing what you can to apply the fundamentals of GDPR to everything you do will serve you well,” said Oldale. “It varies depending on the audience. You have to take into account the language and wording you use.”
Transparency means being easily understood. When GDPR was first introduced, organizations often complied by posting complex, largely inaccessible resources that most members of the public had neither the time nor the willingness to read.
The accessibility of privacy notices, for example, is key to gaining a customer’s trust in an organization’s use of data.
Just because you can, doesn’t mean you should
The ICO has emphasized the importance of implementing technology with a purpose, not deploying technology for technology’s sake.
“Employees and the general public have sympathy for organizations that deployed technology in haste due to the pandemic,” Oldale explained. “As we begin to break out of lockdown, that patience for intrusive technology will run out.”
Oldale was directly referring to organizations that have implemented technology to monitor their employees at work.
For example, Amazon has developed a product that can determine whether staff are observing social distancing. Once the pandemic finally subsides, privacy activists could be quick to liken this type of technology to Orwell’s 1984.
“It all comes down to putting in place strong data protection impact assessments and making sure they’re being followed,” Oldale said.
The discussion highlighted the human side of using data. The ability to implement the technology may exist, and you may be well within your legal rights to do so. But is it the right thing to do? Is this the image of your organization that you want to present?
Just because you can, doesn’t necessarily mean you should.