The British Government has just announced (Monday 7 August 2017) that it will incorporate Regulation (EU) 2016/679 (the General Data Protection Regulation), along with specific derogations permitted under the GDPR, as well as the Data Protection Law Enforcement Directive (DPLED), into UK law.
The move will effectively repeal the current Data Protection Act 1998.
This follows a short call for views (12 April – 10 May 2017) that received 170 submissions from a wide range of professional bodies, legal and consumer groups, local government, technology companies, global organisations and academic institutions (7.1% of all respondents), including Henley Business School.
“Bringing EU law into our domestic law will ensure that we help to prepare the UK for the future after we have left the EU…The Data Protection Bill will allow the UK to continue to set the gold standard on data protection. We already have the largest internet economy in the G20. This Bill will help maintain that position by giving consumers confidence that Britain’s data rules are fit for the digital age in which we live,” wrote Matt Hancock MP, Minister of State for Digital in the foreword to ‘A New Data Protection Bill: Our Planned Reforms’ published by the Department for Digital, Culture, Media & Sport.
Perhaps the most eye-catching reforms proposed by the British Government under the Data Protection Bill are the criminalisation of intentionally or recklessly re-identifying individuals from pseudonymised or anonymised data, and of altering records in the wake of a Subject Access Request (SAR) made by the Data Subject.
Pseudonymised data
Pseudonymisation is a concept the GDPR has embraced to secure personal data that carries a ‘high risk’. There are specific uses of pseudonymised data that allow high-risk personal data to be processed without revealing the identity of the Data Subject.
Under Art.4, GDPR, pseudonymisation:
- is a technique for processing personal data in such a way that it can’t be linked to a Data Subject without using the pseudonymisation key
- creates three distinct datasets: (1) a dataset from which the personal identification data has been removed, (2) a dataset consisting of the pseudonymised linking key that appears in the other two datasets, and (3) a dataset consisting of the personal identity data
- requires the process used to create the pseudonymisation key to be subject to additional technical and organisational measures that keep it secret
- requires dataset (1) and dataset (3) to be kept separate, each with distinct technical and organisational measures to secure them
- requires the pseudonymised linking key to contain no personal data that links it directly to a Data Subject (for example, it shouldn’t be an encrypted National Insurance Number – a mistake often made by Data Controllers). A minimal sketch of this separation follows below.
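To make the three-dataset separation more concrete, here is a minimal, hypothetical Python sketch of how a Data Controller might pseudonymise a small set of records. The record fields, function name and choice of random surrogate keys are illustrative assumptions rather than anything prescribed by the GDPR; the point is that the linking key is generated randomly (never derived from personal data such as a National Insurance Number) and that the identity data, the linking keys and the pseudonymised records end up as separate datasets, each to be secured under its own technical and organisational measures.

```python
import secrets


def pseudonymise(records):
    """Split raw records into the three datasets described above.

    Assumes each record is a dict with 'name' and 'ni_number' (direct
    identifiers) plus 'salary' and 'department' (the data to be processed);
    all field names are illustrative.
    """
    identity_data = {}   # dataset (3): personal identity data, keyed by link key
    pseudonymised = []   # dataset (1): records with identifiers removed

    for record in records:
        # Random surrogate key – never derived from personal data
        # (e.g. never an encrypted or hashed National Insurance Number).
        link_key = secrets.token_hex(16)

        identity_data[link_key] = {
            "name": record["name"],
            "ni_number": record["ni_number"],
        }
        pseudonymised.append({
            "link_key": link_key,
            "salary": record["salary"],
            "department": record["department"],
        })

    # dataset (2): the linking keys shared by the other two datasets.
    linking_keys = list(identity_data.keys())
    return pseudonymised, linking_keys, identity_data
```

Stored and administered separately, dataset (1) can then be processed without revealing identities; re-linking requires access to dataset (3) via the linking keys, which is precisely the access the additional technical and organisational measures are intended to restrict.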
Under the GDPR, anonymised data isn’t defined as personal data and is therefore not subject to the GDPR.
Anonymised data
Under the GDPR, the definition of anonymised data is:
- a dataset where all the personal identification data has been erased and there is no means of identifying the Data Subject using any other data.
If the Data Controller wants to use ‘Big Data’ to drive research and development for new products and services, it will need to ensure it only processes anonymised data; otherwise it will need separate consent from the Data Subject for this new purpose.
In accordance with guidance issued in July last year by the Art.29 Data Protection Working Party, Data Controllers and Data Processors should be aware that if anonymised data can be traced back to a Data Subject using data held by the Data Controller, it isn’t deemed to be anonymised.
There are techniques for reversing pseudonymised and anonymised data using personal data in the public domain – for example, where location data and personal interests alone are enough to identify the Data Subject. This is therefore a risk the Data Controller must manage whenever it uses these techniques.
To understand the risk of a reversal, the Data Controller should consider:
(1) the outputs of the pseudonymisation and anonymisation process
(2) data in the public domain
(3) data that’s easily available (i.e. Big Data) and could enable reversal to take place (a simple linkage check is sketched below).
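As a rough illustration of how such a reversal risk might be assessed, the hypothetical Python sketch below counts how many records in a released ‘anonymised’ dataset share each combination of quasi-identifiers (here, location and interest). Any combination held by only one or two records is a candidate for linkage against data in the public domain. The field names and the threshold k are illustrative assumptions in the spirit of a simplified k-anonymity check, not requirements of the GDPR or the Bill.

```python
from collections import Counter


def reversal_risk(released_records, quasi_identifiers=("location", "interest"), k=3):
    """Return quasi-identifier combinations shared by fewer than k records.

    Rare combinations are the ones most easily matched against public-domain
    data to re-identify a Data Subject; field names and k are illustrative.
    """
    counts = Counter(
        tuple(record[field] for field in quasi_identifiers)
        for record in released_records
    )
    return {combo: count for combo, count in counts.items() if count < k}


# Illustrative release: the unique Henley-on-Thames/falconry combination is
# the easiest to match against public-domain data, so it carries the most risk.
released = [
    {"location": "Reading", "interest": "rowing", "spend": 120},
    {"location": "Reading", "interest": "rowing", "spend": 80},
    {"location": "Henley-on-Thames", "interest": "falconry", "spend": 300},
]
print(reversal_risk(released))
# {('Reading', 'rowing'): 2, ('Henley-on-Thames', 'falconry'): 1}
```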
If reversal of pseudonymised or anonymised data takes place and the personal data then enters the public domain, this becomes a personal data breach and the Data Controller and/or Data Processor will be subject to investigation by the Supervisory Authority.
New criminal offence under UK law
Under the proposed Data Protection Bill, there will be a new criminal offence of intentionally or recklessly re-identifying individuals from pseudonymised or anonymised data. It will be the job of the accountable individual at Board level to ensure that this doesn’t happen, and those within the organisation who knowingly handle or process such data will also be guilty of a criminal offence. The maximum penalty would be an unlimited fine.
In non-criminal cases, it’s highly likely the Information Commissioner’s Office (ICO) will deem the technical and organisational measures not to have been appropriate. If the Data Controller fails to halt the pseudonymisation or anonymisation process immediately, it’s almost certain that the ICO will order a temporary or permanent ban on this personal data processing until the failure is remedied, in addition to a fine of up to £17m (€20m) or 4% of global turnover, whichever is greater.
There may be additional pseudonymised or anonymised data in the public domain, sourced from the Data Controller, that can be reversed. This would increase the extent of the initial personal data breach and, unless the Data Controller takes immediate action, it will turn into a doomsday scenario with significant consequences.
Altering records in the wake of a Subject Access Request (SAR)
The British Government also intends to create a new criminal offence of altering records with intent to prevent disclosure following a Subject Access Request (SAR) exercised by an individual under Art.15, GDPR.
This new offence would use Section 77 of the Freedom of Information Act 2000 as a template and would apply to Data Controllers and Data Processors in the public and private sectors. The maximum penalty would be an unlimited fine in England and Wales, or a Level 5 fine in Scotland and Northern Ireland.
“Protecting data is a global concern and the UK is at the forefront of innovation in this area,” concludes the British Government’s Statement of Intent.
For information about the GDPR Transition Programme at Henley Business School, click here.