Data Protection Nuggets Part 2
Copyright DPN
The information provided and opinions expressed in our content represent the views of the Data Protection Network and our contributors. They do not constitute legal advice.
We often talk about the risks of holding onto personal data for too long; the need to make sure data is destroyed when it’s no longer required, and how the impact of a data breach could be far worse if it involves personal records which shouldn’t have been kept. But now we have a case where it’s the destruction of records which caused a data breach.
The Scottish charity Birthlink has been fined £18,000 by the ICO for destroying approximately 4,800 records, some of which were irreplaceable photographs and letters.
The findings make for sobering reading. A catalogue of errors: lack of accountability, lack of policies and procedures, no appropriate data protection training, and a failure to report a data breach for more than two years.
Birthlink has maintained the Adoption Contact Register for Scotland since 1984. This is a service for adopted people or their relatives, and for birth parents or their relatives. It enables people to register their details with the hope of being ‘linked’ and potentially reunited.
Where a link is made, records are classified as “Linked Records”, and the personal data contained within such records can include sensitive documents such as:
■ Original birth certificates
■ Adoption Contact Register application form
■ Correspondence between Birthlink and service users
■ Other information relevant to the adoption
■ Irreplaceable items (e.g. handwritten letters from birth parents and birth families, photographs and other sensitive personal information)
These are physical documents relating to adopted people’s individual circumstances, which the charity held in filing cabinets.
In January 2021 Birthlink was running out of space in the filing cabinets where the Linked Records were stored, so it assessed whether they could be destroyed. After a board meeting it was agreed there were no barriers to the destruction of the records, that retention periods should apply, and that only replaceable records should be destroyed.
However, it’s evident from the enforcement notice this was very badly managed. Due to poor records management, bags of paperwork were destroyed without a full understanding of what the documents contained. To make matters worse, despite concerns being raised at the time about shredding people’s photographs and letters, the destruction continued.
More than two years later and following an inspection by the Care Inspectorate, the Board became aware irreplaceable items had in fact been destroyed. It was only then the data breach was reported to the ICO.
And the woeful tale continues. Poor record keeping means not only will the extent of what was destroyed never be fully known, but Birthlink has also been left unable to identify the people affected by the breach.
Routinely in an article like this I’d write a bit about the key findings, but in this case I think they speak for themselves. You’ll not be surprised to learn Birthlink says there was limited knowledge of their data protection obligations at the time this breach took place.
Sally Anne Poole, Head of Investigations at the ICO, said:
“It is inconceivable to think, due to the very nature of its work, that Birthlink had such a poor understanding of both its data protection responsibilities and records management process. We do however welcome the improvements the charity has subsequently put in place, not least by appointing a data protection officer to monitor compliance and raise awareness of data protection throughout the organisation.
“Whilst we acknowledge the important work charities do, they are not above the law and by issuing and publicising this proportionate fine we aim to promote compliance, remind all organisations of the requirement to take data protection seriously and ultimately deter them from making similar mistakes.”
It’s too easy to see the mistakes here, and easy to pour scorn on Birthlink. However, all organisations will recognise taking a robust approach to data retention can be challenging to deliver in practice.
Many organisations face a careful balance between destroying personal data they have no justification for holding on to, and making sure they continue to retain records they still need to keep. Robust records management procedures, secure storage and archiving, clear data retention periods, and clear authorisation when the time comes for destruction are crucial – especially when handling sensitive information.
Sometimes a specific law tells us how long certain records should be kept, or personal data needs to be retained to meet contractual obligations. Often we need to consider people’s reasonable expectations – would they expect us to be still holding on to their personal details or not?
In the case of Birthlink, the answer was almost undoubtedly yes: people would have expected irreplaceable records to be retained, or perhaps returned to them, rather than destroyed.
I can’t stress enough that tackling data retention effectively needs shared ownership – clear accountability, with assigned roles and responsibilities across the organisation. Good data governance is the key.
If this has given you an unwelcome nudge to revisit your approach to retention, see our 3 Steps to decide your data retention periods and our detailed Data Retention Guide.
When will provisions under the Data Use and Access Act 2025 (DUAA) take effect, and when can we anticipate guidance being published by the Information Commissioner’s Office?
The DUAA received Royal Assent on 19th June 2025. While limited provisions came into effect immediately, the majority will be phased in over the coming months, up to June 2026, with some requiring secondary legislation to be passed.
To be crystal clear, the DUAA does not replace UK GDPR, the Data Protection Act 2018 or the Privacy and Electronic Communications Regulations (PECR). The Act brings in amendments to these core pieces of legislation, much in the same way PECR was amended in 2009 with the so-called ‘cookie law’.
With immediate effect: One provision which has come in with immediate effect is clarification that when responding to Data Subject Access Requests (the right of access) organisations only need to undertake a “reasonable and proportionate search”. This inserts a new Article 15(1A) into UK GDPR, and gives a statutory footing to existing case law and guidance from the ICO.
From 20th August 2025 the following amendments will come into force:
■ Information Commissioner can serve notices by email
This amends the Data Protection Act 2018 with a new section 141A permitting notices to be served by email. You may want to double-check the email address the ICO has on file for your organisation on the register of fee payers, make sure it is regularly monitored, and decide who or which team a notice should be immediately forwarded to.
■ Information Notices and ICO power to ask for documentation
This grants the ICO the power to require organisations to provide documents as well as information when responding to an Information Notice.
Other measures commencing on 20th August include requirements for the Government to prepare a progress update and report on copyright and AI.
From September/October: Commencement is expected of measures on digital verification services.
Around December: Commencement of main changes to data protection legislation.
At present we don’t have precise dates for when specific provisions such as the soft opt-in for charities, changes to the cookie rules and recognised legitimate interests will commence, but we’ll update this article as and when we hear more. For a top-level summary of the Act see DUAA 2025: 15 key changes ahead.
The ICO has published a timeline of when we can expect updated or new guidance covering the changes the DUAA ushers in.
Summer 2025
■ Data Subject Access Requests – update to detailed Right of Access guidance
■ Substantial public interests conditions – a new interactive tool
■ Cookies & similar technologies (Part 1) – update to ‘cookie guidance’ and renamed ‘guidance on storage and access technologies’.
Winter (2025/26)
■ Direct marketing and Privacy and Electronic Communications Regulations guidance – update to existing guidance
■ Complaints procedures – new guidance for organisations on how to handle data protection complaints
■ Lawful basis of recognised legitimate interests – new guidance
■ Legitimate interests – update to existing guidance
■ International data transfers guidance – update to existing guidance
■ Cookies & similar technologies (Part 2) – (‘guidance on storage and access technologies’).
■ The purpose limitation principle – updated and enhanced guidance
■ Anonymisation and pseudonymisation for research purposes – guidance
Spring 2026
■ Automated Decision Making (ADM) and Profiling – updated guidance
■ Research, archiving and statistics provision – updated guidance.
■ SME data essentials – guidance
More detail and other updates from the ICO can be found here: plans for new and updated guidance.
The ICO will also in due course be producing codes of practice on edtech and artificial intelligence.
There’s lots to watch out for and we’ll try our best to keep you up to date with developments as and when they happen.
The Data Use and Access Act (DUAA) introduces changes to the concept of legitimate interests under UK GDPR. Once provisions take effect there will be a seventh lawful basis of recognised legitimate interests and legal clarity on activities which may be considered a legitimate interest.
The DUAA amends Article 6 of UK GDPR, expanding the six lawful bases for processing to seven to include recognised legitimate interests. While a necessity test will still be required, for the following recognised legitimate interests there will no longer be a requirement for an additional balancing test (Legitimate Interests Assessment):
■ Disclosures to public bodies, or bodies carrying out public tasks where the requesting body has confirmed it needs the information to carry out its public task.
This means private and third sector organisations which work in partnership with public bodies will just need confirmation the public body needs the information to carry out its public task. This is likely to give more confidence to organisations (such as housing associations and charities) when sharing information with public sector partners.
Data Sharing Agreements, Records of Processing Activities (RoPAs) and privacy notices may need to be updated to reference recognised legitimate interests as the lawful basis where appropriate. Staff training may also need updating.
■ Safeguarding vulnerable individuals – this allows for the use of personal data for safeguarding purposes. There are also definitions given for the public interest condition of “safeguarding vulnerable individuals”, which the ICO has written more about here.
■ Crime – this allows use of personal information where necessary for the purposes of detecting, investigating or preventing a crime; or apprehending or prosecuting offenders.
■ National security, public security and defence – this allows the use of personal information where necessary for the purposes of safeguarding national security, protecting public security or defence.
■ Emergencies – this allows the use of personal information where necessary when responding to an emergency. An emergency is defined by the Civil Contingencies Act 2004 and means an event or situation which threatens serious damage to human welfare or the environment, or war or terrorism which threatens serious damage to the security of the UK.
The ICO is planning to publish guidance on recognised legitimate interests over Winter 2025/26. For a timeline of when we can anticipate other DUAA related guidance from the ICO see DUAA – Next Steps.
There are some examples of activities which may be considered a legitimate interest in the recitals of UK GDPR. As such they provide an interpretation of the law but are not legally binding. The DUAA moves the following examples of legitimate interests from the recitals into the body of the law:
■ direct marketing
■ intra-group sharing of data for internal administrative purposes, and
■ processing to ensure network and information security.
This may give organisations more confidence when relying on the lawful basis of legitimate interests; however, unlike recognised legitimate interests, the above will still be subject to a Legitimate Interests Assessment.
The core rules under the Privacy & Electronic Communications Regulations (PECR) are not changing – unless you’re a charity wishing to benefit from the ‘soft opt-in’. For direct marketing, legitimate interests will still only be an option for activities which don’t require specific and informed consent under PECR.
An update to both the ICO’s Legitimate Interests Guidance and PECR guidance is expected in Winter 2025/26.
One of the fundamental data protection principles is that our handling of personal data must be ‘lawful, fair and transparent’. To be lawful, clearly, we shouldn’t do anything illegal in general terms. But what else does it mean to be lawful?
We’re given six lawful bases to choose from under UK/EU GDPR. For each purpose we use personal data for, we need to match it with an appropriate lawful basis.
For example, a purpose might be delivering a product a customer has purchased, or sending marketing emails.
We need to select the most appropriate lawful basis and meet its own specific requirements. Each basis is equally valid, but one may be more appropriate than others for any specific task. We’re legally obliged to set out the lawful bases we rely on in our privacy notices.
If none of them seem to work, you may want to question whether you should be doing what you’re planning to do.
(This is not intended to be exhaustive, do check the ICO’s Lawful Basis Guidance)
Contract
This lawful basis will be appropriate if you need to process an individual’s personal information to deliver a service to them, or you need to collect certain details to take necessary steps before entering into a contract or agreement.
Example 1: An individual purchases a product from you and you need to handle specific personal information about them in order to deliver that product, including when you acknowledge their order, provide essential information, and so on.
Example 2: Someone asks you to give them a quote for your services, and you need certain information about them in order to provide that quote.
Contract tips:
Legal obligation
There may be circumstances where you are legally obliged to conduct certain activities which will involve processing personal data. This could be to comply with common law or to undertake a statutory obligation.
Example 1: You are offering a job to someone from outside the UK. You need to check they have the right to work in the UK, as this is a legal obligation.
Example 2: Airlines and tour operators collect and process Advance Passenger Information (API), as this is a legal requirement for international air travel.
Legal obligation tips
Vital interests
You can collect, use or share personal data in emergency situations, to protect someone’s life.
Example: A colleague collapses at work, is unable to talk, and you need to tell a paramedic they have a medical condition. Common sense should prevail.
Vital interest tips
Public task
You can process personal data if necessary for public functions and powers that are set out in law, or to perform a specific task in the public interest.
Most often this basis will be relied upon by public authorities and bodies, but it can apply in the private sector where organisations exercise official authority, or carry out tasks in the public interest.
Public task tips
Legitimate interests
This is the most flexible lawful basis, but don’t just assume what you’re doing is legit. It’s most likely to be appropriate when you use people’s data in a way they’d reasonably expect, where there is minimal impact on them, or where you have a compelling justification.
Legitimate interests must be balanced: you must weigh the organisation’s interests against the interests, rights and freedoms of individuals. If your activities are beyond people’s reasonable expectations, or would cause unjustified harm, their rights and interests are likely to override yours. For more, see: Legitimate interests – when it isn’t legit.
Legitimate Interests tips
Important note: In June 2025 the UK Data (Use and Access) Act introduced a new lawful basis for processing into the UK GDPR. This lawful basis of ‘recognised legitimate interests’ can be relied upon by organisations for specific purposes without being required to conduct a balancing test (i.e. a Legitimate Interests Assessment). The list of recognised legitimate interests includes the following (and may be expanded):
■ Disclosures to public bodies, where it is asserted personal data is necessary to fulfil a public function.
■ Disclosures for national or public security or defence purposes, emergencies.
■ Disclosures for prevention or detection of a crime, and safeguarding vulnerable individuals.
Consent
This is when you choose to give individuals a clear choice over whether their personal details are used for a specific purpose, and they give their clear consent for you to go ahead. The law tells us consent must be a ‘freely given, specific, informed and unambiguous’ indication of someone’s wishes given by a ‘clear affirmative action’.
Consent is all about giving people a genuine choice and putting them in control. They must be able to withdraw their consent at any time, without a detrimental impact on them. For more, see: Consent, getting it right.
Consent tips:
In summary, consider all the purposes you have for processing personal data. Assign a lawful basis to each purpose and check you’re meeting the specific requirements for each basis. Tell people in your privacy notice the lawful bases you rely on, and specifically explain your legitimate interests.
Finally, don’t forget, if you’re processing special category data (for example data revealing racial or ethnic origin, health data or biometric data) you’ll need a lawful basis, plus you’ll need to meet one of the conditions under UK GDPR Article 9. For criminal convictions data you’ll need a lawful basis, plus one of the conditions under UK GDPR Article 10.
The Data Use and Access Act 2025 received Royal Assent on 19 June. Implementation of the new law will commence in phases with most provisions expected to come into force within two to six months, while some may take up to a year.
The key objectives of the DUA Act involve enabling data sharing and the introduction of digital verification schemes. Alongside this, we’ll see amendments to UK GDPR, the Data Protection Act 2018 and the Privacy & Electronic Communications Regulations (PECR). The level of impact will very much depend on your sector and data processing activities.
No radical shake-up
While significant, this legislation does not usher in radical changes and organisations do not face a big shake up of their approach to data protection compliance. This is not GDPR 2.0. The fundamental principles and obligations for data protection remain unchanged. We predict it will be business as usual for the majority of organisations, with some changes here and there.
Time to prepare
While limited provisions may take immediate effect, there will be time to prepare before the majority of provisions take effect, possibly up to 15 months after the law is enacted. The precise timescales have yet to be published, and we’d advise keeping abreast of developments, and ICO guidance as it comes out. Nothing needs to be done right away.
AI transparency and copyright not included
It’s worth noting the House of Lords lost its battle on AI. A key sticking point, which stalled progress of the Bill until now, was the Lords introducing successive amendments to transparency requirements for data used to train AI models, and the use of copyright materials to train AI. In the end these attempts failed, but an agreement was reached with the Government to publish a report on copyright and AI proposals in the coming months.
UK GDPR currently places strict restrictions on automated decision-making (including profiling) which produces legal or similarly significant effects. This will be relaxed so it only applies to automated decisions using special category data. For any other personal data, there will be a requirement to put in place certain safeguards, such as giving individuals the ability to contest decisions and request human intervention.
This change will give organisations more flexibility to make automated decisions using personal data (but not special category data), for example when utilising AI systems. To prepare for this change, re-assess your use of solely automated decision-making and review relevant processes and policies.
As part of the recently launched ICO AI and Biometrics Strategy, the regulator has committed to:
■ updating its guidance on automated decision making (ADM) and profiling by autumn 2025
■ a public consultation on this updated guidance
■ developing a statutory code of practice on AI and ADM
Provisions to be introduced on DSAR handling give a statutory footing to existing ICO guidance. In practice this is unlikely to mean any significant changes if you’re already following regulatory guidance, but it does give a degree of extra confidence by being written into UK law. The key points are:
■ the timescale for responding within one calendar month does not start until the organisation is satisfied the requester is who they say they are
■ when seeking clarification, the clock can be paused while awaiting the individual’s response (see the short sketch after this list)
■ organisations can conduct a “reasonable and proportionate” search for personal data.
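Purely as an illustration of the deadline arithmetic, here is a minimal Python sketch of how the one-month clock and the clarification pause might be modelled. The function names, the month-end clamping rule and the example dates are my own assumptions for illustration only, not anything set out in the Act or in ICO guidance.

```python
from datetime import date, timedelta
import calendar

def add_one_calendar_month(start: date) -> date:
    """Add one calendar month, clamping to the month's last day (e.g. 31 Jan -> 28/29 Feb)."""
    year = start.year + start.month // 12
    month = start.month % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def dsar_due_date(identity_confirmed: date,
                  clarification_pauses: list[tuple[date, date]]) -> date:
    """Hypothetical due-date calculation for a DSAR response.

    - The one-month clock starts once the organisation is satisfied the
      requester is who they say they are (identity_confirmed).
    - Days spent waiting for an answer to a clarification request
      (each pause given as (asked_on, answered_on)) are added back on.
    """
    due = add_one_calendar_month(identity_confirmed)
    for asked_on, answered_on in clarification_pauses:
        due += timedelta(days=(answered_on - asked_on).days)
    return due

# Example: identity confirmed 3 March; clarification asked 10 March, answered 17 March.
print(dsar_due_date(date(2026, 3, 3), [(date(2026, 3, 10), date(2026, 3, 17))]))
# -> 2026-04-10 (3 April, pushed back by the 7-day pause)
```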
When withholding information is based on legal professional privilege or client confidentiality, a new requirement will mean organisations have to explicitly inform individuals about the specific exemption being applied and the reasons. Individuals will also have the right to request the ICO reviews how these specific exemptions have been applied.
To prepare, you can start to review your current DSAR procedure and, if relevant, plan how to update response templates to include more explicit information and bolster the internal documentation used to justify reliance on these exemptions.
The obligation to provide privacy information to individuals (e.g. under Article 14, UK GDPR) will not apply if providing this information “is impossible or would involve disproportionate effort”.
This is most likely to be particularly relevant where organisations have gathered personal data indirectly, i.e. not directly from the individuals. This was a point of contention in the Experian vs ICO case, where Experian argued it would be disproportionate effort to notify and provide privacy information to the millions of people whose data they process from the Edited Electoral Roll.
The legislation includes a new right for individuals to raise complaints related to use of their personal data. These new rules will require controllers to make sure they have clear procedures to facilitate complaints, including providing a complaint form. Complaints will require a response within 30 days. Alongside this, certain organisations may also be obligated to notify the ICO of the number of privacy-related complaints they receive during a specified time period.
Some sectors, such as financial services and those which fall in scope of FOI requests, are already obliged to have complaints procedures in place to meet their legal obligations. These may need adapting to cover these new requirements while for others, procedures will need to be put in place. Privacy notices will also need to be updated to reflect this change.
“The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest”. This not insignificant line currently rests in a GDPR recital, and as such it’s not legally binding and simply provides a helpful interpretation of the law. However, under the DUA Act it will be unambiguously set in stone that legitimate interests is an acceptable lawful basis for direct marketing purposes.
While there are concerns this will lead to more ‘spam’ marketing, I’d stress the direct marketing rules under PECR will still apply, so legitimate interests will only be an option when the law doesn’t require consent.
The concept of ‘recognised legitimate interests’ is to be introduced, whereby organisations will not be required to conduct a balancing test (i.e. Legitimate Interests Assessment) when relying on this lawful basis – but only for specific, recognised purposes. The list of recognised legitimate interests includes the following (and may be expanded):
■ Disclosures to public bodies, where it is asserted personal data is necessary to fulfil a public function.
■ Disclosures for national or public security or defence purposes, emergencies.
■ Disclosures for prevention or detection of a crime, and safeguarding vulnerable individuals.
In preparation, you can start by reviewing processing activities which rely on legitimate interests and assess if any will become ‘recognised’. I can see this being particularly helpful for private and third sector organisations which have direct relationships with public bodies involving the sharing of personal data.
The use of the ‘soft opt-in’ exemption to consent for electronic marketing will be extended to charities. This means charities will be able to provide supporters and donors with an ‘opt-out’ mechanism rather than an ‘opt-in’ to marketing emails (and/or SMS), as long as the following specific conditions are met:
■ The sole purpose of the direct marketing is for the charity’s own charitable purpose(s)
■ Contact details were collected when the individual expressed an interest in the charity’s purpose(s) or offered or provided support to further the charity’s purpose(s).
■ An opportunity to refuse/opt-out is given at the point of collection, and in every subsequent communication.
To prepare, charities can consider whether they wish to switch from consent, and assess whether this will be relatively straightforward to implement in practice. For more, see: Pros and cons of the ‘soft opt-in’.
The DUA Act extends the exceptions to consent beyond ‘strictly necessary’ to cover other specific types of ‘low risk’ cookies and similar technologies. The exemption will be permitted for certain statistical purposes and for optimising website appearance, as long as clear information is provided and users are given a straightforward ability to opt out.
Alongside these changes under the DUA Act, the ICO is reviewing PECR consent requirements to, in its words, “enable a shift towards privacy-preserving advertising models”. This autumn, a statement is expected on ‘low risk’ advertising activities which in the ICO’s view are unlikely to cause harm or trigger enforcement action. You can read more about this in the ICO’s package of measures to drive economic growth.
In preparation, a cookie audit can be conducted to identify which of the cookies you use may qualify as ‘low risk’, so you can prepare to update your consent management platform (CMP) and the cookie information you provide.
Fines for infringements of the Privacy & Electronic Communications Regulations, which govern electronic direct marketing, cookies and similar technologies, are set to significantly increase.
Currently the maximum fine under PECR is capped at just £500k. The limits will be brought in line with the much more substantial fines which can be levied under UK GDPR – up to a maximum of £17,500,000, or 4% of the organisation’s total annual worldwide turnover from the preceding financial year, whichever is higher.
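To make the new cap concrete, here is a minimal worked example (a rough sketch only; the turnover figures are invented for illustration):

```python
def max_pecr_fine(annual_worldwide_turnover_gbp: float) -> float:
    """Higher of the fixed cap (GBP 17.5m) and 4% of total annual worldwide turnover."""
    FIXED_CAP = 17_500_000
    return max(FIXED_CAP, 0.04 * annual_worldwide_turnover_gbp)

# Illustrative figures only:
print(max_pecr_fine(100_000_000))    # 4% = £4m, so the £17.5m cap applies
print(max_pecr_fine(1_000_000_000))  # 4% = £40m, which exceeds the cap and applies instead
```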
Bear in mind the ICO issues more fines under PECR than under UK GDPR or the DPA 2018, so the message is clear: make sure you comply with the PECR rules, as the cost of enforcement action could be far higher.
It’s also worth noting what constitutes ‘spam’ is to be extended to include emails and text messages which are sent, but not received by anyone. This will mean the ICO will be able to consider much larger volumes in any enforcement action.
Currently, UK GDPR makes it tricky to reuse personal data for new purposes, and the DUA Act aims to make this slightly easier by listing specific compatible purposes for which organisations will not need to undertake a compatibility assessment.
There are detailed changes in relation to scientific research. To briefly summarise, the definition of ‘scientific research’ is to be clarified and will explicitly state research can be a commercial or non-commercial activity. Consent for scientific research is to be adapted, in part driven by a desire to make it easier for personal data collected for specific research to be reused for other scientific research purposes.
When assessing appropriate ‘technical and organisational measures’ in relation to online services likely to be accessed by children, organisations will be legally obliged to take account of how children can best be protected from the design phase onwards, recognising that children merit additional protection and have different needs at different ages and stages of development. Such measures strengthen the need to adhere to the UK Children’s Code.
The DUA Act will give the Government the ability to pass secondary legislation to enable business data sharing. The aim is to implement Smart Data schemes to grow the UK economy, encourage competition and benefit consumers. Currently we have data sharing models for open banking, and the plan is for similar models to be extended to other sectors such as telecoms, healthcare, insurance and energy.
The Act will create a framework to enable the introduction of trusted digital verification services. The idea is people will be able to prove their identity via trusted digital identity providers, without having to provide a physical form of ID or other documentation.
Digital ID verification has been adopted successfully by certain businesses, but take up is patchy and the Government is keen to accelerate progress. It’s hoped this new framework will simplify processes such as registering births and deaths, starting a new job, and renting a home.
The Information Commissioner’s Office is set to be replaced by an Information Commission, which will be structured in a similar way to the FCA, OFCOM and the CMA – as a body corporate with an appointed Chief Executive. It’s anticipated this change will come into effect in 2027.
The DUA Act will be carefully scrutinised by the European Commission when it reviews adequacy decisions for the UK. These currently allow for the free flow of personal data between the EEA and UK, without the need for additional risk assessments or safeguard measures. The outcome of the EC review of these decisions is expected in December 2025. It’s hoped there’s nothing to scare the horses and UK adequacy will be renewed. Nonetheless, this is one to watch.
In summary, although reform has its critics, the changes to be introduced by the DUA Act are not overly dramatic. More detail and regulatory guidance will gradually become available, and I’d stress there’s no need to do anything immediately. Over the coming months we’ll be sure to keep you updated on developments.
When GDPR came into force more than seven years ago, it made it mandatory for certain organisations to appoint a Data Protection Officer (DPO) – certainly not all organisations. As a result there are more than 500,000 organisations with Data Protection Officers registered across Europe, according to IAPP research.
But even after so long, a good deal of confusion remains about which organisations need to appoint a DPO, and what the role actually entails. The DPO isn’t just a title you can dish out to whoever you choose.
The law tells us organisations must appoint a DPO if they are a Controller or a Processor and any of the following apply:
■ you’re a public authority or body (except for courts acting in their judicial capacity); or
■ your core activities require large-scale, regular and systematic monitoring of individuals (for example, online behaviour tracking); or
■ your core activities consist of large-scale processing of special categories of data or data relating to criminal convictions and offences.
This raises questions about what’s meant by ‘large-scale’ and what happens if your organisation falls within the criteria above but fails to appoint a DPO. When it comes to interpreting ‘large-scale’ activities, the European Data Protection Board Guidelines on Data Protection Officers provide some useful examples.
Despite the previous Conservative government’s data reform proposals including the removal of the DPO role, I should stress these requirements remain unchanged under the now-enacted Data (Use & Access) Act.
Many small to medium-sized organisations won’t fall within the set criteria for mandatory appointment of a DPO. For many organisations, their processing is neither ‘large scale’ nor particularly sensitive in nature.
The ICO tells us all organisations need to have ‘sufficient staff and resources to meet the organisation’s obligations under the UK GDPR’. So, if you assess you don’t fall under the mandatory requirement, you have a choice:
■ voluntarily appoint a DPO, or
■ appoint an individual or team to be responsible for overseeing data protection. You can take a proportionate approach, based on the size of your organisation and the nature of the personal data you handle.
Many organisations don’t realise the law sets out the DPO’s position and their specific responsibilities. If you have a DPO, their responsibilities are not optional or up for debate. The law tells us DPOs must:
■ report directly to the highest level of management
■ be an expert in data protection
■ be involved, in a timely manner, in all issues relating to data protection
■ be given sufficient resources to be able to perform their tasks
■ be given the independence and autonomy to perform their tasks
It’s worth stressing appointing a DPO places a duty on the organisation itself (particularly senior management), to support the DPO in fulfilling their responsibilities. As you can see above, this includes providing resources, and enabling independence and autonomy.
Not just anybody can be your DPO. While they can be an internal or external appointment, and one person can represent several different organisations, steps should be taken to make sure there are no conflicts of interest. A CEO or a Head of Marketing acting as the DPO are obvious examples of where a conflict could easily arise.
The law sets out that the DPO must perform their role in an independent manner. Their organisation shouldn’t influence which projects they get involved in, nor interfere with how they execute their role. A DPO therefore needs to be someone of character and resilience who can stand their ground, even in the face of potential conflict.
When it comes to being an ‘expert’, there’s a judgement call to make, as the law doesn’t specify particular credentials or qualifications. The level of experience and specialist skills can be proportionate to the type of organisation and the nature of the processing.
The formal set of tasks a DPO is required to perform is as follows:
■ inform and advise the organisation and its employees about their obligations under GDPR and other data protection laws. This includes laws in other jurisdictions which are relevant to the organisation’s operations.
It’s worth noting the DPO is an advisory role, i.e. to advise the organisation and its people. Their role is not to make decisions on the processing activities. There should be a clear separation between advisor and decision-maker roles. The organisation doesn’t need to accept the advice of their DPO, but the DPO would be wise to document when their advice is ignored. In many smaller organisations people may undoubtedly be spinning multiple plates and will need to do some (or plenty) of the ‘doing’ work.
■ monitor the organisation’s compliance with the GDPR and other data protection laws. This includes ensuring suitable data protection policies are in place, training staff (or overseeing this), managing data protection activities, conducting internal reviews & audits, and raising awareness of data protection issues & concerns so they can be tackled effectively. This doesn’t mean a DPO has to write every data protection related policy, or stand up and deliver training.
■ advise on, and monitor, data protection impact assessments (DPIAs).
■ be the first point of contact for individuals in relation to data protection and for liaison with the ICO.
A DPO must also be easily accessible, for individuals, employees and the ICO. Their contact details should be published, e.g. in your privacy notice (this doesn’t have to include their name) and the ICO should be informed you’ve appointed a DPO.
A DPO shouldn’t be penalised for carrying out their duties. The ICO points out a DPO’s tasks cover all the organisation’s processing activities, not just those which triggered the requirement to appoint a DPO, such as ‘large scale processing of special category data’. However, the ICO accepts a DPO should prioritise and focus on riskier activities. See the ICO’s Data Protection Officer Guidance.
We’d always advise making sure a DPO’s responsibilities are clearly set out in a job description, to save any debate about the role. It’s helpful to make sure the management team and key stakeholders are briefed on the DPO’s legal role.
What’s clear is being a DPO requires many qualities, and a broad skill set, which we’ve written more about here: What does it take to do the job?