Christina J. Colclough
Are you involved? Data Protection Impact Assessments at Work
This blog argues that workers in the GDPR zone have a potentially very powerful tool at their disposal for governing and overseeing the use of technologies in workplaces. That is, assuming the GDPR's provisions are followed. The problem is that employers seldom follow them, so unions will need to push for, and exercise, their rights.
Under the European General Data Protection Regulation (Article 35), organisations are obliged to conduct a Data Protection Impact Assessment (DPIA):
"Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.
In plainer language, this means that a DPIA must be made when the processing of data is likely to result in a high risk to individuals. The European Data Protection Board (EDPB) elaborates, recommending that each country make reference to, and follow, the Working Party Guidelines regarding DPIAs, and require a DPIA if any two of the following nine criteria are present:
evaluation or scoring (which would include employee performance evaluations and applicant evaluations);
automated decision making;
systematic monitoring;
sensitive data or data of a highly personal nature;
data processing on a large scale;
matching or combining data sets;
processing data of vulnerable subjects, which include children, the elderly, and employees;
innovative use or application of technological or organizational solutions, such as using fingerprints or facial recognition for physical access control; and
processing that “prevent[s] data subjects from exercising a right or using a service or a contract.”
It is not hard to meet two of the nine criteria with many workplace technologies. Point seven is a given, since we are talking about employees. Automated or semi-automated employee performance evaluations and automated or semi-automated hiring systems are also becoming more widespread. Systematic monitoring has been reported to be on the rise, accelerated by the Covid-19 crisis (https://t.co/RZSgXsTKsF?amp=1, https://www.bloomberg.com/news/features/2020-03-27/bosses-panic-buy-spy-software-to-keep-tabs-on-remote-workers). And the threshold can be reached simply when new technologies are used to collect, process, and manage data, which indeed most digital tools do.
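To make the "any two of nine" rule of thumb concrete, here is a minimal sketch in Python. It is purely illustrative and not part of the Guidelines; the criterion labels are my paraphrases of the list above.

```python
# Purely illustrative sketch: tallying the EDPB/Working Party criteria.
# The labels below paraphrase the nine criteria listed above.

EDPB_CRITERIA = {
    "evaluation or scoring",
    "automated decision making",
    "systematic monitoring",
    "sensitive or highly personal data",
    "large-scale processing",
    "matching or combining data sets",
    "vulnerable data subjects (includes employees)",
    "innovative technological or organisational solutions",
    "processing that blocks a right, service or contract",
}

def dpia_indicated(applicable: set) -> bool:
    """Rule of thumb from the Guidelines: a DPIA is required
    when two or more of the nine criteria are present."""
    return len(applicable & EDPB_CRITERIA) >= 2

# Example: an employer rolls out remote-monitoring software for staff.
planned = {
    "systematic monitoring",
    "vulnerable data subjects (includes employees)",
}
print(dpia_indicated(planned))  # -> True: a DPIA should be carried out
```

Of course, the legal assessment is richer than a checklist, but the point stands: workplace systems cross the threshold very quickly.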
Two important (but oft overlooked) requirements
So we have established that DPIAs will have to be made by employers, and quite regularly at that. We have also established that DPIAs must be conducted when the processing involves employee data or when new digital technologies are introduced.
There are, however, two more requirements that all workers, shop stewards and unions should pay particular attention to, namely: 1. that your views should be sought, and 2. that DPIAs should be revisited and re-evaluated periodically.
Let's take point 1 first:
The Guidelines stipulate that the data controller (the employer in the case of workplaces) must:
“seek the views of data subjects or their representatives” (Article 35(9)), “where appropriate”.
The Working Party considers (page 15) that:
- those views could be sought through a variety of means, depending on the context (e.g. a generic study related to the purpose and means of the processing operation, a question to the staff representatives, or usual surveys sent to the data controller’s future customers) ensuring that the controller has a lawful basis for processing any personal data involved in seeking such views. Although it should be noted that consent to processing is obviously not a way for seeking the views of the data subjects;
- if the data controller’s final decision differs from the views of the data subjects, its reasons for going ahead or not should be documented;
- the controller should also document its justification for not seeking the views of data subjects, if it decides that this is not appropriate, for example if doing so would compromise the confidentiality of companies’ business plans, or would be disproportionate or impracticable.
Have you ever been asked?
Now, I would not suggest that workers accept the wording "a question to the staff representatives". A DPIA requires much more than a single question, and I would urge you to push for far more. A pertinent question to ask is: have you ever been asked? It would be wrong to assume that an impact assessment is truthful in its evaluations if you are not involved. Risk is relative to the law, but also to lived experience. Risk to whom? To the workers? To their privacy rights, human rights, sense of dignity? A truthful impact assessment would involve and consider all voices relevant to what is under study.
Have you ever been party to a re-evaluation?
The second point relates to the periodic re-evaluation of DPIAs. The Working Party adds on page 20:
"periodically review the DPIA and the processing it assesses, at least when there is a change of the risk posed by processing the operation;"
A re-evaluation of an impact assessment might sound boring at first, but it is actually really important. Some AI systems or algorithms keep learning after deployment. They are self-adjusting. We have heard of the cases where a chatbot learnt to swear (bbc.com/news/technology-35890188), or an automated hiring system learnt to prefer men (shorturl.at/iouAV). So what seemed like a low risk yesterday might well be a high risk going forward. Potentially, these risks could relate to your human rights, your right to organise, your right to equal treatment. You should, indeed must, demand to be party to the re-evaluation.
Where to go from here?
In a world where our fundamental rights are at stake, the issue of governing AI and all digital technologies should be a top priority for workers and their unions. Yes, this is a new field in which we need to build expertise. But let's ask the question: what will be the consequences if we don't get involved?
We will blindly trust that management has understood our needs, fears and risk assessments. Through our inaction, we will also blindly trust an algorithm, a commodity, a thing, to do what is fair and good. But fair for whom, we must ask? Anti-discrimination law will soon be in vogue again.
Many national data protection authorities have published guidelines on how to conduct a DPIA. See the UK's site here. Informative as it is, it is weak on guidance for workers and on the involvement of workers in DPIAs. The Danish guidance does not seem to include these provisions at all.
So my call to action is to urge unions and their representatives to prioritise these issues and utilise the rights to be heard that the GDPR gives you. Companies have a duty to involve the shop stewards or workers. It's about time they were reminded.