• Christina J. Colclough

Your Digital Selves' Rights

In this article for the Danish Insurance Union, we zoom in on how the many digital identities formed about us as workers and citizens can harm career opportunities and the quest for diverse and inclusive labour markets if unions don't start pushing back now.


Read the full Danish article here and via images below. Read the original English version below.


Your digital selves' rights

What happens to all of the photos, Google Maps searches, social media posts, workplace productivity scores, evaluations and “profiles” you have made and been subjected to as you get older, and even when you die?


This might sound like a really odd question, but think about it for a second. As we discussed in the article It’s Not Just About You, all of your data actually has an impact on other people - not just you. Currently, once data has been gathered and used in algorithms or inferences (remember those often damning profiles that manipulate our life and career chances, and those of others too), it’s out there. It gets replicated, shared, sold, rebundled with other information and sold again. Without a “data life length” - it can live on forever, even when you are gone. This raises many questions: Will we forever be judged against things we did when we were young? Will all of our data profiles continue to affect the life chances of others, even when we are dead? Will your work life opportunities as you grow older be limited by whether you had more sick days than average when you were in your 30s, or have been overweight relative to the norm since you were 40? Will an employer still consider you worth investing in?

We need to ask these questions because our digital selves (yes, we potentially have many selves), even those we have no idea have been created, have not necessarily been given a natural “out-of-date” or “out-of-life” stamp.


When we think about it, all of this is really problematic. Your career chances as you grow older can be limited by how others like you, now long passed away, managed to perform. Were they slow - too slow? Were they less adaptable? Softer, and therefore more suitable for customer call centres than high-speed trading or data crunching? Age discrimination happens. On DR P1 on Monday April 26, a headhunter was interviewed. He reported that he had numerous times been asked by clients to screen applicants according to their gender and/or age.


I have raised a number of warning flags that we need to take seriously in this datafied world of ours. In articles and blogs on the Why Not Lab I have offered some solutions and especially highlighted the key role of trade unions and collective bargaining in turning the tide and making sure our digitalised work life is inclusive, diverse and respectful of our fundamental rights. The Danish Insurance Union has now asked me to suggest some concrete solutions and policies that unions would benefit from exploring. I will do so in this article and two more to come.


Now back to the rights of your digital selves. How can we avoid being subject to manipulation based on old data, or data from folks now long gone? How can we ensure that our lives don’t continue to affect others when we ourselves have passed away?


The GDPR gives us a helping hand

The European General Data Protection Regulation is founded on seven guiding principles, described in GDPR Article 5. A key principle is that of data minimisation (Article 5(1)(c)):

Personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’);

What Article 5 says is that data controllers may collect only the data necessary for the given purpose, use it only for that purpose, and store it for no longer than necessary.

If respected, this will indeed help us prevent our digital selves from affecting others - and ourselves - for eternity. However, all is not that simple. Do we actually know what data processing is taking place? Although companies are obliged to tell you, do they? You have a right to be informed about:

  1. The collection of data

  2. How the company plans to use the data

  3. The reason why they are collecting the data

  4. Whether the purpose could be achieved without collecting the data

  5. How long the data will be stored to fulfil the purpose

  6. Whether the data is periodically reviewed against the five points above and deleted if required

In accordance with Article 16 of the GDPR, you also have the right to have the personal data collected on you corrected, and Article 17 additionally gives you the right to have the data erased if it is no longer necessary for the purpose for which it was collected.

Article 17 is key to preventing data and data inferences from living on forever. Ensuring compliance with this article is therefore really important for union policies on the right to a long working life.


What should the union do?

This gives your union, which can be mandated to represent you (see Article 80 GDPR), many possibilities. It should:

  1. Ensure that the workers have been informed about all data collection (points 1-6 above);

  2. Help you, through “data subject access requests”, exercise your right to correct the data held on you;

  3. Obtain from your employer a record of whether and how the company complies with the GDPR’s seven principles - including the one on how long data can be stored;

  4. Check that the company complies with Article 35 on data protection impact assessments (DPIA). As the processing of personal data at work is a “high-risk” operation, companies are actually obliged to consult with the workers when writing the DPIA. I know of just two examples of this actually happening.

Are we ok then?

No. Whilst all of the above measures can really help protect our rights and prevent the eternal influence of out-of-date data and data profiles, we are faced with additional challenges. Firstly, the geographical scope of the GDPR is established in Article 3. Whilst this offers broad protection of our rights, the world extends beyond the GDPR’s boundaries. Algorithmic systems can be trained on data from other countries and regions, and can therefore indirectly influence outcomes even when the GDPR is complied with.


Secondly, the GDPR poorly defines our rights in relation to data profiles we are subjected to but that have nothing directly to do with our own personal data. As mentioned in my article in Forsikring-1 2021, unions have an important role here. Let’s imagine an automated hiring system that your company has bought from a company in the United States. Maybe this system has been trained on data and data inferences from segments of workers in the US. It is then instructed to sort applicants according to certain phrases, words, experiences and characteristics. Now what if the algorithm has “learnt” that someone with a particular education, from a particular decade and of a particular gender is most likely not to stay with the company for long? If you are an applicant with similar characteristics, do you think you will make it to the interview? Most probably not.
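To make the mechanism concrete, here is a deliberately simplified sketch in Python. It is not any real vendor's system: the data, group labels and tenure threshold are all invented for illustration. It shows how a screening rule “learnt” from historical records can silently exclude a new applicant who merely resembles past short-tenure employees - without ever looking at that applicant's own record.

```python
from collections import defaultdict

# Hypothetical historical records a vendor might train on:
# (education_decade, gender, tenure_in_years)
history = [
    ("1990s", "F", 1.5),
    ("1990s", "F", 2.0),
    ("2000s", "M", 8.0),
    ("2000s", "M", 7.5),
]

# "Learn" the average tenure per (decade, gender) group - a crude
# stand-in for the opaque inference a real model might draw.
totals, counts = defaultdict(float), defaultdict(int)
for decade, gender, tenure in history:
    totals[(decade, gender)] += tenure
    counts[(decade, gender)] += 1
avg_tenure = {key: totals[key] / counts[key] for key in totals}

def shortlist(applicant, threshold=3.0):
    """Reject anyone whose group profile has a low average tenure.

    Note: the decision uses only the group the applicant resembles,
    never the applicant's own history.
    """
    key = (applicant["education_decade"], applicant["gender"])
    return avg_tenure.get(key, threshold) >= threshold

print(shortlist({"education_decade": "1990s", "gender": "F"}))  # False: screened out
print(shortlist({"education_decade": "2000s", "gender": "M"}))  # True: proceeds
```

The point of the sketch is that the first applicant never reaches the interview because of how *other* people, profiled years earlier, happened to perform - exactly the kind of inherited judgement unions need visibility into.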


This is one of the reasons why unions must demand a seat at the table in governing these algorithmic systems. You need to be in a position to ask what data the algorithm is trained on, what characteristics, words, phrases, inferences it is trained to judge as positive or negative. And you must ask: how will all of this influence the union goal of ensuring diverse and inclusive labour markets?


In conclusion

Unions simply must build capacity to tackle these important issues. Shop stewards should be trained to be the digital watchdogs in the workplace, so you can enjoy a long working life free from data-driven manipulations that prevent you and others from fulfilling your potential. Lastly, given that digital tools really don’t care about national boundaries - only law does - unions must cooperate internationally to push governments to regulate these systems globally, so all workers, no matter where they are, can enjoy the same strong rights and protections.