Christina J. Colclough
It's Not Just About You
"I've done nothing wrong, so who cares if they take my data?" If you can hear yourself think this, please keep reading. The thing is: this is not just about you. Your data says a lot about you, yes. But it can have a huge impact on the work and life opportunities of people similar to you, or the absolute opposite. We are in this together, and it's time to fight back!
I have been fortunate to address many different groups of workers across the world about digitalisation and the future of work. When the discussions turn to the issue of worker and citizen monitoring and surveillance, many recognise that this is happening. As citizens, when we use social media, our credit cards, GPS, search the internet, or play music or watch movies via streaming services, we are essentially feeding companies massive amounts of data: when during the day you turn on the streaming services, where you go and where you don't, what you typically do and what you don't. The moment you have your smartphone in your hand or tucked away in your pocket, you are being leeched for even more data.
Data Extraction at Work
As Covid-19 rips through our societies, the market for employee monitoring and surveillance software has boomed. From surveilling which websites you visit during working hours (maybe even beyond), to keyboard clicks per minute, to who you connect with via video software, to your GPS location...the list seems endless. All these surveillance tools are sold to companies with the promise to 'increase productivity and efficiency' (I have written and spoken a lot about what rights workers should have, but do not yet have, with regards to this data, e.g. article here and speech here). The focus here is on how all of this data extraction can influence your life and career opportunities, but importantly also those of people similar to you, or very dissimilar to you. Sounds cryptic? Keep reading...
All of this data that gets extracted from you as you work gets analysed by more or less elaborate digital/data systems. Oftentimes, companies hire 'People Analytics' (PA) experts - a new kind of specialist function aimed, according to Heuvel & Bondarouk (2016), at "..the systematic identification and quantification of the people drivers of business outcomes."
What People Analytics is aimed at doing is:
Measuring your activities (performance) against a standard, or norm, or statistical average
Finding correlations between your actions (and non actions) and productivity/efficiency outcomes
Predictive analysis - for example, based on prior data correlations, they will estimate what your next move or behaviour will most likely be. Will you, since you are 31 years old and married, soon have a child? Will customers "trust you" given your postcode (an indication of your social standing), your predicted accent and your gender?
Or they do simple analytics - effective work time relative to age, gender, education, skills..
In many ways, People Analytics at work is the source of much of the harm done with your data. As you can hear in my speech below, given to a large number of People Analytics experts, we should regard them as the gatekeepers of the ethical or moral use of data. They should be asking: is this data inference morally defensible? Should we be measuring the link between ethnicity, postcode and productivity? They are the ones who should be pushing for strong governance policies over the data, and even stronger redlines as to who can access and even buy these datasets.
People Analytics experts do not hold the end responsibility, however. Executive management does. They are the ones asking the PAs to do what they are doing. They are also the ones buying in the surveillance tools. The thing is this:
"Very very few companies I have spoken to actually have governance mechanisms in places aimed at safeguarding your privacy rights, your human rights as well as the privacy and human rights of all of those affected by the inferences drawn on you."
Ban markets in human futures
Shoshana Zuboff, the author of The Age of Surveillance Capitalism, is calling for a ban on markets in human futures. We should echo her. These markets are turning everything we do, and everything we don't do, into objects that can be fed into behaviour analysis and predictive analytics. These analyses shape and form your life and work opportunities. And they shape the opportunities of people you will most likely never meet.
Just think about your life as it has unfolded. The many chance moments, the coming together of unrelated things that opened the door for you to walk through. The ups and downs. Now consider that some of the opportunities presented to you might never have happened had you been on the receiving end of an algorithmic inference. It might deem you unfit for a certain job, ineligible for a mortgage, or your child unsuited for university because of some statistical inference you will never know about.
We must demand that markets in human futures are banned. Yes, streaming services and their recommendations are handy, but they are also manipulative. These systems are manipulative. For you. And for those who will be affected by your actions and non-actions. In Tim Wu's words, we must stop and reflect upon whether we have fallen for the Tyranny of Convenience, the shortsightedness of the here and now, at the expense of all forms of morality and human decency.
So next time you press like on one social media post and not on another, realise that those actions are affecting you and others. Realise that your accent, your postcode, your skills, education, gender and age are not irrelevant once you have a job. You are becoming a commodity: your personal life and your professional behaviour alike. And as long as we do not stand together to emancipate ourselves from the manipulations we are subject to, we as people, with all of our beauties, ups and downs, irrationalities and dreams, will soon become irrelevant.