- Breaking the cycle: why EdTech must be regulated
Written for Education International on the International Day of Education 2021, this blog argues why unions and regulators urgently need to address the privacy-invasive aspects of EdTech. Read full blog post here - excerpts below
The vicious cycle: dependency on the private sector, data control and surveillance
As public services get strapped for money, their dependency on the private sector will likely grow. When it comes to digital technologies, we are entering into a vicious cycle. Unless public procurement, outsourcing and public-private partnership agreements are radically changed, the private sector's power grab will be strengthened. It is they who hold the big data and the data analytical tools. The public sector is their dependent customer. As a result, the public sector's capacity to responsibly gather its own data and do its own analytics will either never be built or will decline. This in turn will increase its dependency on the private tech sector. We can only assume that the same vicious cycle is happening in the education sector. Educators will, like their colleagues in other sectors, become increasingly subject to the surveillance that is at the heart of all digital tools. Everything digital creates – or extracts – data. This data is combined, aggregated and turned into numerous, often opaque, probability analytics calculating the likelihood that this or that learner will succeed in mathematics, or that an educator from that area, with that gender and that age, will perform badly with large classes. These profiles, known as inferences in the tech world, will - whether we know about their existence or not - influence our personal and professional lives and the opportunities presented to us. This is why the author of "The Age of Surveillance Capitalism", Shoshana Zuboff, is fervently calling for markets in human futures to be made illegal.
Breaking the cycle: why we need EdTech regulations that put people before profit
It is understandable if you are now thinking and feeling that this is all really bad. In many ways it is. But it does not have to be. Digital tech is not necessarily born evil. But it is not necessarily born good either. It is here that we need regulation to steer digital technology in a direction where it puts people and planet before profit. EdTech could serve very good purposes. It could reach out to learners in ways that empower them. It could bring cultures and traditions together across geographical boundaries and increase our understanding of "otherness". It could support learners in need and high performers to reach their inner potential. It could help track educators' working time, the balance between their administrative and teaching time, and it could suggest new teaching methods and literature. To some extent digital tech does this already, but it often does so at the price of our privacy rights and human rights. The many inferences made do not disappear. A poorly performing child could bear that stamp for the rest of their life.
Charting the path forward: the crucial role of unions
Where does all of this leave educators and unions alike? For unions to remain powerful, they must have a seat at the table in the governance and ongoing assessment of the digital technologies in place in workplaces. They should hold leadership and authorities accountable for the privacy and human rights impact of these tools.
They should be party to an evaluation of the systems' risk profiles – which individuals or groups will, intentionally or unintentionally, be disadvantaged by the algorithm? Is the tool fair, and if so, to whom? What trade-offs are being made, and can unions accept these? The list continues, and the details of this co-governance must be sketched out. Having co-governance structures in place will ensure that educators are included in any assessments of EdTech and heard in relation to what tools they might need (EI's recent survey of education unions showed that this is currently not the case). Across all sectors in most countries, no such structures exist. An exception is Norway, whose central framework agreement in the private sector (and previously also the public sector) has for 30 years now contained a provision that allows for the creation of a data shop steward. This unique institution needs to be explored further, mirrored by others and its role expanded to include the co-governance of algorithmic systems. Another largely unexplored topic for unions would be to negotiate for much stronger workers' data rights. Even within the European General Data Protection Regulation (GDPR), workers' data rights are limited – especially with regard to the inferences we discussed above. In many other data regulation jurisdictions, workers are either entirely exempted from data protection (for example in Australia and Thailand) or, as in California, exempted until 2021. Unions need to fight back to rectify this. I speak of the need to negotiate the data life cycle at work as depicted in the figure below (Note: "DPIA" stands for Data Protection Impact Assessment - cf. GDPR, art. 35).
The Data Life Cycle at Work
Unions simply must build their capacity to meet the challenges of our digitalised economies and societies. This is no easy or quick task, which is evident in the fact that currently 68% of education union respondents to EI's survey report that they do not offer courses on the governance of digital technologies. It will require that unions pool their resources, think smartly and help one another leapfrog into a more sustainable future. Union leaders, organisers, the secretariat and the staff representatives out there need to be trained so they know the ins and outs of digital technologies. With this training in place and a strategic orientation towards the digital economy, the demands unions have for decent work, safe conditions and the respect of human rights cannot be ignored. With this survey and the lessons within, EI as a whole can take important steps towards an alternative digital ethos. One that is worker-led and that puts people and planet before anything else.
- EdTech Needs a Strong Union Response
In 2020, we worked with Education International to unravel the role of teachers' unions in the introduction of education technologies in schools and higher education institutions. The survey, covering the entire world, paints a very bleak picture of digital divides, lack of consultation with teachers, failure to protect learners' and educators' privacy rights, and more. Read the summary report here. Download full report here:
- Digital Tools for Trade Unions
Published in 2019, this report by Dr Jonnie Penn and Dr Christina Colclough provides an overview of digital tools built for, or by, workers. It provides loads of great examples of organising and campaigning apps and services that can boost your communication, outreach and impact. We divide the tools into custom outreach tools and general outreach tools. The report is a result of our scanning of digital tools that we believe can boost union outreach to members, particularly young members. Some tools are designed by unions, others not. Some are free to use, others not. Some are hugely successful, while others have only just begun their journey. The tools we present here give a taste of what is available. New apps and services are popping up all the time. The ones we present for you in this report are neither conclusive, nor exhaustive. But they are some of the best, and the most innovative. At the heart of each tool we profile is the aspiration to harness information to boost impact. We’ve asked each development team to share lessons learned along the way. You’ll find lessons like ‘co-build with users, not just for them’ and ‘scale slowly even if the world wants you to scale fast.’ Our aim is to shine a light on how pioneering groups have merged digital tools with the spirit of collectivisation. In what follows, we also outline more than a dozen off-the-shelf digital tools (many of which are free) that you can put to use today to become more organised in planning, making budgets, making presentations, or just staying in touch with members or your team. We welcome your thoughts, feedback and experiences! Have a read! This will certainly inspire you.
- The Future of AI and Human Experience
Dr Christina J. Colclough was interviewed by LG and Element AI on the future of digital technologies, the ethics of AI and the future(s) of work. See the interview with her here:
In the interview, Christina offers her opinions on why the human-centric design of artificial intelligence is crucial if we are to avoid an almost feudal future where power is concentrated in the hands of a very few companies at the expense of democracy. She mentions that if we continue down the current path of unfettered digitalisation, workers and consumers will become objects - puppets on a string - judged and manipulated. She is asked whether she is optimistic or pessimistic about the digital future and answers that the governance of technologies is crucial for building trust in digital systems and in ensuring they serve people and planet before all else. Christina presents her co-governance model for algorithmic systems and explains how it will hold developers and those deploying digital technology responsible and transparent. It crucially includes the voice of workers so management and labour together can evaluate, adjust and, where necessary, reject systems that have adverse outcomes or recommendations.
Workers' Rights
When asked whether AI can serve to make workplaces more equitable, Christina responds that it is the wrong question to ask, a dangerous one even. She warns that the question implies that AI has a free will and that we mustn't naturalise that notion. She asserts that technology can be used for good, but to make it good we must - again - govern it. She asserts that technology knows no boundaries, which is why we must talk, discuss, and form and shape these technologies so that they serve our best interests. Union busting has no place in the digital economy.
Data cooperatives
Christina wraps up the interview talking about data cooperatives, or what she calls worker data collectives - and the benefits of collectivising data access and control over the current individualised notions of data rights we know today. Christina defines the governance structure of the worker data collective, the need for defining redlines, and its purposes.
As part of the AIX Exchange, the following people were also interviewed. See all videos in the series here:
Yoshua Bengio, Scientific Director of Mila and recent Turing Award winner
Rodney Brooks, robotics entrepreneur who founded iRobot Corp., inventor of the Roomba robot vacuum
David Foster, Head of Lyft Transit, Bikes and Scooters
Dr. Yuko Harayama, who helped develop Japan's Society 5.0 roadmap as Executive Member of Japan's Council for Science, Technology and Innovation
Charles Lee Isbell Jr., Dean of Computing at Georgia Tech
Helena Leurent, Director General of global consumer rights group Consumers International
Bo Peng, Portfolio Director at renowned design agency IDEO
Jeff Poggi, Co-CEO of the McIntosh Group
Sri Shivananda, SVP and CTO at PayPal
Dr. Max Welling, VP Technologies at Qualcomm Netherlands and research chair in Machine Learning, University of Amsterdam
Alex Zafiroglu, Deputy Director at the 3A Institute (3Ai, Australian National University)
Get the report
The report discusses with these 12 world-leading experts in AI the themes of Ethics, Transparency, Public Perception, User Experience, Context, and Relationships. Find the report here
- STOA study includes our work
The European Parliament's Panel for the Future of Science and Technology has released a thorough and insightful report, "Data subjects, digital surveillance, AI and the future of work". Our Data Life Cycle at Work recommendations are included, laying out where unions should step in to fill the gaps and greatly improve workers' data rights. Abstract: "This report provides an in-depth overview of the social, political and economic urgency in identifying what we call the 'new surveillance workplace'. The report assesses the range of technologies that are being introduced to monitor, track and, ultimately, watch workers, and looks at the immense changes they imbue in several arenas. How are institutions responding to the widespread uptake of new tracking technologies in workplaces, from the office, to the contact centre, to the factory? What are the parameters to protect the privacy and other rights of workers, given the unprecedented and ever-pervasive functions of monitoring technologies? The report evidences how and where new technologies are being implemented; looks at the impact that surveillance workspaces are having on the employment relationship and on workers themselves at the psychosocial level; and outlines the social, legal and institutional frameworks within which this is occurring, across the EU and beyond, ultimately arguing that more worker representation is necessary to protect the data rights of workers." Read a few of the pages that include our work here. Download full report here:
- The Neglected Worker
Are you a worker? Experience tells me that many of you - at least for an instant - would probably have thought "no". But hang on - if you are employed, no matter at what level of expertise or education, you are a worker. You are also a worker if you are on precarious contracts (gig economy, zero-hour contracts) or you are subject to the informal economy. This means that (at least for now) the vast majority of people who work are workers. Although figures vary across the world (see OECD 2020 figure below), millions and millions of us across the world are workers.
The Neglected Worker
Yet in many future of work discussions, fora, debates and/or panels, workers are, ironically and concerningly, not invited to voice their thoughts. This is especially evident when the events are tech-orientated. No matter whether the topic is the Ethics of AI, data governance or even the automation of jobs, time and time again workers and/or their representatives (typically trade unions) are nowhere to be seen. Industry representatives are there, experts too, consumer organisations and maybe even government officials, but no workers. This is unacceptable and must be remedied. Here's a good story to exemplify why. I was an official trade union representative in the OECD's expert group that was tasked with drafting the OECD's AI Principles. They were adopted more or less as drafted by the OECD Council in May 2019. I was asked to address the Council in a 4-minute brief. Having listened to the many words of praise from other speakers, I took the floor and said: "Congratulations! You have now taken an important and necessary step towards making digital technology serve people and planet. Now you must take the next step and put practice to principle. On the principle of fairness you must ask: fair for whom? What's fair for the employers is not necessarily fair for the workers. What's fair for men is not necessarily fair for women. We need to talk - dialogue is key for a fair digital future." Several of the politicians nodded, took up the question, mumbled it: Fair for whom? The same goes for all the other principles. Accountable to whom? Explainable to whom? We can never achieve the fine principles if solutions are unilaterally sought, or if they neglect the voices of those they are all about: in the case of work, the workers. I have numerous examples of recent events that neglected the voice of the workers: UNESCO discussing their draft AI Principles and Education with EdTech companies, yet with no teachers and no teachers' unions. Or the EU roundtable on AI and the Rule of Law - again, no workers or their unions on the speakers' list. GPAI's Future of Work Working Group, which includes just one union person. Or their Data Governance Working Group, with no workers and as yet no focus on data governance at work. Yet the work I and others do, which can be read across this website, highlights how workers are becoming commodities, increasingly subject to data inferences and probability analyses, and how workers' data rights are poorly defined or even excluded from government regulations across the world. We simply cannot continue to neglect either the workers' voice or the core of digitalisation at work: inferences, data extraction and the lack of (co-)governance of algorithmic systems.
What Needs Doing
Whilst it is evident that this must change, we also need to look inward and ask how our own self-understanding is contributing to the neglect of workers. Are you a worker?
It's time that all those in employment, in contractual relations in the labour market, and all those subject to the informal economy realise that we are workers. All of us, regardless of hierarchical position, education or other forms of privilege. We should be joining forces, collectivising our responses and demanding unequivocally a seat at the digital negotiation and governance table. Change can only come about if we free ourselves from the illusion that we are somehow different from the workers out there. Or, more directly, from the illusion that workers are those who do the jobs we would rather not do. This is a misconception that has been allowed to manifest itself and has fragmented us from one another. Work is work. No matter how it is conducted and under which contractual forms (or the lack thereof), work is work and it is performed by workers. By you. By me.
- 5 App-Design Lessons
Co-authored with Dr Jonnie Penn.
Dr Jonnie Penn and I worked together with open-source app developers Guardian Project and their design team OKThanks to develop WeClock - our privacy-preserving self-tracking app for workers. Here is a rundown of our top 5 lessons for inspiration if you are considering building an app.
Technology is a means, not an end
Technology is not a silver bullet. To make effective use of digital technologies, you have to think of how they will help you reach a predetermined end, like better organising, recruiting, or collective bargaining. Technology is a means to improve your existing practices; it is not an end in itself.
Co-Design with your Users (or: Beware the Super User Fallacy!)
Picture this: you sink precious time and money into, say, developing a new smartphone app to drive recruitment. You devote months to making it perfect. Upon launch, you learn, to much frustration, that no one wants to use an app in the first place! Even if it's the perfect app, it's just not what your users want. This is an easy mistake to make. It is known by some as the Super User Fallacy. If you design your tools for the 1% of users who do exactly as you hoped they would, clicking every button and sign-up, you actually risk alienating the other 99% of users who, often, just don't care. To avoid alienating the everyday user, one solution is to co-design your project with your intended users. Ask them what they want and see how you can provide that service for them. The learning process may surprise you - it did us!
Seek out your Minimum Loveable Product
To save time and money, one useful idea we've kept in our back pocket is the notion of a Minimum Viable Product (MVP). The idea goes like this: when developing a new tool, start by considering what the lowest possible implementation level of your idea would look like. Example: do you need an app? Could a simple website or landing page solve your problem instead? Or does an off-the-shelf tool exist already that could solve your problem? (The answer, for us, was yes when it came to staging a Virtual Town Hall. Learn more about that here.) The idea goes one step further when you consider a Minimum Loveable Product (MLP). This improves on the MVP by adding that your offering should also be engaging. So, rather than just considering the lowest possible implementation level of your idea, ask also what would need to be true in order for your idea to get people talking and feeling excited about the product or service. Thinking in this direction helps you design tools that people actually want to use. Once you have an MVP or MLP ready, you can incorporate user feedback to improve, and expand, your offering.
Find the Purpose for your Service Before you Start to Build, Not After
They say that when you hear the same thing from three people you trust, it's time to start listening. In our case, a piece of feedback we heard again and again in the early stages of this project was that when you design a smartphone app or a digital tool to gather data about something (say, for example, an app that lets workers report unsafe conditions on a worksite), you must avoid collecting all the possible streams of data available just because they're there and just because you can. This is not what Facebook, Google, or Apple do. Those companies suck up every piece of data you generate, from sleep patterns to when you use the bathroom. To set a better standard, we've learned to outline our mission clearly before we start to build anything.
This makes it easy to decide what type of data to collect (e.g. photos from the worksite) and what not to (e.g. your GPS, recent calls, or the frequency with which you check your phone) - see the sketch at the end of this post for one way to make that scope explicit.
Beware Mission Creep
The astronauts who walked on the moon would likely have loved to walk on Mars too, but that level of ambition would have derailed both projects. Technology allows for so many possibilities that it is easy and intoxicating to want to try them all. Again and again, we have been told to narrow our ambitions to a reasonable, fixed list of goals. This avoids what is called 'mission creep': when your project grows in scope without end, until you and your team are flooded with new features and ambitions that end up compromising your core goals. Ambition is good! You just need to keep an eye on it...
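To make the data-scoping lesson above concrete, here is a minimal, purely illustrative sketch (not the actual WeClock configuration) of how a clearly stated mission can be turned into an explicit, default-deny data-collection scope. All stream names are hypothetical.

```python
# Illustrative only: a hypothetical allow-list/deny-list derived from the app's
# mission statement. Stream names are invented for the example.

ALLOWED_STREAMS = {"worksite_photo", "shift_start", "shift_end"}   # serves the stated mission
BLOCKED_STREAMS = {"gps_location", "call_log", "screen_unlocks"}   # tempting, but out of scope

def may_collect(stream: str) -> bool:
    """Collect a data stream only if it is explicitly on the allow-list (default deny)."""
    if stream in BLOCKED_STREAMS:
        return False
    return stream in ALLOWED_STREAMS

assert may_collect("worksite_photo")
assert not may_collect("gps_location")
assert not may_collect("microphone")  # never agreed upfront, so never collected
```

The design choice is simply that anything not agreed upfront is never collected, which keeps the tool honest to its mission.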
- A Future of Unsustainable Work?
Article written for the Danish Insurance Sector Union and published in Danish here. The below is the original English-language article.
Working from home has its obvious advantages. You get to skip the long commute times, you can organise your day more freely, and you can juggle more things at once, like putting the washing on and hanging it up. It also has its disadvantages: you can feel lonely, isolated, or stressed by not having a clear boundary between work life and private life. As a wise woman said to me recently: We are not working from home, we are living at work. COVID-19 has forced us to adapt, think differently and cope. Companies are also adapting. Not least spurred by reports from employees that working from home makes them more productive, many have now decided to let their employees work permanently from home. Facebook, Dropbox and parts of Google are taking the lead here. Some banks too - allowing traders to trade from home, albeit under a complex monitoring and surveillance system. The advantages for companies are many: an obvious one being that they can substantially reduce their expensive office space. Whilst the here and now is working for the majority, we must dare to look into the crystal ball and ask what the long-term consequences of remote work might be for work, and for our contracts and collective agreements.
The Hybrid Company
Let's dive into it. Have you noticed that the words "hybrid work" or "hybrid companies" are already creeping into the daily press? Hybrid work refers to a mix of work forms - from working from home, or remotely, to working on location. The hybrid company is one that exists virtually and only to a limited degree physically. Here office spaces are vastly reduced, probably decentralised to smaller hubs scattered across the country and/or the world. In this future, workers won't have a choice as to whether they want to work from home. You will be forced to. The thing is: who are your colleagues? And where actually are they? Nothing prevents a hybrid company from hiring remote workers from entirely different parts of the world. A job is a job and tasks need to be fulfilled. The internet sets no geographical boundaries. Your colleagues might well be in India, Latin America, the Philippines or Svendborg. For the company it doesn't matter as long as the job gets done. With AI-driven translation software, even language boundaries become less important.
The Rise of Precarious Work
Assuming this is a viable future, we then simply must ask what this will mean for our employment contracts. Why would a company continue to offer permanent, open-ended contracts to their workers? Jobs can be broken down into tasks. These tasks can then be put out there on a global labour market and given to whoever the company deems best qualified. We see this already happening in the rising number of bogus self-employed workers - not least in the gig economy. So you will be left to compete on speed, qualifications and not least price against workers from all corners of the world. It isn't hard to imagine what implications this will have for wage levels. We simply risk a race to the bottom that will put workers in more expensive parts of the world, you for example, at a huge disadvantage. In a labour market of precarious work, former colleagues will become competitors. You will be pitted against one another as you bid on the tasks or projects available. The individualisation of work will be complete. Imagine what this will mean for our mental health, our wellbeing?
Our ability to sustain ourselves? Now, I understand if you are slightly spooked at this point, and I also understand if you are thinking that your work is too important for it to be subject to this kind of a future. In a not so distant past, you might well have been right. But the rise of digital tools and their exponential growth will facilitate, for the vast majority of jobs, the transition to a boundaryless, global labour market. On a side note, I need to mention that in current digital trade negotiations, one of the proposals is to remove the requirement that companies have a physical and therefore legal presence in a country in order to sell their services there. If these negotiations succeed, the door is left wide open for, in your case, insurance companies from anywhere in the world to sell insurance in Denmark. This is a first major opening for the establishment of virtual companies. Which in turn will open up a truly global labour market... and before we know it, the crystal ball scenario will have become real.
The union response
Only strong trade union action can prevent this future scenario from becoming a reality. Firstly and immediately, shop stewards and trade unions must pay careful attention to what companies are saying about remote work, to which hires they make, and under which contractual relations. Secondly, trade unions nationally and internationally must get engaged in these digital trade negotiations and speak to their governments about them. In your case, the EU has the negotiation mandate on behalf of member states. Is the Danish government fully aware of the consequences of these negotiations? Thirdly, national unions must demand that their global federations continue to engage in the United Nations Guiding Principles on Business and Human Rights (UNGPs) to ensure that all workers across the world have decent work and wage levels. Fourthly, we should all recognise that work is work, and all workers should have the same social and fundamental rights. The rise of the gig economy and other forms of precarious work has been facilitated by out-of-date social protection regimes that make it an economic advantage for companies to make work more precarious. This has to stop. Fifthly, we need to discuss competition and innovation with the companies. Much research - my own included - proves that strong employer-employee relations, high levels of trust and dialogue facilitate innovation and learning. If work gets broken into pieces and workers get individualised, what will happen to the companies' ability to adapt and change? Whilst the hybrid company might be a money-saving model, it could also well be the beginning of the end for many companies. And lastly, we need to talk about taxation and push for new models for corporate taxation in a world of no boundaries and no physical presence. This too should be a top priority for your government. Whilst our crystal ball scenario here can be seen as overly negative and dramatic, it would be wise not to disregard it. We already have indications that it is a viable future - maybe not tomorrow, nor in a year. But soon. Download the article in Danish by clicking on the image
- New Report: Teaching With Tech
Written for the global trade union federation for teachers' unions across the world, Education International, this report sheds light on the digitalisation of education. Strikingly, teachers' training needs are poorly met, digital divides are growing, leaving the already disadvantaged in even more precarious situations, and teachers and their unions are not involved in the assessment of digital technologies. EdTech is a fast-growing industry - with all that this entails in terms of a private-sector power grab in the sector. But where does this leave the human rights and privacy rights of educators and learners alike? Who has the responsibility to check whether these digital tools are exacerbating or bridging inequalities? Are they reaching out to rich areas or poor, urban environments or rural? Are educators, with their wealth of knowledge, pedagogy and emotions, involved in the assessment of these technologies and their impact on learners? Will educators' jobs change? Become more intensified, demanding? Digital technologies are not born evil. They are not born good either. It is up to those designing, deploying, and governing them to ensure they are put to a fair, inclusive use. The survey, conducted in June, July and August of 2020, sheds light on the challenges of digitalising education. Download the full report below. Read online by clicking on the image above
- Labour - a Commodity?
In 1919 and again in 1944 world leaders agreed that labour is not a commodity. Yet today, as data is extracted from workers and they are continuously profiled, labour is being turned into a commodity. An object. We are all becoming a bundle of data points, of statistics and of probability analysis. It has to stop. First some history. In 1919, as part of the Treaty of Versailles that ended World War I, the International Labour Organisation (the ILO) was born out of the belief that universal and lasting peace can only be accomplished if it is based on social justice. Article 427 in the Treaty states: Again in 1944, in the ILO Declaration of Philadelphia, this article was reaffirmed. Article 1(a) states: (a) labour is not a commodity; Now let's fast forward to today. Workers are subjected to digital surveillance and monitoring in various forms. From location tracking, to CCTV, to systems that measure how fast they tap on the keyboard, to screen and image captures that check what they are doing, and indeed whether they are sitting in front of the PC or not. Take Microsoft's Office 365, which turns on the dashboard for employee monitoring by default (see screenshot from https://twitter.com/WolfieChristl below), or Amazon's surveillance of workers and its engagement of union busters Pinkerton in Europe, as reported in Vice on November 23: Internal emails sent to Amazon's Global Security Operations Center obtained by Motherboard reveal that all the division's team members around the world receive updates on labor organizing activities at warehouses that include the exact date, time, location, the source who reported the action, the number of participants at an event (and in some cases a turnout rate of those expected to participate in a labor action), and a description of what happened, such as a "strike" or "the distribution of leaflets." Other documents reveal that Amazon intelligence analysts keep close tabs on how many warehouse workers attend union meetings; specific worker dissatisfactions with warehouse conditions, such as excessive workloads; and cases of warehouse-worker theft, from a bottle of tequila to $15,000 worth of smart watches.
Monitoring = data
All of these surveillance and monitoring tools extract and create data and data profiles (aka inferences). As reported in previous blog posts, we cannot escape this data extraction. It is often hidden from us, and it offers instant feedback to the person/organisation doing the monitoring. The profiles are used for all sorts of probability analyses aimed at predicting our behaviour or at measuring efficiency and productivity. Whatever reason a company might have for doing all of these calculations, it doesn't remove the fact that they are turning labour into sets of data points, into calculations. Ultimately into objects void of personality, fate, emotions and chance.
It's got to stop
This cannot be accepted. It has to stop. Let's prove Hegel wrong when he famously said: "The only thing we learn from history is that we learn nothing from history." The ILO was born out of the realisation of the importance of social justice for world peace. There is nothing just about removing workers' autonomy to form and shape their careers and life free from the manipulations of opaque algorithms. We must not accept that a worker never sees a job announcement because a string of probability algorithms held by private companies has deemed that worker unsuitable for a job. If we accept this, we accept our objectification.
And we hand the world's tech companies the ultimate control over our lives, our democracies and our fate. We must demand that our politicians read the writing on the wall and take immediate action: markets in human futures must be banned.
- AI as a Tool for Worker Empowerment?
Streamed live on November 11, 2020, this conversation on the futures of work and workers is a follow-up to the AI & Equality Initiative's first webinar on AI and the future of work. Carnegie-Uehiro Fellow Wendell Wallach and Dr. Christina J. Colclough, founder of The Why Not Lab, build on that discussion with a conversation about the future of the worker. Dr. Colclough and The Why Not Lab are leading the way in advising governments and industries that will be changed by AI on how best to use that technology to empower workers, and in advocating for progressive strategies, approaches, and policies that do so.
- Towards Workers' Data Collectives
In this essay written for The Just Net Coalition and IT for Change's Digital New Deal essay series, I roll out my vision for the establishment of workers' collective data rights and ultimately for workers' data collectives: a means through which to empower workers and balance out the power asymmetry so prevalent in today's digital capitalism. I caution that fixing data and privacy rights is not an end in itself. We will need to draw a new map for the digital economy and society. We will need to demand from our politicians that they think big – constructively. The current exploitation by Big Tech is not a fad. It won't go away unless forced to by law. The vision outlined in this essay is neither utopian nor unattainable. But it will require responsible and dedicated actions on our side. Now. Excerpts below, full online essay here, download link below. The commodification of workers as a consequence of increased digital monitoring and surveillance is well underway. Through advanced predictive analytics, work and workers across the world are becoming datafied to the detriment of fundamental, human, and workers' rights. This essay argues that trade unions must expand their services to include collective control over workers' and work data through the formation of what I term Workers' Data Collectives. However, to do so, unions urgently need to address regulatory gaps and negotiate for much improved workers' data rights in companies and organizations. Without these two goals for the collectivization of data and an alternative digital ethos backed by new regulatory institutions, I argue, union and worker power will be significantly diminished, leading to irreparable power asymmetries in the world of work. ... ... We have established that workers' data is gathered and generated by companies, and that these data can be used in corporate decision-making, and transferred, sold, or used by third parties. We have also discussed that these data can directly influence your work and career prospects, and affect workers like you. Yet, as a worker, you have few, if any, rights in relation to these data and how they are used. The power asymmetry is thus growing between you and the companies which seem to know or infer information about you that can directly affect your life. For workers to maintain any control over their working lives, this power divide needs to be bridged. But we need to go further and ask: what if workers themselves controlled workplace data, drew insights from them, and used them to campaign for better working conditions, inclusive and diverse labor markets, fundamental rights, and new laws? .... ....
The benefits of collectivizing data
In the above, we have established a two-step process towards empowering workers across the world in the digital economy. We need stronger workers' rights to data and sound structures that will allow us to collectivize that data. To realize these benefits, behavioral, legal, and technical changes will need to be made. We will need to overcome our own lethargy, form new habits, establish new laws and new authorities at the national and global level. We will need new governance structures, technological solutions for secure data portability, and conscious choices about which collectives we will entrust with our data. These are daunting requirements. So what are the benefits? To begin with, this will allow us to create an alternative digital economy where data is regarded as an infrastructure similar to roads, railway lines, water supplies, and energy.
We will vastly reduce Big Tech's control over our minds, emotions, actions – past and future. We might well succeed in actualizing Shoshana Zuboff's demand that human futures markets be made illegal. We will ensure that information that is ours becomes responsibly useful to us. Trade unions across the world will get an additional and timely purpose, and we could expect greater mobilization towards this. We will undo the colonizing effects of the current e-commerce discussions and the skewed digital hegemonies and support, not hinder, the development of empowering digital transformations. On a more practical level, we will pool resources such that we actually have access to persons with the skills and knowledge to protect our data on the one hand, and analyze it to our benefit, on the other. Digital storytelling and visualizations are a powerful means to campaign for change. At the MIT Media Lab, Dan Callaci analyzed and compared data from a WeClock trial in New York to show how often, on the same day, workers were within six feet of one another (see figure below). Used in the context of the pandemic, this could show the relative risk for workers at the workplace. The benefits do not stop there. The Workers' Data Collective, like Driver's Seat, could be used to test and challenge corporate algorithms. It will empower us as individuals and communities if we know who has our data and for what purpose(s). The data collective could democratize the digital economy and empower workers to form and shape the world of work, advocate for regulatory change, and find remedies for persistent injustices. This will allow us to stop being "users" of digital technology, steered, controlled, and manipulated by algorithms and, instead, reclaim our humanity. This includes, not least, our human rights, our freedom of association, assembly, expression, thought, belief, and opinion. Many data protection regulations across the world, even those aimed exclusively at consumers, are weak. We must fight for a digital ethos that is responsible and puts our rights above profit-seeking surveillance tools and predictive analytics. In the world of work, unions must be the guardians of this alternative ethos, and themselves become stewards of good data governance. Here, an ILO Convention advocating for workers' data rights will not only be an act of solidarity with workers in weaker institutional environments, but also a necessary step to prevent digital colonialism. By negotiating the data life cycle, unions and workers will become much more familiar with, and insightful about, the potentials and challenges as well as the power of digital tools. For our ultimate aim of creating worker-owned and run Data Collectives, unions need to embrace this learning. Existing power asymmetries will only widen if workers and their unions do not build capacity in the fields of data, algorithmic systems, and their governance. Unions must work together smartly to build capacity. Furthermore, by extending these rights across all parts of the value or supply chain, all workers and countries will be able to develop their own digital transformations without a priori being stripped of the ability to localize their data due to trade agreements. ... ...
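As a rough illustration of the kind of analysis described above (and not the MIT Media Lab code itself), the sketch below counts how often two workers' self-tracked location pings at the same time of day fall within roughly six feet of one another. The ping format and field names are assumptions made for the example.

```python
# Illustrative proximity analysis on hypothetical self-tracked location pings.
from math import radians, sin, cos, asin, sqrt

def metres_apart(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def close_contacts(pings_a, pings_b, threshold_m=1.83):
    """Count same-timestamp ping pairs where two workers were within ~six feet."""
    b_by_time = {t: (lat, lon) for t, lat, lon in pings_b}
    return sum(
        1
        for t, lat, lon in pings_a
        if t in b_by_time and metres_apart(lat, lon, *b_by_time[t]) <= threshold_m
    )

# Hypothetical pings: (timestamp, latitude, longitude)
worker_a = [("09:00", 40.7128, -74.0060), ("09:05", 40.7129, -74.0061)]
worker_b = [("09:00", 40.7128, -74.0060), ("09:05", 40.7300, -74.0200)]
print(close_contacts(worker_a, worker_b))  # -> 1 close contact at 09:00
```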
Download full essay here
- It's Not Just About You
"I've done nothing wrong, so who cares if they take my data?" If you can hear yourself think this, please keep reading. The thing is: this is not just about you. Your data says a lot about you, yes. But it can have a huge impact on the work and life opportunities of people similar to you, or the absolute opposite. We are in this together, and it's time to fight back! I have been fortunate to address many many different workers across the world on all this about digitalisation and the future of work. When the discussions turn to the issue of worker and citizen monitoring and surveillance, many recognise that this is happening. As ctizens, when we use social media, our credit cards, GPS, search for things on the internet, or play music or watch movies via streaming services, we are essentially feeding companies with massive amounts of data. When during the day do you turn on the streaming services, where you go, where you don't go. What you typically do, and what you don't. The moment you have your smartphone in your hand or tucked away in your pocket, you are being leached for even more data. Data Extraction at Work As Covid-19 rips through our societies, the market for employee monitoring and surveillance software has boomed. From surveilling which websites you visit during working hours (maybe even beyond), to keyboard clicks per minute, to who you connect with via video software, to your GPS location...the list seems endless. All these surveillance tools are sold to companies with the promise to 'increase productivity and efficiency' (I have written and spoken a lot about what rights workers should have, but yet do not have, with regards to this data f.x. article here and speech here). The present focus is about how all of this data extraction can influence your life and career opportunities, but importantly also that of people similar to you, or very dissimilar to you. Sounds cryptic? Keep reading... People Analytics All of this data that gets extracted from you as you work, gets analysed by more or less elaborate digital/data systems. Oftentimes, companies hire 'People Analytics' (PA) experts - a new kind of specialist function aimed according to Heuvel & Bondarouk (2016), at "..the systematic identification and quantification of the people drivers of business outcomes."** What People Analytics is aimed at doing is: Measuring your activities (performance) against a standard, or norm, or statistical average Finding correlations between your actions (and non actions) and productivity/efficiency outcomes Predictive analysis - for example, based on prior data correlations, they will estimate what your next move or next behaviour most likely will be. Will you since you are 31 years old and married, soon have a child? Will customers "trust you" given your postcode (an indication of your social standing), your predicted accent and your gender? Or they do simple analytics - effective work time relative to age, gender, education, skills.. In many ways, People Analytics at work is the cause of much of the wrongdoings with your data. As you can hear in my speech below given to a large number of People Analytics experts, we should regard them as the gatekeepers of the ethical or moral use of data. They should be asking: is this data inference morally defensible? Should we be measuring the link between ethnicity, postcode and productivity? 
They are the ones who should be pushing for strong governance policies over the data, and even stronger redlines as to who can access and even buy these datasets. People Analytics experts do not hold the end responsibility. Executive management does. They are the ones who are asking the PAs to do what they are doing. They are also the ones who are buying in the surveillance tools. The thing is this: "Very, very few companies I have spoken to actually have governance mechanisms in place aimed at safeguarding your privacy rights, your human rights, as well as the privacy and human rights of all of those affected by the inferences drawn on you."
Ban markets in human futures
Shoshana Zuboff, the author of The Age of Surveillance Capitalism, is calling for the banning of markets in human futures. We should echo her. These markets are turning everything we do, and everything we don't do, into objects that can be fed into behaviour analysis and predictive analytics. These analyses shape and form your life and work opportunities. And they shape the opportunities of people you most likely will never meet. Just think about your life as it has unfolded. The many chance moments, the coming together of unrelated things that opened the door for you to walk through. The ups and downs. Now consider that some of the opportunities presented to you might never have happened had you been on the receiving end of an algorithmic inference. It might deem you unfit for a certain job, or ineligible for a mortgage, or your child unsuitable for university because of some statistical inference you will never know about. We must demand that markets in human futures are banned. Yes, streaming services and their recommendations are handy, but they are also manipulative. These systems are manipulative. For you. And for those who will be affected by your actions and non-actions. In Tim Wu's words, we must stop and reflect upon whether we have fallen for the Tyranny of Convenience, the shortsightedness of the here and now, at the expense of all forms of morality and human decency. Now, next time you press like on social media, and don't like another post, realise that those actions are affecting you and others. Realise that your accent, your postcode, your skills, education, gender and age are not irrelevant once you have a job. You are becoming a commodity - your personal life and your professional behaviour too. And as long as we do not stand together to emancipate ourselves from the manipulations we are subject to, we as people, with all of our beauties, ups and downs, irrationalities and dreams, will soon become irrelevant. ** https://ris.utwente.nl/ws/portalfiles/portal/13277560/Van+den+Heuvel+Bondarouk+2016+HRIC+Sidney+-+Metis.pdf
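For readers who want to see how such an inference is mechanically produced, here is a deliberately crude, purely illustrative sketch of the kind of predictive scoring described under People Analytics above. The features and weights are invented; the point is only that an opaque score built from personal attributes can quietly shape the opportunities offered to a person, and to everyone who shares those attributes.

```python
# Illustrative only: a hand-made "suitability" score in logistic form.
# Weights and features are invented; no real system or dataset is implied.
from math import exp

def suitability_score(age: int, postcode_risk: float, gap_years: int) -> float:
    """Return a 0..1 'probability' produced from arbitrary, opaque weights."""
    z = 1.5 - 0.03 * age - 2.0 * postcode_risk - 0.4 * gap_years
    return 1 / (1 + exp(-z))

# Two hypothetical workers with identical skills but different attributes.
print(round(suitability_score(age=31, postcode_risk=0.1, gap_years=0), 2))  # higher score
print(round(suitability_score(age=52, postcode_risk=0.8, gap_years=2), 2))  # lower score
# The lower score may silently decide who never even sees the job advert.
```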
- Putting Practice to Principle: Why AI at Work Needs Co-Governing
Speech given for PAFOW on October 5, 2020, to hundreds of experts on People Analytics. A deep dive into why we must avoid the (re)commodification of workers and push for a much more responsible approach to data and human rights. Topics covered: co-governance of AI, data rights, bias & discrimination
- Future Digital Labour Markets
Honoured to have been asked to write an article for the Danish Insurance Sector Union's member magazine, featured in a special section on the Future of Jobs. The article in Danish (available here) discusses how all of our actions - and importantly all of our non-actions - are extracted as data, churned and analysed. These inferences (profiles) are then sold to whoever wishes to target us with particular goods or services. But that's not all: the inferences and the complex probability analyses can affect our life and career chances too.
How should unions respond?
Unions, in their quest to ensure diverse and inclusive labour markets, must build capacity and create the institutions and structures so they can negotiate the Data Life Cycle at Work and be strong partners in the co-governance of algorithmic systems. We risk a situation where you will not see a job advertisement because an algorithm has determined that you are not suitable for the job. Or where a person like you - from the same postcode, with the same health characteristics, socio-economic background and, say, education - won't get promoted because it turned out you were not much of a people person.
It's not just about you
What we need to realise is that, as bad as it is that these inferences can influence our own lives, they can also influence persons similar to us - or the opposite, those who are least similar to us. If you are a great worker, productive, achieving above targets, will a company using an automated hiring system ever dare - or even get the chance - to hire someone your opposite? We need to ask these tough questions. Unions need a response - urgently. As the guardians of decent work, or what I term Rewarding Work, they are the only ones who can fight for the missing rights and the lacking co-governance! Read more on workers' data rights here And on a model for co-governance of algorithmic systems here
- Re.war.ding Work
Service Union United PAM hosted a workshop at the Imagine Untitled festival in Helsinki. Under the heading of the Future of Rewarding Work, PAM has committed to a 10-year union revitalising journey. As co-founders of the Imagine Untitled festival, PAM is taking serious strides to reimagine the role of trade unions in a world that rewards the collective good. I am honoured and proud to have offered my ideas and support along the way! At their maxed-out online workshop on September 17, participants were asked to role play in groups and reflect on what rewarding work would be for each of their characters.
The Future of Unions
In the second breakout session, the groups were asked to discuss the role of unions in ensuring a world of Rewarding Work for all. What should unions be and do that they aren't and don't today?
Next steps
PAM will over the next 10 years unpack, build and renew to meet the dreams and aspirations of workers. They took an important first step at the Untitled Festival - a welcome opportunity to throw all balls in the air and imagine a different, more sustainable future. Hear an extract of the audio that kicked the workshop off above. Listen to the full audio introduction here: See transcript of audio here: It has been a pleasure working with the team at PAM. I am so thrilled by their courage to face the future with an open mind.