

  • Turning the Tides

    In this October 2021 speech for the Nordic Financial Unions, I lay out the harms and impacts that workers are suffering due to algorithmic systems, and what unions could, and maybe should, do to turn the tides. In the speech I make a number of recommendations; here are a few:

    - Beware of "over-trust"
    - Transparency requirements: all workers must have the right to know what systems employers use, or plan to use. Use GDPR art. 35 (DPIAs)
    - Dialogue and consultation: no impact assessment nor evaluation can be complete without the joint voices of employers and workers
    - Collective bargaining: must include the redlines, limitations and agreed purposes of AI systems and the underlying data/inferences
    - Establish clear employer responsibility: including mitigation, liability and redress

    See the video for all of the recommendations!

  • Your Digital Union

    Trade Unions are facing a number of challenges caused by the increasing digitalisation of work and workers. Some of these challenges are political and strategic in nature. Others relate to unions' internal operations. In this article, written for the Danish Insurance Union, Forsikringsforbundet, and available in Danish and English, I lay out what the future digitalised union looks like, what it does and why. Download the article in Danish, or read the original English version below.

    Your Digital Union

    At an event held by the International Labour Organisation just before the pandemic, a prominent speaker said that trade unions have no future. Their heyday has passed, he claimed, and workers will no longer need, nor want, to collectivise. Let's take up that scepticism, jump 10 years into the future and ask: what is the future of trade unions? What services will they need to offer? And what will members want and need?

    Today's digital challenges

    The Why Not Lab firmly believes that trade unions have an important role to play in protecting workers' rights in the digital age. However, there is a sense of urgency around us: unions must very soon commit to reshaping digitalisation, to negotiating much stronger workers' rights in relation to the massive influx of digital technologies into workplaces, and to pushing for a seat at the table in the governance of these technologies. Unions now, as in the future, will need to mitigate the harms that these technologies can and are inflicting on workers. Workers across the world are reporting how they are being negatively impacted by workplace digital tools and systems.
    These harms include:

    - Work intensification: longer working time and increases in the pace of work
    - Discrimination/bias in who gets an opportunity and who is denied one, with the risk of moving towards a narrow, exclusive labour market
    - Mental and physical health pressures
    - Deskilling and job loss, with precarious work forms on the rise
    - Lower wages, economic insecurity, less mobility
    - Suppression of organising
    - Loss of autonomy and dignity
    - Loss of privacy

    Classic challenges

    A closer look at these harms shows that they are really core union issues. The difference is the means through which these harms are inflicted. Rather than coming directly from incompetent or otherwise bad employers, they are coming from employers who, in the name of efficiency and/or productivity, are deploying new digital technologies. Many workplace technologies and tools, such as productivity and efficiency monitoring systems, automated scheduling, automated hiring or firing tools, worker location tracking and sensors, can be grouped under the heading of Algorithmic Management. Furthermore, and to make things more complicated, many of these digital technologies are owned or developed by third-party companies, typically from other countries than where they are deployed. Oftentimes they are not adapted to the culture, norms and traditions of the deploying country, nor its labour market, and certainly not to collective bargaining traditions.

    To safeguard workers' rights, trade unions must hold management accountable for the systems they introduce in the workplace. This is a pertinent issue. In much of our work it has become evident that the complexity of these technologies has caused what we call 'managerial fuzz'. Who amongst management is actually in control of these digital tools, how they operate, and who has the duty to remedy harms? Who understands their inner workings and who is responsible for governing them?
    In this regard, unions today, as in the future, must ask management some important questions, including:

    - Who holds the data about the workers and does the analyses? Under which jurisdiction are they?
    - How are workers' data rights being respected?
    - Can the vendor/developer repurpose the data and sell it?
    - Does the deploying company (the employer) have the right to change the algorithm and to mitigate the harms? If not, what should the take-down procedure be?
    - What are the workers' rights to agree to, block or amend the data extraction and the algorithmic inferences, and to define the purposes of, and limitations to, the use of the tool and its insights?

    Specifically, trade unions and shop stewards need to build a comprehensive set of digital competences in order to successfully assess and negotiate on these issues. Let's now skip ten years into the future and look at how a union in 2030 handles these matters.

    Being digital - strategy and policies

    Here in 2030, there is still a great focus on working time, the mental and physical work environment, competencies and health and safety. But our union is also working to protect and develop workers' digital rights - for example, the right to rest, collective data rights, the right to be free from algorithmic manipulation, privacy rights and the right to control their digital identities. The union has developed a set of demands to ensure these rights, which it actively and successfully includes in collective agreements and legislation. These demands oblige companies to be transparent about the digital systems they use, and for what purposes. Thanks to the trade union movement, employers are now obliged by law to negotiate with shop stewards on the use of and control over digital systems. Our trade union of 2030 has a firm seat at the digital negotiating table. To ensure that unions can be as strong as possible in digital negotiations, they train a specialised cohort of shop stewards: 'DigiReps'.
    The DigiReps continuously oversee the use of digital technologies in workplaces. They hold management responsible for how technologies are used and for what purposes, and safeguard workers' rights by demanding clarity on what digital tools can and cannot be used for. The DigiReps are the key resources for negotiating digital clauses into collective agreements. The possibility of being trained as a DigiRep has, crucially, captured the interest of young trade union members. The union is growing!

    Digital tools

    In 2030, unions are successfully tapping into the potential of digital technologies and using responsible, privacy-preserving tools to gather information and insights. By getting data through responsible means and analysing it, unions are now issuing campaigns, information and stories of the realities of the digitalised world of work. Politicians and the public, employers and markets are hearing new versions of reality: that of the workers. From digitalised messages, to electronic billboards, to tailormade news stories, and on to successful collective bargaining by the DigiReps, unions are breaking the 'monopolisation of truth' previously driven exclusively by the companies who held the data. Their glossy, one-sided version of reality has been shattered.

    Solo self-employed are included

    The union is fully using its digital competencies to reach out to, and be relevant for, the growing number of contract and solo self-employed workers. By offering them a digital hub, tailor-made information and the means through which to participate in union democracy remotely, these workers feel heard and seen. They are no longer "ghost workers" hidden from the public eye and policies. Their working time, working conditions and their rights are widely known and collectively negotiated. Social policies have been changed to ensure no worker falls between the cracks with no rights and protection.
    The Physical Union

    More and more work takes place remotely, whether from home or from cooperative workspaces. To meet this reality, unions have created secure online meeting spaces, available to all members. But they have also built physical workspaces. Scattered across the country, these secure workspaces offer workers the opportunity to meet, to work, to hold meetings and to arrange events. Unions 2030 are far from purely digital: they have become the modern-day local town hall, breaking isolation by offering social spaces. Topically, unions have campaigned on the Right to Rest, a direct response to the harms of the 'always on' digital reality of previous times. A much more balanced relation between work life and private life has been re-established.

    Digital Union

    To do all of the above, our union 2030 has undergone significant internal changes. All staff have been trained so they are aware of the potentials and pitfalls of digital technologies. Unions have mainstreamed across the organisation how to critically tap into the use of responsible digital technologies in organising, campaigning, member services and policy advocacy. The union movement has moved away from using cloud systems owned by large multinational corporations and has instead built its own systems around the decentralised web. Here, unions nationally and internationally have built an ecosystem of protected servers that enable safe file storage, email systems and communication tools, ensuring no private company has access to their information. This in turn has curbed union busting by removing corporate knowledge of union actions and strategies. Unions 2030 have jointly developed digital tools and systems that have their members' privacy at heart. By avoiding third-party data snooping and protecting members' rights, the unions are showing the world that the convenience of digital technologies doesn't have to come at a price: your integrity and privacy.
    Data analysts who specialise in analysing workers' data in combination with other sustainably sourced data are widely available to the union movement. With these competencies at hand, unions can turn data into information and from there into new knowledge. This knowledge is used by the DigiReps, the union leaders and organisers to swiftly and persuasively campaign for the workers' benefit. Membership data is stored securely in tailormade Membership Relationship Management Systems. These systems are used to ensure that members get the information they want and need. The unions of 2030 have strict data governance policies in place with regard to what data is stored, for how long, who has access to it, and how it is secured. By building these democratically governed worker data collectives, unions can truly benefit from the collective insights in the shared data from members. Unions have established secure systems for sharing these insights with one another without sharing the individual datasets themselves.

    A relevant union

    Members can instantaneously seek answers to questions about their rights and collective agreements through the union 2030 service bot, which can also connect them swiftly to expert staff or a DigiRep. The union is present and accessible. Safe whistleblowing systems are in place that offer real-time channels for members to report on workplace issues. In 2030 there are fewer unions than today. Unions have merged so they can avoid individual and costly transformations and can scale their operations and policies. The systems mentioned above ensure that members receive sector- and occupation-specific services and information. The union is still tailormade to the members' individual needs.

    The future of trade unions

    The trade union movement certainly has a future. But it will take a lot of changes, both in terms of what the trade union movement does and how it does it.
    Workers in 2030 have broken the illusion their peers were sold: that digital technologies are emancipatory and equal for all. As the harms of former digital technologies were felt, and as the promised freedom and flexibility of individualised contracts never materialised, workers' resistance has grown. They have felt and realised that change requires cooperation and collectivisation. Our unions 2030 are operationally and strategically geared to meet this resistance. The labour market of 2030 is no longer naive about digital technologies. Rather, it regulates and frames them to put people and planet before corporate profit. Bridging the present day to our 2030 scenario will, though, require battles and changes that innovative and courageous union leaders must dare to take on. If not, the statement with which we opened this article, that unions have no future, will most likely come true.

  • Digitalisation: A Union Action Guide

    - For Public Services, Work and Workers

    Written for the global union Public Services International, this report sets out the issues public services unions face as public services and work become digitalised. The report provides a snapshot of the key digital developments and discussions within international organisations, political bodies and amongst leading experts that are relevant to the core political and thematic work of unions, particularly those with members in public services. While it was written primarily for the affiliated unions of Public Services International, its core learnings and strategies have relevance for the wider labour movement. Grouped under eleven different headings, the report offers a critical overview of topical priorities and selected literature. Throughout the report, the focus is on three key areas:

    - the direct effects of digital technologies on public service workers
    - how the public sector can, and should, govern data and algorithmic systems to ensure Quality Public Services
    - how workers' data rights and privacy rights must be improved through negotiating much stronger data rights

    Each section ends with a list of areas of exploration for unions. These recommendations seek to bridge the gaps and empower unions and workers, but also public services as a whole, as digitalisation is infused into public sector work. Read the full report on PSI's website:

  • Podcast: The Rise of Robo Bosses

    Feet up and have a listen to this "Reasons to be Cheerful" podcast by Ed Miliband and Geoff Lloyd on the "Rise of Robo-Bosses: reining in algorithmic management". Advances in technology are enabling new ways to monitor and manage people at work. How can we ensure workers don't lose out from the rise of 'algorithmic management'? Future of work expert Beth Gutelius tells us about a Californian law cracking down on issues in the warehouse industry. Then Anna Thomas from the Institute for the Future of Work and Mary Towers from the TUC talk us through the scale of the problem in the UK and what to do about it. It's well worth a listen!

    WeClock

    Mary Towers speaks about the potential for workers to collectivise their data, and for unions to develop apps to support them with that. In this part of the interview, Mary mentions that workers could collect self-tracking data: data on working hours, commute times, how long they have been on their feet during the day, how many breaks they have had, and lots more. All of that information could then be used to inform trade union campaigning for better terms and conditions at work. And that's WeClock! Available on the Google Play store and Apple's App Store... See the WeClock introduction video here:
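    To illustrate the kind of self-tracked data Mary describes, here is a minimal sketch of how pooled work logs could be summarised for campaigning. All field names and figures are hypothetical examples, not WeClock's actual data format:

```python
# Illustrative sketch: summarising self-tracked work logs of the kind
# mentioned in the podcast (hours worked, commute time, breaks).
# The field names and numbers below are invented for illustration.
from statistics import mean

logs = [
    {"date": "2021-11-01", "hours_worked": 9.5, "commute_min": 80, "breaks": 1},
    {"date": "2021-11-02", "hours_worked": 10.0, "commute_min": 75, "breaks": 0},
    {"date": "2021-11-03", "hours_worked": 8.0, "commute_min": 90, "breaks": 2},
]

CONTRACTED_HOURS = 7.5  # assumed contractual daily hours

avg_hours = mean(d["hours_worked"] for d in logs)
avg_commute = mean(d["commute_min"] for d in logs)
overtime_days = sum(1 for d in logs if d["hours_worked"] > CONTRACTED_HOURS)

print(f"Average hours worked per day: {avg_hours:.1f}")
print(f"Average commute: {avg_commute:.0f} minutes")
print(f"Days with unpaid overtime: {overtime_days} of {len(logs)}")
```

    Even a toy aggregation like this shows the point Mary makes: individually collected data only becomes a campaigning asset once it is pooled and summarised collectively.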

  • A call for AI Governance

    At the SNF Nostos conference 2021, we discussed how AI is reshaping economies and societies, but also how we connect, how we compete and how we cooperate. What does this mean for the future of work and the future society? Here is the video of the panel - hear us discuss why AI regulation with human rights at its core is urgently needed. Featuring Dr Christina J. Colclough, The Why Not Lab; Nicolas Economou, Chair, Science, Law, and Society Initiative, The Future Society; and Sinan Aral, Director, MIT Initiative on the Digital Economy (IDE). Moderated by Dimitris Bounias, Project Manager, Ideas Zone & Incubator, iMEdD. Full afternoon session video here. We discuss why AI trustworthiness is currently so weak; why the Universal Basic Income is not a good idea; how one of our largest challenges is to upskill politicians so they understand what they are trying to regulate; and why managers and workers should co-govern these systems, put human rights first and put algorithmic technologies to good use.

  • Innovations for Good Work

    Panel for the launch of the Royal Society of Arts' Good Work Guild on Innovations for Good Work, with Laetitia Vitaud, writer and speaker on the future of work; Thorben Albrecht, policy director, IG Metall; Nchimunya (Chipo) Hamukoma, research manager, Harambee Youth Employment Accelerator; Christel Laudrup Spliid, qualitative consultant, HK Lab; and Christina J. Colclough, The Why Not Lab.

    Innovations are emerging worldwide to address the challenges of a rapidly changing future of work. The pandemic is likely to accelerate the pace of technological change and automation globally. To secure a future where good work is available to all, we will need new approaches to skills, training and lifelong learning, to economic security, and to worker voice and power. To launch the RSA Good Work Guild, the panel of good work innovators shares the solutions they have pioneered to support and empower workers in the transition to the jobs of the future; the systemic challenges they have faced in taking new ideas to scale; and the opportunities for innovators, investors and institutional actors to come together to build and sustain system-wide good work innovation, and a global movement for change.

    Breaking the Monopolisation of Truth & WeClock

    Starting 35 minutes in, Christina J. Colclough used her time to build on the RSA's "Building a Field" for an ecosystem of worker innovators. She urged the RSA to include networks of trusted data analysts, legal expertise around good data stewardship, and data storytellers who understand the workers' struggle and the union cause. Christina introduced WeClock, the privacy-preserving self-tracking app she co-designed with Jonnie Penn, Nathan Freitas and Carrie Winfrey for the Young Workers' Lab at UNI Global Union. She urges unions to use WeClock to access work-related data and break the monopolisation of "truth" that the corporates control.
    Before moving on, the moderator asked Christina for her views on how funders should change some of their fundamental behaviours to get behind worker power. Christina replied: firstly, foundations need to research workers not just as objects, but as subjects. Foundations must dare to go into the workers' realities. Secondly, we need to build an innovation environment that embraces failure, connects organisations, reaches out to young workers and finds their narratives and languages. Failure must be embraced; we need to experiment. And thirdly, foundations should stop being so shy of the trade union movement. Many foundations will not finance the union movement directly. This has to stop. Foundations are buying a good conscience; they can't just pay lip service to this. Read more about the RSA's Good Work Guild here: See their report here. And see their funky Good Work Directory of good work innovations across Europe here:

  • #UnionTech

    Videos, materials and summaries from a 4-part course for FES, offered to you courtesy of FES, the presenters, the participants and moderator Christina Colclough from The Why Not Lab. In April and May 2021, FES, the Friedrich Ebert Stiftung, arranged a 4-part course on #UnionTech as part of their Unions In Transformation programme. The Why Not Lab moderated the four workshops, which additionally featured guests: developer Nathan Freitas of the Guardian Project; AI expert Dr Jonnie Penn, Cambridge University; and data specialist Dan Calacci from the MIT Media Lab. Here is what we got up to.

    Workshop 1 - Organisation-wide transformation

    The first workshop zoomed in on trade union transformation in the digital age by discussing if, and how, unions could draw inspiration from existing Data/Digital Maturity Frameworks (DMF). Ensuring an organisation-wide and embedded transformation will be key for trade unions as they reform their strategies, structures and processes to address the digitalisation of work. Drawing inspiration from the amazing DMF designed by DataKind UK and Data Orchard, we focussed on change at the following key levels:

    - Leadership
    - In-house or available skills
    - Organising and campaigning
    - Collective bargaining
    - Digital understanding

    See the recording of workshop 1 here and get the slides by clicking the image below.

    Workshop 2 - Horizon scanning

    Nathan Freitas, the founder of the Guardian Project, develops open-source, privacy-preserving mobile apps to empower disadvantaged groups. In this workshop, Nathan took us through emerging technologies and how we can safeguard our privacy, autonomy and identity. See Nathan's key recommendations in the slide deck below, and watch the recording of the whole workshop here. (Psssttt... it's well worth a replay, as Nathan fires away gems like there is no end to it.)

    Workshop 3 - "Now What?"

    Does a technological revolution lead to a social revolution, or the other way round?
    Dr Jonnie Penn, a historian of AI, kicked off this workshop with an affirmation to the unions in the room that a social revolution can shape a technological one. To shape and form the digitalised world to meet the needs of workers, citizens and planet, we need to organise! In this workshop, we covered a wide range of topics, starting from the understanding that the current digitalisation of work and workers is an "intelligence inequality" leading to the disempowerment of workers and unions. Dr Penn urged us to be critical of the normalisation of tech: if we use it, use it critically and guardedly. See the full recording here, Dr Jonnie Penn's slides below, and the summary slides here.

    Workshop 4 - Data Storytelling

    Wrapping up the course, Dan Calacci, PhD student at the MIT Media Lab, shared his knowledge of and experience with data storytelling. Dan showed how he had helped Shipt workers challenge the platform's claim that its new effort-based model was increasing pay for all workers. Using screenshots of payslips and a dose of machine learning magic, Dan could prove that pay had decreased for 41% of all Shipt workers. From there, Dan took us through Feigenbaum and Alamalhodaei's pyramid journey from data --> information --> knowledge --> wisdom. Using examples, Dan really showed how unions could use data storytelling to break the intelligence inequality and power asymmetries Dr Jonnie Penn discussed with us in Workshop 3. See the workshop recording here (well worth your time!) and Dan's slides below. All material offered to viewers courtesy of the presenters, participants and hosts FES.
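    The logic of the before/after pay comparison described above can be sketched in a few lines. This is not Dan's actual method or data; the worker IDs and pay figures below are invented purely for illustration:

```python
# Hypothetical sketch of a before/after pay comparison: given matched
# average weekly pay per worker before and after an algorithm change,
# compute the share of workers whose pay fell. Toy data, not Shipt's.
pay_before = {"w1": 320.0, "w2": 410.0, "w3": 275.0, "w4": 500.0, "w5": 360.0}
pay_after  = {"w1": 290.0, "w2": 430.0, "w3": 240.0, "w4": 505.0, "w5": 350.0}

# Workers whose average pay decreased after the change
decreased = [w for w in pay_before if pay_after[w] < pay_before[w]]
share = len(decreased) / len(pay_before)

print(f"{share:.0%} of workers saw their pay decrease")
```

    The real analysis is of course harder: pay records must first be extracted from payslip screenshots and matched per worker, which is where the machine learning Dan mentions comes in. But the headline statistic, the share of workers whose pay fell, is exactly this kind of simple collective comparison.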

  • Your Digital Selves' Rights

    In this article for the Danish Insurance Union, we zoom in on how the many digital identities formed on us as workers and citizens can harm career opportunities and the quest for diverse and inclusive labour markets if unions don't start pushing back now. Read the full Danish article here and via the images below. Read the original English version below.

    Your digital selves' rights

    What happens to all of the photos, Google Maps searches, social media posts, workplace productivity scores, evaluations and "profiles" you have made and been subjected to as you get older, and even die? This might sound like a really odd question, but think about it for a second. As we discussed in the article It's Not Just About You, all of your data actually has an impact on other people, not just you. Currently, once data has been gathered and used in algorithms or inferences (remember those often damning profiles that manipulate our life and career chances, and those of others too), it's out there. It gets replicated, shared, sold, rebundled with other information and sold again. Without a "data life length", it can live on forever, even when you are gone. This raises many questions:

    - Will we forever be judged against things we did when we were young?
    - Will all of our data profiles continue to affect the life chances of others, even when we are dead?
    - Will your work life opportunities as you grow older be limited by whether you had more sick days than average when you were in your 30s, or by whether you have been overweight relative to the norm since you were 40? Are you still investible for an employer?

    We need to ask these questions, as our digital selves (yes, we potentially have many selves), even those we have no idea have been created, have not necessarily been given a natural "out-of-date" or "out-of-life" stamp. When we think about it, all of this is really problematic. Your career chances as you grow older can be limited by how others like you, now long passed away, managed to perform.
    Were they slow - too slow? Were they less adaptable? Softer, and therefore more suitable for customer call centres than high-speed trading or data crunching? Age discrimination happens. On DR P1 on Monday April 26, a headhunter was interviewed. He reported that he had numerous times been asked by clients to screen applicants according to their gender and/or age. I have raised a number of warning flags that we need to take seriously in this datafied world of ours. In articles and blogs on the Why Not Lab I have offered some solutions, and especially highlighted the key role of trade unions and collective bargaining in turning the tides and making sure our digitalised work life is inclusive, diverse and respectful of our fundamental rights. The Danish Insurance Union has now asked me to suggest some concrete solutions and policies that unions could beneficially explore. I will do so in this article and two more to come. Now back to the rights of your digital selves. How can we avoid being subject to manipulations based on old data, or data from folks now long gone? How can we ensure that our lives don't continue to affect others when we ourselves have passed away?

    The GDPR gives us a helping hand

    The European General Data Protection Regulation is founded on seven guiding principles, described in GDPR Article 5. A key principle is that of data minimisation (art. 5(1)(c)): "Personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed ('data minimisation')". What Article 5 says is that data controllers may only collect the necessary data, and only the necessary data, for the given purpose and only the given purpose, and may store it for only the necessary amount of time. If respected, this will indeed help prevent our digital selves from continuing to affect others, and ourselves, for eternity. However, all is not that simple. Do we actually know what data processing is taking place?
    Although companies are obliged to tell you, do they? You have a right to be informed about:

    1. The collection of data
    2. How the company plans to use the data
    3. The reason why they are collecting the data
    4. Whether the purpose of collecting the data can be achieved without collecting it
    5. How long the data will be stored to fulfil the purpose
    6. Whether the data is periodically reviewed in relation to the above five points, and deleted if required

    In accordance with Article 16 of the GDPR, you also have the right to have the personal data collected on you corrected, and Article 17 additionally gives you the right to have the data erased if it is no longer necessary to fulfil the purpose for which it was collected. Article 17 is key to preventing data and data inferences from living on forever. Ensuring compliance with this article is therefore really important for union policies on the right to a long working life.

    What should the union do?

    This gives your union, which can be mandated to represent you (see Article 80 GDPR), many possibilities. It must ensure:

    - That the workers have been informed about all data collection (points 1-6 above)
    - That you are helped, through "data subject access requests", to exercise your right to correct the data held
    - That your employer provides a record of whether and how the company is compliant with the GDPR's seven principles, including the one on how long data can be stored
    - That companies are in compliance with Article 35 on data protection impact assessments (DPIAs). As the processing of personal data at work is a "high-risk" operation, companies are actually obliged to consult with the workers when writing the DPIA. I have knowledge of just two examples of this actually happening.

    Are we ok then? No. Whilst all of the above measures can really help protect our rights and prevent the eternal influence of out-of-date data and data profiles, we are faced with additional challenges. Firstly, the geographical scope of the GDPR is established in Article 3.
    Whilst this offers a broad protection of our rights, the world extends beyond the GDPR's boundaries. Algorithmic systems can be trained on data from other countries and regions, and therefore indirectly influence outcomes even when the GDPR is complied with. Secondly, the GDPR poorly defines our rights in relation to data profiles we are subject to but that have nothing directly to do with our own personal data. As mentioned in my article in Forsikring-1 2021, unions have an important role here. Let's imagine an automated hiring system that your company has bought from a company in the United States. Maybe this system has been trained on data and data inferences from segments of workers in the US. It is then instructed to sort applicants according to certain phrases, words, experiences and characteristics. Now, what if the algorithm has "learnt" that someone with a particular education, from a particular decade and of a particular gender is most likely not to stay with the company for long? If you are an applicant with similar characteristics, do you think you will make it to the interview? Most probably not. This is one of the reasons why unions must demand a seat at the table in governing these algorithmic systems. You need to be in a position to ask what data the algorithm is trained on, and what characteristics, words, phrases and inferences it is trained to judge as positive or negative. And you must ask: how will all of this influence the union goal of ensuring diverse and inclusive labour markets?

    In conclusion

    Unions simply must build capacity to conquer these important issues. Shop stewards should be trained so they can be the digital watchdogs in the workplaces, so you can enjoy a long working life free from data-driven manipulations that prevent you and others from fulfilling your potential.
    Lastly, given that digital tools really don't care about national boundaries, only about law, unions must cooperate internationally to push governments to regulate these systems globally, so that all workers, no matter where they are, can enjoy the same strong rights and protections.
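    To make the storage-limitation logic discussed in this article concrete, here is a minimal sketch of a periodic retention review. The record structure, purposes and retention periods are hypothetical examples for illustration, not a legal or compliance tool:

```python
# Illustrative sketch of a periodic retention review in the spirit of
# GDPR art. 5: each record carries a stated purpose and a retention
# period, and anything held past its retention date is flagged for
# erasure. All fields and retention periods are invented examples.
from datetime import date, timedelta

records = [
    {"id": 1, "purpose": "payroll",            "collected": date(2019, 3, 1),  "retain_days": 1825},
    {"id": 2, "purpose": "recruitment",        "collected": date(2020, 6, 1),  "retain_days": 180},
    {"id": 3, "purpose": "productivity_score", "collected": date(2021, 1, 15), "retain_days": 90},
]

def overdue_for_erasure(records, today):
    """Return ids of records held longer than their stated retention period."""
    return [r["id"] for r in records
            if today > r["collected"] + timedelta(days=r["retain_days"])]

print(overdue_for_erasure(records, date(2021, 5, 1)))
```

    A review like this is only as good as the purposes and retention periods declared up front, which is exactly why the article argues that unions and shop stewards should be involved in setting and auditing them.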

  • AI and the Labour Market

    - Intervention at a public hearing at the European Parliament, May 25, 2021, organised by the AIDA and EMPL committees.

    I was honoured to be invited to address MEPs from the European Parliament's AIDA and EMPL committees. Read my script below, and see the meeting documents here.

    Good morning and thank you, chair. Below I will suggest six key policy areas. But let me start by revisiting the past. In 1919, at the end of the First World War, world leaders signed the Treaty of Versailles. In it, they agreed that "labour should not be regarded merely as a commodity or article of commerce". This was reconfirmed in 1944, at the end of the Second World War, in the Declaration of Philadelphia, now art. 1: Labour is not a commodity. With the millions of data points extracted from workers on a daily basis, turning their actions and non-actions into mathematically defined "truths" or statistically calculated probabilities, we must ask: are we betraying history? Whilst digital systems can be effective, and can be productive, we must ask: EFFECTIVE FOR WHAT, PRODUCTIVE FOR WHAT? Efficiency and productivity do not necessarily mean "good", "fair" or even "legal". Our industrial relations systems are changing too, not least as a result of the many procured tasks and proprietary software. We are experiencing a changing balance of power in workplaces. As proprietary systems are introduced in workplaces, their logic, norms and instructions are muddling traditional labour-management relations. We must ask: who is really deciding what? Developers should know what the human rights, social, climate and/or economic impacts are or could be - but do they? Deploying managers should too. But do they? And can they unilaterally determine this? And this leads me to offer my six points. First: whilst the GDPR offers some strong rights to workers, there are also some profound weaknesses.
As pointed out by Professor Sandra Wachter and Brent Mittelstadt, workers (indeed citizens) only have access to inferences that are directly related to their own personal data. Yet the majority of inferences that influence our lives are not directly related to us. Think of your Netflix recommendations or your Facebook news feed, or think about why a worker is not called for an interview despite fulfilling all formal requirements. Maybe he or she, or you, has fallen victim to an opaque algorithm? We need to know! We need much stronger rights in this area. To prevent the unabated quantification of workers, workers' collective data rights need vast improvement. This relates to the need for much better regulation around data access and control. Workers must have the right to pool the data extracted on them and use it responsibly for the collective's benefit. Research on workers' data trusts or data collectives is urgently needed. Algorithmic systems deployed in workplaces cannot simply be governed unilaterally by management. We need to find models for the co-governance of these systems that respect industrial relations and recognise the important role of social dialogue. And now, if I may, allow me a few words about disruption and skills: many like to speak of the future of work as if it were essentially a debate about skills, and especially STEM. This is a dangerous reduction of a complex, multifaceted change to work, workers, the social contract and rights. STEM simply cannot stand alone without the humanities; the current debates around AI ethics prove the point. No system - be it biological, economic or human - can survive without sufficient diversity. The same goes for the labour market. We need workers with all sorts of skills and experiences, and a labour market that honours and respects the labour of workers, whether they are in low-valued or high-valued, and thus typically low- or high-paid, jobs. 
In my sixth and final point, I wish to stress that disruption must go hand in hand with obligations. I would urge politicians to look into what these should be. We must commit employers to invest in the competencies and career paths of the affected workers - up and down supply chains. With the proposals above, we can, through social dialogue, regulation and collective bargaining, ensure diverse and inclusive labour markets for generations to come. It is, to be frank, your responsibility to urgently turn the tides and prevent the irreversible commodification of work and workers. Thank you

  • Meaningful Inclusivity

    - in Governing the AI Revolution. Joining Beeban Kidron, film director, chair of the 5Rights Foundation and expert on children and AI; Renée Cummings of the School of Data Science at the University of Virginia, speaking on women and minorities; Helena Leurent, director-general of Consumers International, on consumers; and Patrick Lafayette, CEO of Twin Audio Network, on people with disabilities, this panel was all about how to include underrepresented groups in the governance of AI. Tasked with speaking about the inclusivity of workers in workplace AI governance, hear me argue why the current commodification of workers must be stopped, why dialogue needs to become fashionable again, and why any form of governance void of the workers' voice is not governance at all. Here is what I said (starting at 1:15:00): Transcript CHRISTINA COLCLOUGH: Thank you very much. It is always a little bit daunting being the last one because so many good and valid points have been made. I want to pick up on what Patrick said around the governance of these new technologies of AI. In workplaces this governance is in the majority of cases totally lacking when it comes to the inclusion of workers: their voices, their agreement to, first, the surveillance they are subject to, but also to how their data is used, how it is inferred, for what purposes, whether the data is sold, and so on. This is stunning to me. It is stunning that the majority of us - whether self-employed, in the informal economy, employed, or working on digital labor platforms - are workers, and yet this whole notion of co-governance of algorithmic systems is totally out of fashion. And here a little wink-wink with a smile to Renée: in the work you do with the C-suite, include the workers. As I said to the OECD ministers when they adopted their great AI principles, which I saw Joanna referring to in the chat: "This is great. Now you must ask, fair for whom?" 
What is fair for one group - in my case, workers - is not necessarily fair for another. How can we make those tradeoffs and those decisions explicit but also consensual, in the sense that we at times might have to have positive discrimination towards one group, and what is our preparedness for this? Then you can ask: what if we don't do this? The Why Not Lab - why not? What would happen if we don't do all of this? Then I am afraid that the current commodification of work and workers will continue to the extent that it is almost beyond repair, when the inferences about us predict our behavior, where we as humans become irrelevant, where we might be chatting in three years at a conference like this about how we defend the right to be human - for all of our fallacies and beauties and good sides and bad sides. This is what is at stake. Unions have always fought for diverse and inclusive labor markets, and I am very afraid - and I think Renée's work in criminology points in this direction - that we are heading toward judgment against a statistical norm that will exclude lots and lots of people and therefore harm the diversity and inclusion of our labor markets. My call here is very much: let's find a model for the co-governance of these systems. Let's put workers first. We have the principle in AI of people and planet first. But we cannot do that if we do not actually bring dialogue back into vogue. It is also very telling that if you look at the data protection regulations across the world, either workers are directly exempt from enjoying those protections or workers' data rights are very, very poorly defined. We have that in the CCPA. We have that in Thailand, in Australia. The draft GDPR even had stronger articles specifically on workers' data. My call here would be to bring dialogue back into vogue. We have to stop enemizing one another. 
We should definitely work on workers' collective data rights - moving away from the individual notion of rights enshrined in much of our law towards collective data rights. We need to balance out this power asymmetry, which is growing to a dangerous and, as I said, irreparable level, and then we must talk about the regulatory requirement for the co-governance of these systems - thereby not saying that workers should bear the responsibility; that must lie with the companies, the organizations deploying these systems. We need much stronger transparency requirements between the developers and the deployers of these technologies. We must avoid a situation where developers can hide behind intellectual property or trade secrets to avoid adjusting their algorithms, their training data, and so on. My last call is that we need our governments to up their game. This cannot work under differing national laws. We need the Global Partnership on AI (GPAI). We need the OECD. We need the United Nations to start working towards a global governance regime that also caters for value and supply chains and for the varying economic situations of each country, and we must stop what I call the "colonialization" of much of the developing world through this digital change. This is a macro thing. We need governments to regulate - we really need to get them to the table; I am on the board of GPAI and can say there is resistance to committing to any form of joint regulation - and we need companies to include their workers. Data protection impact assessments on workers' data must include the workers. Algorithmic systems deployed on workers for scheduling, hiring, or whatever must include the workers. Then we have to all stop enemizing one another, and we must also realize that most of the people listening to this are workers, and we should have a voice. Thank you. /end/ - the panel was hosted by the International Congress for the Governance of AI

  • Det handler ikke kun om dig..

    The latest article in the series on the datafication of our work and workplaces was published today, written for Forsikringsforbundet in Denmark. "It is about how our lives can be shaped and limited by a mass of data profiles we do not even know about, created by people we have probably never met. Conversely, profiles built on our own actions affect others. We need rights over these profiles. We need to know what they are and what they are used for, and be able to object to them, have them corrected, or even deleted. And we must be able to prohibit them from being sold to third parties." In the article I argue why a strong trade union response is required to protect workers' human rights, privacy, and their right to shape and create their lives free from algorithmic manipulation. Read the full article here:

  • Audits & Impact Assessments 2.0

    Across the world, a new wave of audits and/or impact assessments for digital technologies are popping up. None include workers in the process. This must change, argues this blog. Jonathan Guy from the Australian Education Union (AEU) argues in this powerful article why unions need to be party to audits concerning the need, use and impact of digital technologies. Jonathan provides a convincing case: As a response to the closure of schools due to the pandemic, the Australian Federal Government lowered the price of broadband. But research the AEU had just conducted showed that although a small proportion of all students in Australia (5%) do not have internet access on any device, public school students are overrepresented among those without access: 125,000 of them have no internet access on any device and they were 2.5 times more likely than private school students to have no internet access at home. Aboriginal and Torres Strait Islander public school students were four times as likely as non-Indigenous students to have no internet access at home—21% vs 5%. The study also revealed that almost a third of students living in very remote areas have no internet access. Students from low-income households, 80% of whom attend public schools, are 9 times more likely to lack internet access at home than students from high income households. Guy, rightfully, remarks: the government initiative is of little use to those without devices or existing broadband connections. Jonathan Guy calls for digital equity audits that should be carried out at a national level together with education unions in order to provide evidence for comprehensive action plans. 
They must also take into account the relationship between COVID-19-related remote learning and ongoing disadvantage due to: lack of digital inclusion, the potential long-term impact of home internet access on student achievement, family income, remoteness, mobility, family type, English proficiency, disability, housing, and Aboriginal and Torres Strait Islander status. The point here is that had the Australian government reached out to the very unions who know what is at stake in the education sector, its federal solution would have been far more nuanced. Back in Vogue The call from Jonathan Guy is significant, and unions from all sectors should echo it. However, whilst audits and impact assessments are back in vogue across the world, and many models are being created, none, simply none, include the workers and their unions. The Ada Lovelace Institute published a report in 2020, "Examining the Black Box", which includes an overview of approaches to assessing algorithmic systems. Again, none includes the workers or the unions. ForHumanity is an all-volunteer organisation that aims to bring together experts who are convinced that mitigating risk from the perspective of Ethics, Bias, Privacy, Trust, and Cybersecurity in autonomous systems will lead to a better world. It has created a taxonomy that distinguishes between 3rd-party independent audits, internal audits, assurance, consulting, and more. Again, nothing is mentioned about the role of employees or unions in participating in, or approving, audits. Unions must respond Whilst industry in particular is pushing for - albeit slightly improved - audits and impact assessments, we should also expect that it is doing so in the hope of avoiding more intrusive regulation. In my work in the OECD One AI expert group and elsewhere, I have read a growing number of company audits and impact assessments. None of them include the workers' voice. 
This is the case even within the EU, where companies are actually obliged to carry out data protection impact assessments (DPIAs) and to consult with the workers when digital technologies process workers' personal or personally identifiable information. So what's the problem? Whilst the majority of the audits I have seen do include articles on human rights, social, fairness and equity impacts, if these audits are conducted by management alone, it is - honestly - hard to take them seriously. We must, for example, ask: "fair for whom"? For an individual, for groups - and which groups? What is fair for one group might be very unfair for another. What compromises is the company making, and do the workers agree? If not, what are the remedies? The failure to meet the real challenges in Australia, as portrayed in Jonathan Guy's blog, clearly shows how the government response could have been far better had the AEU been included. By excluding the staff reps/shop stewards, and therefore the voice of the workers, companies risk approving potentially highly discriminatory algorithmic systems. Trade unions have traditionally been the guardians of inclusive and diverse labour markets. The staff reps/shop stewards are also those closest to the workers. They know the sentiments of their colleagues and the lived experiences of discrimination, privacy violations and exclusion. No workplace or labour market audit system or impact assessment is worth the paper it is written on if the voice of the workers is not an equal partner in its formation.

  • Gig Workers Fighting Back

    A new article in Wired by journalist Aarian Marshall, "Gig Workers Gather Their Own Data to Check the Algorithm's Math", features WeClock - our self-tracking app for workers. Read the full article here; excerpt below. Uber Eats delivery worker Armin Samii found that the company might have underpaid him by not considering the route he had to follow. Samii is a software engineer. So he created a Google Chrome extension, UberCheats, that helps workers spot pay discrepancies. The extension automatically extracts the start and end points of each trip and calculates the shortest travel distance between the two. If that distance doesn't match up with what Uber paid for, the extension marks it for closer examination. So far, only a few hundred workers have installed UberCheats on their browsers. Samii doesn't have access to how couriers are using the extension, but he says some have told him they've used it to find pay inconsistencies. Google briefly removed the extension last week when Uber flagged it as a trademark violation, but reversed its decision after Samii appealed. The digital tool joins others popping up to help freelancers wrest back control over work directed by opaque algorithms, with pay structures that might change at any time. The need has only grown during the pandemic, which has seen companies like DoorDash, Amazon, and Instacart hire more contractors to support spikes in demand for deliveries. The expansion of the gig economy might be here to stay: The US Bureau of Labor Statistics projects the “courier and messenger” sector could grow 13 to 30 percent more by 2029 than it would have without a pandemic. Globally, up to 55 million people work as gig workers, according to the research and advocacy group Fairwork. 
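The check described above - compare the shortest feasible route between a trip's endpoints with the distance the platform paid for, and flag large gaps - can be sketched roughly as follows. This is an illustration only, not UberCheats' actual code: the `Trip` fields and the 10% tolerance are assumptions, and the real extension obtains the shortest route from a mapping service rather than taking it as an input.

```python
from dataclasses import dataclass

@dataclass
class Trip:
    paid_miles: float      # distance the platform says it paid for
    shortest_miles: float  # shortest feasible route between the trip's endpoints

# Flag trips where the paid distance falls short of the shortest
# feasible route by more than this fraction (an assumed threshold).
TOLERANCE = 0.10

def flag_discrepancies(trips):
    """Return the trips whose paid distance undershoots the shortest
    feasible route by more than TOLERANCE."""
    flagged = []
    for t in trips:
        if t.shortest_miles > 0:
            shortfall = (t.shortest_miles - t.paid_miles) / t.shortest_miles
            if shortfall > TOLERANCE:
                flagged.append(t)
    return flagged

trips = [
    Trip(paid_miles=3.0, shortest_miles=3.1),  # within tolerance
    Trip(paid_miles=2.0, shortest_miles=4.0),  # underpaid by half
]
print(len(flag_discrepancies(trips)))  # prints 1
```

In practice the hard part is the data access the article describes: the trip endpoints and payments must first be scraped from the platform's own interface before any comparison like this can run.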
The projects stem from practical need. In the US, many gig workers keep track of their miles and expenses for tax purposes. But the projects also grow out of workers’ growing mistrust of the companies that pay their wages. “I knew about gig companies’ business decisions that meant they weren’t paying well,” says Samii. But he says he hadn’t thought the apps might “pay for less work than you actually did.” The tools are particularly helpful to gig workers because of their low wages, and because it can be hard for isolated workers to share or find information about how the job pays, says Katie Wells, a research fellow at Georgetown University who studies labor. “Things are changed and hidden behind an algorithm, which makes it harder to figure out what you’re earning and spending and whether you’re getting screwed,” Wells says. [...] Driver's Seat But some workers have been drawn to homegrown tools built by other gig workers - and the idea that they might themselves profit off the information that companies collect about them. Driver’s Seat Cooperative launched in 2019 to help workers collect and analyze their own data from ride-hail and delivery apps like Uber, Lyft, DoorDash, and Instacart. More than 600 gig workers in 40 cities have pooled their information through the cooperative, which helps them decide when and where to sign on to each app to make the most money, and how much they are making, after expenses. In turn, the company hopes to sell the data to transportation agencies interested in learning more about gig work, and pass on the profits to cooperative members. Only one city agency, in San Francisco, has paid for the data thus far, for a local mobility study that sent $45,700 to Driver’s Seat. [...] 
WeClock An open source project called WeClock, launched by the UNI Global Union, seeks to help workers collect and then visualize data on their wages and working conditions, tapping into smartphone sensors to determine how often they’re sitting, standing, or traveling, and how they feel when they’re on the job. Once it’s collected, workers control their own information and how it's used, says Christina Colclough, who helped build the app and now runs an organizing consultancy called the Why Not Lab. “We don’t want to further the surveillance that workers are already subjected to,” she says. “We don’t want to be Big Tech with a conscience.” Colclough hopes that, eventually, workers might use WeClock to show they’re working longer hours than agreed. For now, the app is being used by 15 freelance UK TV production workers, who say that production companies don’t always pay fair wages for all the work they do. The participants in the pilot use Apple Watches to track their movements while on set. “I love my job,” says one production sound crew worker who is using WeClock. (The worker asked not to be named, for fear of retaliation in a close-knit industry.) “But I hope this can help expose a little bit of the ridiculous hours we work.” Read the full article on Wired:

  • Ethics of AI in the Workplace - panel @ OECD

    Held on February 5, 2021 at the OECD international conference on AI in Work, Innovation, Productivity and Skills, this panel went straight to the core of AI at work. We discuss worker power, the need for regulation, and the absolute requirement to govern AI in workplaces in line with fundamental rights. See the recording here. Panel Description What are the main ethical issues raised by the use of AI in the workplace? What tools can be used to make sure that humans are put first – and human-centred values respected – when AI is used in the workplace? What safeguards should be considered to ensure transparency, explainability, safety and accountability? These are some of the questions the panellists were asked to discuss.

  • Upskilling for Shared Prosperity

    Report by World Economic Forum and PWC includes four Calls to Action including the Why Not Lab's "People Plan". Published January 2021, the report concludes that: To make large-scale upskilling across global economies and societies a reality, government, business, education, civil society and other leaders will need to work together in a more agile, resilient and inclusive manner. The call to action outlined in this section focuses on how to close the skills gap and prepare people for jobs now and in the future – starting primarily with secondary education. In many countries, this starts with access to basic health and nutrition, early education and connectivity, areas that the UN Sustainable Development Goals seek to address. For the purposes of this report, closing the skills gap relies on a series of levers that are all underpinned by public-private cooperation: providing lifelong learning and upskilling, proactive redeployment and re-employment, funding and the ability to anticipate what skills are needed in the job market. Actions should be focused on both the supply side – the upskilling of people – and the demand side – the jobs for those workers. The former requires a collaborative ecosystem across government, business and education. The demand side will require a new focus on the types of jobs that people do and the need to make these good jobs safe, fulfilling and inclusive. 
Based on the analysis and extensive expert consultations, this report identifies four key areas that demand new approaches to upskilling and urgent action by governments, businesses and other stakeholders. 4 key areas requiring urgent action All stakeholders: Build a strong and interconnected ecosystem committed to a comprehensive upskilling agenda and give people the opportunity to participate. Government: Adopt an agile approach to driving national upskilling initiatives, working with business, non-profits and the education sector. Business: Anchor upskilling and workforce investment as a core business principle and make time-bound pledges to act. Education providers: Embrace the future of work as a source of reinvention to normalize lifelong learning for all. The Why Not Lab fully supports these identified areas in need of novel approaches and urgent action. We advocate strongly for the People Plan, identified in the report under area 3. What's the 'People Plan'? The idea is that companies and public services should be obliged to invest in the upskilling or reskilling of their staff whenever they invest in new, disruptive technologies. Employers would thereby commit to the career development of workers - be it within or outside the boundaries of the company or service. Before the disruptive technologies are introduced, the company, together with the workers and the unions, should map skills and competency profiles. Further training should be co-determined. To ensure that all workers, regardless of their private responsibilities, have access to the training, it must take place within working time and be regarded as an element of their work duties - not an addition to them. Unions could beneficially support this shift in focus from job security to career security. The career can be within the same line of duty, or be totally different. 
The Finance Services Union in Denmark, which has adopted this strategy, supports its members, in cooperation with their employers, in retraining into new job roles in the financial sector or into totally different occupations. Focussing on career security rather than job security is a sustainable path to continuous employment - one that embraces flexibility and change without leaving workers behind. Competencies - the missing element in the report Whilst the Why Not Lab agrees with many of the report's recommendations, there is also one key omission: that of workers' competencies in addition to skills. You might be appreciated for being the caring one amongst your colleagues - the one who notices when your co-workers are feeling down or need a little more attention. Or you might be the change-maker: the one who dares to break boundaries and suggest sometimes wild new ways of doing things. Or you are the systems person - keeping things in working order, spotting anomalies and correcting processes so everyone and everything functions well. These human competencies are highly important yet seldom appraised. We certainly don't write on our LinkedIn profiles that "I am the heart of the organisation". Yet shouldn't we? Human competencies are the least automatable; they are what makes a team function well - or the opposite. They are also key expressions of each individual's personality. So here's to a greater focus on the role of competencies in the future-of-skills debates. We must, as workers, managers and colleagues alike, become far more comfortable with appraising our co-workers for their human competencies - and far more willing to do so. Demos Helsinki in Finland reportedly does this: every year before Christmas, each worker is asked to appraise a number of their colleagues for their human competencies. As the company closes over the holiday, every worker receives a Book of Appraisal. It's an amazing idea, and one we should all be inspired by. 
Read the report "Upskilling for Shared Prosperity" by World Economic Forum in cooperation with PWC here

  • Data Rights are Labour Rights

    Mozilla's Internet Health Report 2020 article "Data Rights are Labour Rights" mentions WeClock and Lighthouse - the two tools for workers and unions we co-developed. We are interviewed alongside James Farrar, Fair.Work, Dan Calacci and Keith Porcaro on how labour must unlock data power and fight for much stronger data rights. Data rights are labor rights, especially when it comes to the platforms of the gig economy. Leveraging data for the collective good is essential for the future of work and internet health. Mozilla, JANUARY 2021 Image from Mozilla 2020 When driving for Uber one night in London in 2015, James Farrar was assaulted by a passenger. What began as an uneventful Friday ended with aggression that spilled out of his car and onto the roadway. It was a jarring event, but Farrar assumed that Uber would quickly identify the aggressors and report them to the police. Instead, there was silence for weeks, and he saw it as a sign of disrespect for him as a driver for the company. It was among the first of many instances where Farrar felt Uber held data that intimately concerned him, although he could not access it. It caused him to pore over his contract, in which he noticed the emphasis placed on drivers being self-employed as opposed to employees. Farrar bristles at this: “If I’m my own boss, running my own business, and you’re just my agent — how come I can’t know who my own customer is?” He describes the experience as the catalyst that got him engaged in campaigning for labor rights for app drivers. Together with another former driver, Yaseen Aslam, he later founded the App Drivers & Couriers Union (ADCU). After six years of legal battles, Farrar and Aslam currently await a landmark ruling by the United Kingdom’s Supreme Court on whether app drivers are in fact employees, and therefore entitled to rights like minimum wage and holiday pay. 
GIG WORK IS EVERYWHERE An estimated 50 million gig workers worldwide toil within ecosystems created by online platforms such as Uber, Ola Cabs, iFood, Grab, Helpling, and dozens of others. It is a global phenomenon that contributes to the commodification of labor in the context of limited data rights for workers worldwide. And so, expanding data rights and protections would have ramifications for the future of work as well as for internet health. “A majority of gig workers are in the Global South,” says Funda Ustek-Spilda, a researcher and project manager at Fairwork, an international research community studying the global platform economy. She notes that the risks associated with gig work disproportionately affect systematically vulnerable communities everywhere. In Europe and North America, for instance, gig workers are more likely to be people of color. Powered by individual and aggregated data, the automated systems employed by platforms play a huge role in determining who is offered work at what price. At the same time, it is a business model that tends to rely on an oversupply of labor — a convenient system for consumers who want cheap, immediate service, but often leaves workers idling, underpaid, and at the mercy of opaque algorithms. Workers, for the most part, have no way of knowing exactly how platforms determine prices, nor can they opt out of surveillance or dispute negative performance claims. DIGITAL RIGHTS ARE LABOR RIGHTS In his own legal exchanges with Uber, Farrar discovered several details about the data they collected about him and what they inferred about him as a driver. For instance, he learned that Uber maintained a secret profile that included electronic performance tags such as ‘missed eta’ or ‘negative attitude’ — tags Farrar attained for refusing work he believed would be unprofitable or for requesting reimbursement of an airport parking fee. Farrar was never informed of this nor does he know how the tags were processed. 
But he suspects performance factors play into work allocation systems. “Drivers are led to believe that they are working in a completely open market, but if work is being throttled for some people due to performance factors, they deserve to know and have a right to appeal,” he says. For individuals and collectives, gaining access to more personal data has the potential to reveal the inner workings of secretive processes. It demonstrates just how tightly bound digital rights now are to labor rights. In response, a gradually emerging tenet of labor organizing is centered on unlocking power from data. Christina J. Colclough of The Why Not Lab is an advocate for global labor rights who has dedicated years to highlighting why labor unions must see data rights and governance as an urgent priority. She sees digitization and worker surveillance increasing in numerous sectors, beyond the gig economy, and in ways that have only accelerated since the COVID-19 pandemic. “The tech world’s super surveillance is exploding, and the power asymmetries in the labor market are growing,” says Colclough. “Honestly, I would say that this is a situation where unless organized labor begins to respond, it will soon be too late. But the majority of unions are not engaging on this yet,” she says. The picture is complicated by the fact that a lot of digital technology in the workplace is proprietary and developed by third-parties. The management of a company deploying such technologies may have limited rights to alter them, let alone know how to govern them. Colclough thinks unions should be thinking of ways to co-govern algorithmic systems and to negotiate for much stronger collective data rights for workers across what she calls the “data life cycle”. At every stage, from collection to analysis to storage, and potential transfer to third parties, there are rights to be negotiated, she says. 
DATA UNDER LOCK AND KEY Requesting, obtaining and organizing data is more difficult in practice than in theory. Gig work platforms have few incentives to share data they see as central to their business. Worse, when it comes to estimating things like hourly wages, companies have an interest in inflating numbers and obfuscating independent research. “The issue, philosophically, is that these apps collect all of the information that we would ever need to know, but the only way to understand it from a research or worker-organizing perspective is still to gather the information yourself,” says Dan Calacci, a PhD student at the Human Dynamics group at the MIT Media Lab. Calacci has contributed to a number of innovative technical projects to enable workers to collect useful data themselves. One is WeClock, an open source app created by partners including the UNI Global Union’s Young Workers’ Lab (led by Colclough) together with Guardian Project and OkThanks. The app enables workers to “quantify” their labor to document their own work hours, travel distances, and pay rates, for collective action or evidence of productivity. The app keeps all data local and gives workers control over who it is shared with. Another app, created by Calacci with a labor activism group, is an SMS bot called the Shopper Transparency Calculator. When Shipt, a large on-demand shopping app in the United States, changed its pay structure from a flat rate to a black-box algorithmic calculation, its “shoppers” could not estimate how it would affect wages. In October 2020, Calacci published a study based on thousands of screenshots from 213 workers who used the app, showing that 40% of shoppers had their pay cut. As they protested, the results of the study were disputed by Shipt without further clarification. 
DATA FUTURES IN FLUX
There are technical challenges in enabling workers across many different gig platform apps (often used simultaneously, on different mobile devices) to better understand how their wages are calculated, but Calacci says one of the biggest challenges is to develop models for data governance that are easily applicable in practice and appropriately sensitive to the risks and responsibilities of collecting sensitive personal data. “Projects can say that they are concerned with privacy but then still go on to collect as much data as possible, even though it might not be in the best interests of workers,” he says. In 2020, research by Mozilla’s Data Futures Lab concluded that there generally still isn’t clarity around how new ideas for data governance (like data trusts, data commons, data collaboratives, and more) will work in practice. Nor is it clear how it could be made easier for grassroots initiatives in different jurisdictions to organize and manage data responsibly. To complicate matters, terms are often used interchangeably, with different definitions. To shed light on the questions that unions need to ask about data, Prospect in the United Kingdom launched a quiz called Lighthouse, developed by Keith Porcaro. It is designed to help unions think about how to use and handle data (or not), with checklist items like “Each of our digital assets have someone responsible for managing and safeguarding them”. Colclough says most unions do not have data analysts on staff, but she sees new organizations innovating on “collectiveness” in data governance. Driver’s Seat Cooperative is one such organization. It enables app drivers in the U.S. to pool data about their movements and earnings. “We’re ensuring that a broader set of stakeholders have access to the data,” says Hays Witt, co-founder and CEO of the cooperative, which is owned and governed by its members.
“And we use this to give drivers insights into where and when the best place to work is, and which platforms pay better.” At the same time, Driver’s Seat also markets mobility insights to city planners and transportation agencies, sharing dividends of the earnings with drivers.
FIGHTS FOR FAIRNESS
Since the pandemic, collective actions like strikes and unionization drives have grown more frequent. In the spring and summer of 2020, gig workers in Brazil protested by the thousands, crowding into downtown São Paulo and demanding better pay, working conditions, and health safety measures from platforms like Uber and iFood. “They are making us work weekends, every day, or we face the risk of getting blocked,” one iFood worker told Al Jazeera in July 2020. Similar protests spread across the continent, springing up in Mexico, Chile, and elsewhere, culminating in a movement dubbed “Breque dos Apps”. The strikes and protests brought media attention to gig workers and galvanized support for a bill in the Brazilian parliament in July 2020 that would guarantee an hourly rate and paid vacation. The central question of how internet and app-based gig work is classified has been hotly contested in different countries, with different outcomes. In California, leading up to elections in November 2020, Uber, Lyft, DoorDash, and Instacart reportedly poured $200 million USD into a successful campaign for a ballot initiative (Proposition 22) that now exempts their gig workers from employee classification and many of the benefits and protections that come with it. In September 2020, Spain’s Supreme Court ruled that drivers for the food delivery app Glovo, based in Barcelona, were employees, not freelancers. Regardless of employment classification, Fairwork argues in its assessments of gig platforms in different countries that there are principles, like fair pay, fair contracts, and fair management, that all platforms should adhere to. Some do better than others.
“One of our objectives as a union is to take control of our personal data,” says Farrar about the ADCU. In Europe, one tactic has been to support drivers in filing for access to personal information by leveraging provisions of the European Union’s General Data Protection Regulation (GDPR) that guarantee access to personal data held by companies. Even more leverage could be on the horizon, as improving protections for people working in the platform economy has been announced as a key new European Union initiative for 2021. Long term, the ADCU is working with Worker Info Exchange to collect and pool app-driver data in order to gain collective power in dealings with the companies [Mozilla is supporting this initiative with a grant through its Data Futures Lab]. But according to Worker Info Exchange, both Uber and another major platform, Ola Cabs, have deliberately frustrated data access rights through “lengthy and convoluted request processes often resulting in little or no data being provided”. In response, the ADCU and Worker Info Exchange brought a lawsuit in July 2020 on behalf of 13 drivers in the United Kingdom against both Ola Cabs and Uber in the Netherlands. They requested access to “secret worker performance profiles” and “fraud probability scores” that neither company wishes to disclose. Uber has publicly countered that it has shared everything drivers are entitled to and would “never share any data which would infringe the rights of riders, such as individual rider ratings, feedback, and complaints”. The ADCU and Worker Info Exchange brought an additional suit against Uber in October 2020 on behalf of eight drivers from the UK, the Netherlands, and Portugal who, they claim, were algorithmically fired or “robo-fired” over unspecified allegations of “fraud” that they each deny but have had no meaningful opportunity to appeal. Uber has countered that at least two humans review any decision to deactivate an account in Europe.
The three cases the ADCU and Worker Info Exchange have brought in the UK and the Netherlands could have lasting consequences for the rights of app drivers and other gig workers beyond Europe’s borders, too. As the conditions of workers become as dependent on their data contexts as on their physical conditions, the parameters of digital rights movements must shift as well. Or, as Farrar elaborates: “If we can’t access our digital rights, we won’t be able to access our worker rights.”
