- Taming the Algorithm - why workers need a voice on tech
Held as a TUC 2020 Congress fringe event on September 14, 2020, this webinar was attended by over 100 shop stewards, experts and unionists. Chaired by Prospect General Secretary Mike Clancy, it features Professor Lina Dencik, Andrew Pakes and Dr Christina Colclough on why workers need a seat at the table.
- Workers’ rights: negotiating and co-governing digital systems at work
Published by Social Europe on September 3, 2020, this article discusses why and how unions should negotiate for much stronger worker data rights and a seat at the table in governing algorithmic systems. Extract below; full article here.

Covid-19 has all too clearly shown how digital tools have become an integral part of work—be they used for online meetings or for the increasing surveillance and monitoring of workers at home or in the workplace. Indeed, as businesses try to mitigate the risks and get employees to come back to the workplace, they are introducing a vast array of applications and wearables. In many cases, employees are left to accept these new surveillance technologies or risk losing their jobs. Covid-19 has thus expanded a power divide which was already growing, allowing the owners and managers of these technologies to extract, and capitalise upon, more and more data from workers.

Strong union responses are immediately required to balance out this power asymmetry and safeguard workers’ privacy and human rights. Improvements to collective agreements as well as to regulatory environments are urgently needed. Coordinated action is needed to defend workers’ rights to shape their working lives, free from the oppression of opaque algorithms and predictive analyses conducted by known and unknown firms.

Negotiating the Data Life Cycle at Work
Unions must negotiate for much stronger data rights for workers across the four main stages of the data life cycle at work.

Co-governance of Algorithmic Systems
They also need to demand a seat at the table in the governance of algorithmic systems at work. Trade unions, especially within the GDPR zone, have a range of rights and tools to limit the threats to workers’ privacy and human rights. These should be utilised and urgently prioritised to prevent the further commodification of workers. Moving towards collective rights over, and co-regulation of, algorithmic systems is an important step in maintaining the power of unions. As the demand for digital tools to monitor and survey workers continues to rise, unions simply cannot afford not to give these issues utmost priority. Read the full article for a rundown of these key demands!
- Employers are tracking us. Let's track them back
Article by 5 Media based on an interview with me. Published September 4, 2020. Illustrations by Morten Voigt. Full article available on: https://fivemedia.com/articles/employers-are-tracking-us-lets-track-them-back/

Employers are using tech to track their employees ever more closely. Time for workers to reclaim their own data – and turn the surveillance back on their taskmasters, says Christina Colclough. Here’s what you need to know about your data, your rights, and how you can make sure they are protected.

Knowledge is power
“We have sleepwalked into this situation,” says Christina wistfully. “Tech has run amok in the last decades.” Indeed, it’s hard to ignore how technology has become an all-consuming aspect of modern life, while laws struggle to keep up. Many of the services that we all rely on are run by huge private businesses with more power and money than many countries, and require us to sign our data away with little understanding of where it really goes. In the modern economy, data is power. For employers, the power they hope to gain from intelligent data systems is even more direct: the promise of employee surveillance is to boost productivity, gain competitive advantage and thereby grow profits. It also cements the position of power that employers have over employees – hence Christina’s concerns.

Mine your own business
It’s time, Christina says, for the “workers to start kicking back”. This is the idea behind Christina’s new app, WeClock (available now for Android devices and in beta for Apple devices), which promises to “give work a reality check”. Using the app, workers can track things like how far they have to travel to work, whether they’re taking their allotted breaks, and how long they spend working out of hours. They can then share this data with their union, which can use it, not to sell them things, but as ammunition for the next negotiation. It also provides an accurate and up-to-date source of aggregate data about key issues affecting worker wellbeing. Ideally, Christina says, it would be unions who owned employees’ work data, which they could then allow employers to view (but not necessarily keep) on agreed terms. That relies on workers being unionised, which relies on them feeling a sense of common cause and solidarity. For Christina, it’s not about the technology, it’s about who is in control of it. She wants workers to question what they are being asked to accept as the new normal of workplace tracking. “Digitisation is here, so data will be created,” she says. “But who should have control and access over that data? Why exclusively the employers?”

Read the full article here: https://fivemedia.com/articles/employers-are-tracking-us-lets-track-them-back/
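WeClock itself is a mobile app and its actual implementation is not shown here; what follows is only a minimal Python sketch of the kind of aggregation the article describes: totalling minutes worked outside contracted hours from a worker's own timestamped activity log. The 09:00-17:00 contracted hours and the log format are illustrative assumptions.

```python
# Hypothetical sketch (not WeClock's code): tally minutes worked outside
# contracted hours from a self-logged list of (start time, duration) events.
from datetime import datetime, time

CONTRACT_START = time(9, 0)   # assumed contracted hours: 09:00-17:00
CONTRACT_END = time(17, 0)

def out_of_hours_minutes(work_events: list[tuple[datetime, int]]) -> int:
    """Sum the durations (in minutes) of work events that start outside contracted hours."""
    total = 0
    for started_at, duration_min in work_events:
        if not (CONTRACT_START <= started_at.time() < CONTRACT_END):
            total += duration_min
    return total

# Example: 45 minutes of email at 21:30 counts; an hour at 10:00 does not.
log = [(datetime(2020, 9, 1, 21, 30), 45), (datetime(2020, 9, 2, 10, 0), 60)]
print(out_of_hours_minutes(log))  # 45
```

Aggregated over weeks, figures like these are the sort of evidence a union could bring to a negotiation, while the raw log stays on the worker's device until they choose to share it.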
- Anti-worker Facebook?
Facebook has ignored our calls for an explanation and has brutally closed our Facebook page.

Tech for Good - a mission of mine. In a Lab I ran, we developed a self-tracking, privacy-preserving tool for workers. It's called WeClock. It's aimed at helping workers log and campaign on their modern conditions of work. No third-party data access, no secret extractive data practices. The data is the worker's until they choose to share it with whom they want: their union, their organiser, a campaign group.

Facebook A/B Testing
We ran the first round of Facebook A/B testing to find the name of the app: WeClock won above the other 4 suggestions we tested. The test reached people in 17 countries and of all ages. We were just about to launch our second round of testing - this time to see how we can reach our core audience. All set - ads made. We send them off to Facebook for review. Here's a sneak peek at two of them:

And then what? Facebook bans our page!!! No pre-warning, no explanation other than a generic, probably bot-generated one. No chance to delete an ad, or correct it. A full and outright ban. We reply and ask for an explanation. See our email to them and their reply back:

So now what? It's time we kick up a fuss. WeClock is an open source, privacy-preserving tool built for workers, by workers. It's a tool to help log, share and analyse data and get workers' stories told. I cannot help but suspect that Facebook is blocking us for being a worker-empowering tool. We will continue and follow up. In the meantime, please help us and retweet our messages. Follow @weclockit and @cjcolclough on Twitter.
- Data for Unions
In this webinar from July 16, 2020, organised by Unions21, Andrew Pakes (Prospect) and Dr Christina Colclough discuss what tools unions could use to boost (or begin) their data storytelling journey. Hear about WeClock, Lighthouse and Digital Maturity Frameworks, and why unions must urgently work together to champion a new digital ethos that puts workers at its core.
- Govern That Data - Here's How!
Click here for my comments on the use of an online tool called Lighthouse that we developed to help you, and your organisation, become stewards of good data governance!

More and more trade unions and civil society organisations are embarking on gathering and using data. This raises a number of highly important governance issues. Is the data appropriate to the cause? Who in the organisation has access to it? What is it used for, by whom and when? The UK union Prospect has worked with a leading data governance expert and has created an online tool for all to use. The online guide - or quiz as Prospect calls it - is appropriately called Lighthouse. It will help you navigate through the potentially stormy and dangerous waters of becoming a datafied organisation. It's not hard to imagine the severe consequences for organisations if there is a data breach, or a misuse of personal, or personally identifiable, information. Member-driven organisations often hold what the GDPR, Article 9(1), calls 'special category data'. Such data requires more stringent data processing arrangements. But unions also hold highly important data such as job titles/categories, contracts, wage levels, career movements, members' age, location, workplace and much more. Being vigilant about the handling of these data is more important than ever.

Lighthouse at work
The introduction sets the stage: "Why do this now? As digital tools play increasingly vital roles in shaping our societies, allocating our rights and resources, and fueling our economies, thoughtful approaches to digital governance are more critical than ever." The quiz has six main sections: 1. Write a Plan, 2. Build a Community, 3. Handle Data, 4. Assign Responsibility, 5. Write Rules and 6. Manage Risk. Under each, you get to carefully choose where your data project/organisation is in relation to the questions. It's tempting to aim for the delighted face, but try to select the option that most resembles the situation for your project! You are guided through the six sections and even invited to take a break halfway through. When you are done you will see your overall score and get great tips as to where, why and how you could improve your data governance to really make a positive change. (A minimal sketch of this kind of section-by-section scoring follows at the end of this piece.)

Made for Unions
This guide is made for unions. It takes its point of departure in the working realities of modern-day unions as they become more and more digitalised. It is truly amazing that Prospect offers the guide to the wider world, free to use and get inspired by. In line with, well, good data principles, the tool is privacy-preserving, using no cookies and no tracking. Andrew Pakes, Research Director of Prospect Union, reflects: "Since the GDPR came into force, we have worked quite intensely on improving our data handling, storage and use. It has, to be honest, not always been an easy task as there were/are quite a few disagreements about how to interpret some of the provisions in the GDPR. Lighthouse gives us the right foundation to think sensibly and carefully about how we use the data we have, and improve our own internal practices." I couldn't help following up with a big 'why' question, namely: why have you made the tool public for all to use? Pakes smiled and responded: "There is no end to the consultants who want to help us unions become more digitalised. But they often cost more than we possibly could pay. We were fortunate to be part of a project that enabled us to work closely with one of the world's leading experts on responsible data governance, Keith Porcaro, and get Lighthouse tailor-made for us. So why not share? Stronger together."

I can only salute this willingness to share and grow together. What might seem like a dull, inward-looking topic is a really important part of the digital transformation of all organisations. The quiz ends by summarising your scores and offering heaps of good advice, literature and links to learn more. Kudos to Prospect for sharing this wonderful tool.
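Lighthouse's internal scoring is not published in this piece; purely as an illustration, here is a hypothetical Python sketch of how a section-by-section self-assessment like the one described could be tallied. Only the six section names come from the article; the 0-3 answer scale and the averaging are assumptions, not Prospect's actual methodology.

```python
# Hypothetical sketch (not Lighthouse's methodology): average each section's
# answers on a 0-3 scale and report an overall governance score.
SECTIONS = [
    "Write a Plan", "Build a Community", "Handle Data",
    "Assign Responsibility", "Write Rules", "Manage Risk",
]

def governance_score(answers: dict[str, list[int]]) -> dict[str, float]:
    """answers maps a section name to the 0-3 marks chosen for its questions."""
    scores = {}
    for section in SECTIONS:
        marks = answers.get(section, [])
        scores[section] = round(sum(marks) / len(marks), 2) if marks else 0.0
    scores["overall"] = round(sum(scores[s] for s in SECTIONS) / len(SECTIONS), 2)
    return scores

# Example: strong on planning, weaker on risk management.
print(governance_score({"Write a Plan": [3, 2, 3], "Manage Risk": [1, 0, 1]}))
```

The point of a tally like this is less the overall number than the per-section breakdown, which shows where a data project most needs attention.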
- Are you involved? Data Protection Impact Assessments at Work
This blog argues that workers in the GDPR zone have a potentially very powerful tool at their disposal to govern and oversee the use of technologies in workplaces. That is, assuming the GDPR's provisions are followed. The problem is, employers seldom follow them, and unions will need to push for, and exercise, their rights.

Under the auspices of the European General Data Protection Regulation (Article 35), organisations are obliged to conduct a Data Protection Impact Assessment (DPIA): "Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data." In plainer language, this means that a DPIA must be made when the processing of data is likely to result in a high risk to individuals. The European Data Protection Board (EDPB) elaborates, and recommends that each country make reference to, and follow, the Working Party Guidelines regarding DPIAs, and require DPIAs if any two of the following nine criteria are present:
1. evaluation or scoring (which would include employee performance evaluations and applicant evaluations);
2. automated decision making;
3. systematic monitoring;
4. sensitive data or data of a highly personal nature;
5. data processing on a large scale;
6. matching or combining data sets;
7. processing data of vulnerable subjects, which include children, the elderly, and employees;
8. innovative use or application of technological or organisational solutions, such as using fingerprints or facial recognition for physical access control; and
9. processing that "prevent[s] data subjects from exercising a right or using a service or a contract."

It is not hard to reach two of the nine in many workplace technologies. Point seven is a given, as we are talking about employees. Automated or semi-automated employee performance evaluations and automated or semi-automated hiring systems are also becoming more widespread. Systematic monitoring has been reported to be on the rise, accelerated by the Covid-19 crisis (https://t.co/RZSgXsTKsF?amp=1, https://www.bloomberg.com/news/features/2020-03-27/bosses-panic-buy-spy-software-to-keep-tabs-on-remote-workers). Or simply when new technologies are used to collect, process, and manage data (which indeed most digital tools do). (A minimal sketch of this 'two of nine' check appears at the end of this piece.)

Two important (but oft overlooked) requirements
So we have established that DPIAs will have to be made by employers, and actually quite regularly so. We have also established that DPIAs must be conducted when they include the processing of employee data or when new digital technologies are introduced. There are, however, two more requirements that all workers, shop stewards and unions should pay particular attention to, namely: 1. that your views should be sought, and 2. that DPIAs should be revisited and re-evaluated periodically.

Let's take point 1 first. The Guidelines stipulate that the data controller (the employer in the case of workplaces) must "seek the views of data subjects or their representatives" (Article 35(9)), "where appropriate". The Working Party considers (page 15) that:
- those views could be sought through a variety of means, depending on the context (e.g. a generic study related to the purpose and means of the processing operation, a question to the staff representatives, or usual surveys sent to the data controller's future customers), ensuring that the controller has a lawful basis for processing any personal data involved in seeking such views. Although it should be noted that consent to processing is obviously not a way for seeking the views of the data subjects;
- if the data controller's final decision differs from the views of the data subjects, its reasons for going ahead or not should be documented;
- the controller should also document its justification for not seeking the views of data subjects, if it decides that this is not appropriate, for example if doing so would compromise the confidentiality of companies' business plans, or would be disproportionate or impracticable.

Have you ever been asked?
Now, I would not suggest that workers accept the wording "a question to the staff representatives". A DPIA requires much more than a single question. I would urge you to push for far more. A pertinent point to make is: have you ever been asked? It would be wrong to assume that an impact assessment is truthful in its evaluations if you are not involved. Risk is relative to the law, but also to lived experiences. Risk to whom? To the workers? To their privacy rights, human rights, sense of dignity? A truthful impact assessment would involve and consider all voices of relevance to what is under study.

Have you ever been party to a re-evaluation?
The second point relates to the periodic re-evaluation of DPIAs. The Working Party adds on page 20: "periodically review the DPIA and the processing it assesses, at least when there is a change of the risk posed by processing the operation". A re-evaluation of an impact assessment might sound boring at first, but it is actually really important. Some AI systems or algorithms learn to learn. They are self-adjusting. We have heard of the cases where a bot learnt to swear (bbc.com/news/technology-35890188), or an automated hiring system learnt to only hire men (shorturl.at/iouAV). So what seemed like a low risk yesterday might well be a high risk going forward. Potentially, these risks could relate to your human rights, your right to organise, your right to equal treatment. You should, and really must, demand to be party to the re-evaluation.

Where to go from here?
In a world where our fundamental rights are at stake, the issue of governing AI and all digital technologies should be a top priority for workers and their unions. Yes, this is a new field in which we need to build expertise. But let's ask the question: what will be the consequences if we don't get involved? We will blindly trust that management has understood our needs, fears and risk assessments. We will also, through our non-action, blindly trust an algorithm, a commodity, a thing, to do what is fair and good. But fair for whom, we must ask? Anti-discrimination law will be in vogue soon again. Many of the national data protection authorities have guidelines on how to conduct a DPIA. See the UK's site here. Informative as it is, it is weak on guidance to workers and on the involvement of workers in DPIAs. The Danish one doesn't seem to include these provisions at all. So my call to action would be to urge unions and their representatives to prioritise these issues and utilise the rights to be heard that the GDPR gives you. Companies have a duty to involve the shop steward or workers. It's about time they were reminded.
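As promised above, here is a minimal sketch of the Working Party's 'two of nine' rule of thumb. The criterion labels paraphrase the nine criteria listed in this piece; the function, threshold parameter and example are illustrative assumptions only and are no substitute for a legal assessment.

```python
# Hypothetical sketch: a DPIA is likely required if at least two of the
# nine WP29 criteria apply to a processing operation.
WP29_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_making",
    "systematic_monitoring",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",      # employees always fall under this one
    "innovative_technology_or_organisational_solutions",
    "prevents_exercising_rights_or_using_services",
}

def dpia_likely_required(present: set[str], threshold: int = 2) -> bool:
    """Return True if at least `threshold` of the nine criteria are present."""
    unknown = present - WP29_CRITERIA
    if unknown:
        raise ValueError(f"Unknown criteria: {unknown}")
    return len(present & WP29_CRITERIA) >= threshold

# Example: a workplace tool that scores employee performance. Employees are
# vulnerable data subjects (criterion 7) and scoring is criterion 1, so two
# criteria are met and a DPIA is likely required.
print(dpia_likely_required({"vulnerable_data_subjects", "evaluation_or_scoring"}))  # True
```

If a tool trips the threshold, the follow-on questions in this piece apply: were workers' views sought, and will the DPIA be periodically re-evaluated?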
- Future of Work - data, AI & workers
In this recording, Andrew Pakes of Prospect and Dr Christina Colclough speak about why workers should be involved in the governance of data-driven and data-generated systems at work. Recorded at the Digital Leaders conference on June 18, 2020, we look at where workers fit into the future of work. Who are AI and new technology accountable to? And who decides?
- Rewarding Work
Click below for a short narration on the concept of Rewarding Work.

The concept 'Rewarding Work' frames what work, both now and in the future, should be all about. It's an evolution of the concept of Decent Work. Rewarding work is both a verb, an action: something we as individuals, or as a society, do. We reward the work of others, either through praise, appraisal, good working conditions or monetary rewards (pay), or by honouring and rewarding the environmentally friendly nature of the work. Rewarding work is also an adjective: it qualifies the noun "work". Work can be decent, stressful or exploitative; it can also be Rewarding. As an adjective, Rewarding Work is a feeling, an attitude about one's own work. Do I feel fulfilled? Satisfied? Seen and appreciated?

Defining Rewarding Work
Work should be Rewarding in many forms:
- financially (all workers should earn at least a living wage, rents should be controlled);
- socially (all workers should have the right to thrive at work but also outside of work, be part of a collective, have the possibility to claim their rights and have them enforced);
- emotionally (no worker should be abused, exploited, harassed or lonely at work, nor should workers suffer mentally due to exploitative working conditions and contracts);
- environmentally (we all should care about our environment, not least the companies we work for);
- contractually (all workers should have a contract stipulating their rights and the company's duties in relation to these rights, not least workers' digital rights);
- intellectually (all workers should feel their competencies and skills are respected and desired, and that competency growth is a natural part of work);
- digitally (no worker should feel that their fundamental rights are exploited at work. All workers should have agency and influence over digital technologies used by employers. This includes establishing collective data rights for workers and having systems in place for the co-governance of digital technologies at work).

Work should be fulfilling. Imagine if it were so
None of the dimensions of rewarding work mentioned above are impossible or far-fetched. If we choose to make it so, we can. Let's for a second loosen ourselves from the shackles of the present and imagine a rewarding world of work. What would it look like?
- The working poor would be a concept of the past. Zero-hour contracts too.
- We would have a balance of interests in the market, moving beyond narrow measures of productivity and efficiency into a broader definition of success along the lines of Rewarding Work.
- Forced self-employment would be stopped, not least because the economic incentives to put folks onto precarious contracts would be removed.
- Workers' mental health would be vastly improved.
- We would know that the labour we pour into work is not ultimately detrimental to our mother earth.
- We would have structures and institutions in place to co-create a digital world of work so it empowers, not enslaves, workers and indeed society.
- The informal economy, which in some countries makes up to 93% of all jobs, would shrink.
- Domestic work would be valued once more.

Tech could help
And a lot more could be written here, and will be as time goes by. One thing I wish to add now is that a responsible application of digital technologies could serve to support the enforcement of rewarding work. This will require a strong, trustworthy state and new sets of regulation to promote the dimensions of rewarding work.
- Negotiating workers' data rights
Click below for an extended commentary on why we must begin to negotiate workers' data rights.

Companies and organisations increasingly collect, use, analyse, store and sell data and datasets. This occurs through the use of any (semi-)autonomous digital system, such as surveillance and monitoring tools, automated hiring systems, scheduling tools or robots. The question is: what rights do workers and staff representatives have to this data, and to the inferences made on them? To give an idea of where unions could step in to strengthen workers' data rights, or where regulation needs to be improved, I have put together this figure: the data life cycle at work. As Professor Sandra Wachter so excellently explains in this long-read, the "Right to Reasonable Inferences", workers and citizens have very few, if any, rights over the inferences made on them. Her article is a must-read, and establishes why, in my case, workers must negotiate for much stronger #datarights. Use the figure below to get inspired to negotiate on the data life cycle at work. #datalifecycle #datarights
- AI Principles
Written for UNI Global Union in 2017, this set of AI principles was amongst the first to be written. As artificial intelligence (AI), robotics, data and machine learning enter our workplaces across the world, displacing and disrupting workers and jobs, unions must get involved. This document provides unions, shop stewards and workers with a set of concrete demands on the transparency and application of AI. It will inform AI designers and management of the importance of worker inclusion. There is a definite urgency of now: action is required by unions and multinational worker alliances to safeguard workers' interests and maintain a healthy balance of power in workplaces. #AI #ethicsinAI #AIprinciples
- Workers' Data Rights
This document fills an enormous gap with regard to workers' rights in the new world of work. Written for UNI Global Union in 2017, it is one of the world's first policies for improving workers' data rights. As management increasingly uses data to hire, fire, promote and discipline workers, we demand the right to an explanation of what data they use, how they store this data, where they got it from, and what they will do with it. Without these rights, we will forever be subject to unilateral data-informed managerial decisions. It gives you 10 operational principles for workers' data rights and protection, to slot into collective bargaining, Global Framework Agreements and multinational alliances. #datarights #dataproduction
- Data Cooperatives
Digital Empowerment of Citizens and Workers

This white paper argues that we must move from an individualized, asset-based understanding of data control to a new collective system based on rights and accountability. It argues for a key role for trade unions and credit unions in this process. A white paper by Pentland, A. and Hardjono, T. (MIT Connection Science); Penn, J. and Colclough, C. (UNI Global Union); Ducharme, B. and Mandel, L. (MIT Federal Credit Union).

During the last decade, all segments of society have become increasingly alarmed by the amount of data, and resulting power, held by a small number of actors. Data is, by some, famously called ‘the new oil’, and comes from records of the behavior of citizens. Why then is control of this powerful new resource concentrated in so few hands? During the last 150 years, questions about concentration of power have emerged each time the economy has shifted to a new paradigm: industrial employment replacing agricultural employment, consumer banking replacing cash and barter, and now ultra-efficient digital businesses replacing traditional physical businesses and civic systems. As the economy was transformed by industrialization and then by consumer banking, powerful new players such as Standard Oil, J.P. Morgan, and a handful of others threatened the freedom of citizens. In order to provide a counterweight to these new powers, citizens joined together to form trade unions and cooperative banking institutions, which were federally chartered to represent their members’ interests. These citizen organizations helped balance the economic and social power between large and small players and between employers and workers.

The same collective organization is required to move from an individualized, asset-based understanding of data control to a collective system based on rights and accountability, with legal standards upheld by a new class of representatives who act as fiduciaries for their members. In the U.S., almost 100 million people are members of credit unions: not-for-profit institutions owned by their members, and already chartered to securely manage their members’ digital data and to represent them in a wide variety of financial transactions, including insurance, investments, and benefits. The question then is: could we apply the same push for citizen power to the area of data rights in the ever-growing digital economy? Indeed, with advanced computing technologies it is practically possible to automatically record and organize all the data that citizens knowingly or unknowingly give to companies and the government, and to store these data in credit union vaults. The MIT Trust Data Consortium has built and demonstrated pilot versions of such systems already. In addition, almost all credit unions already manage their accounts through regional associations that use common software, so widespread deployment of data cooperative capabilities could become surprisingly quick and easy. As a consequence, it is technically and legally straightforward to have credit unions hold copies of all their members’ data, to safeguard their rights, represent them in negotiating how their data is used, alert them to how they are being surveilled, and audit the companies using their members’ data. The power of 100 million US consumers who are practically and legally in control of their data would be a force to be reckoned with by all organizations that use citizen data, and would be one very decisive way to hold these organizations accountable.
The same potential for credit unions to balance today’s data monoliths exists in most countries around the world. It is critical to note that credit unions and similar organizations have fiduciary responsibilities to protect the sensitive information that is shared by members, as this is a central element in bringing data rights to the membership. As a consequence, members will gain privacy, transparency, and empowerment as to data use, and can direct that use to their collective best benefit as they see fit.

Who will lead this historic and necessary transformation? The answer could well be trade unions. Many credit unions are directly associated with trade unions, have the same membership pool, and are chartered to represent their members in transactions related to their employment. Worldwide, this transformation could be led and coordinated by national and global trade unions. A critical first step will be to affirm workers’ data rights, primarily through collective bargaining but also through legislation and regulation. Such rights would protect against manipulation, discrimination, and unreasonable surveillance. This dialogue would also address those aspects of decent work that should not be quantified.

The ability to balance the world’s data economy depends on creating a balance of stakeholders. Today citizens and workers have no direct representation at the negotiation table, and so lose out. By leveraging cooperative worker and citizen organizations that are already chartered in law virtually everywhere in the world, along with technology that has already been demonstrated, we can change this situation and create a sustainable digital economy that serves the many, and not just the few. #whitepaper #datatrust #datacooperatives
- Mind the App
The Covid-19 crisis must not lead to a watering down of human rights and workers’ rights in favor of quick-fix solutions. Article co-authored with Professor Valerio De Stefano and first published on BotPopuli. Read the excerpt below; get the full version here.

Main points: Whilst it is pertinent that our economies and societies begin to open up, without additional measures in place contact-tracing apps could create a false sense of security, putting human lives at risk. Sound governance mechanisms need to be put in place urgently, together with increased Covid-19 testing of many more citizens, before economies are hastily opened. There is also the need for far more cooperation between the state, employers, and unions prior to and during any app roll-out and implementation.

If This Is About the Economy, Involve Businesses and Unions/Workers
It is easy to assume that governments are supporting and hastily launching contact-tracing apps as a means to open up businesses again and get workers back to work. If this assumption holds true, it will be vital to include employers and unions in the planning, rollout, and deployment of these apps. In this regard, the following conditions, in addition to the ones above, should immediately be considered:
- That governments establish a tripartite body aimed at evaluating the apps’ risks and implications for the return to work, including for workers’ physical and mental health.
- That any aggregated data be made available to employers and unions for review and interpretation.
- That whistleblowing systems are established where persons can safely report misuse or abuse of the app’s intentions. Sanctions should be established for breaches of these conditions.
In workplaces, the following conditions must be applied:
- To maintain that the use of location-tracing apps is voluntary, employers must be prohibited from demanding that a worker downloads and uses the app as a precondition of return to work. This includes through seeking “informed consent” from the workers individually and/or collectively.
- That no worker should be forced to hand over app data to their employer as a form of monitoring.
- That workers have a right to share app data with their union for evaluation and risk assessment.
- That all existing health and safety rules and agreements are followed.

Whilst time is of the essence, so is caution. This crisis must not lead to the watering down of human rights and workers’ rights in favor of a quick-fix, app-based solution set on a very weak foundation. We need proper sunset clauses, transparency, and auditability arrangements, as well as institutional frameworks and infrastructure to deal with this. But first and foremost, we need much more testing.
- When Algorithms Hire and Fire
Take a second and consider: would you have your job if an algorithm had been in charge of hiring you? (This article first featured in ICTUR volume 25, issue 3, 2018.)

Think about your financial records, your health file, your friends on social media. Are you a member of a trade union? Do you own a Fitbit? What are your shopping habits and what do you do in your spare time? And then ask, how would all of this affect your work life? Would you get hired, fired, disciplined or promoted? What seems like a bizarre question is in fact one that we all need to think about and react to. ‘Management-by-algorithm’ is spreading, and more and more data from many different sources is used in HR processes. Critically, across the world, bar to a certain extent in Europe, there are very few regulations in place that protect against the misuse of workers’ personal data in and by companies. Trade unions must fill this regulatory gap and put workers’ data rights on the agenda to hold management and governments accountable and responsible.

What’s all this about data?
We are leaving a data trail behind us all the time. From our social media profiles, our likes and posts, to customer service phone calls, visits to the doctor, use of our GPS or cash withdrawals from the bank. We acceptingly give away our names and email addresses when we log on to free Wi-Fi hotspots in cafes, airports or train stations, and we have more or less become so accustomed to “free” digital services that we almost get irritated when a mobile app costs a dollar. The thing is, nothing is free. What we have been doing, and still are doing, is freely and oftentimes willingly giving away our location, habits, activities and opinions. In other words, we are paying with our data.

Surveillance, manipulation and algorithmic control
Whilst our eyes have been slightly opened by revelations of how data was used to target and manipulate voters, such as in the US election and the Brexit result, politicians and experts afford very little attention to how data is used, and potentially misused, in relation to work. There is a sharp rise in the use of algorithms, data and artificial intelligence (AI) in human resources and productivity planning. Companies are popping up that offer AI solutions to cut the costs of dealing with people: from autonomous sorting of job applicants and applications, to the use of extensive data to measure productivity, to employee mood testing, to ways to automatically find out what motivates you, and much, much more. UNI Global Union is working on these issues across the world. We are discussing how we, the unions, can tap into the significance of datasets and benefit from the insights they can offer. We are raising our voices against the monopolisation of data ownership and asking whether data should be made a commons: a public good that can be accessed by us all. One thing is to protect our fundamental rights; the other is to take that one step further and demand collective ownership of data. Both are equally important. We have also written two key documents, namely the Top Ten Principles of Workers’ Data Privacy and Protection and the Top Ten Principles of Ethical AI. The documents are interrelated and list the essential demands we must put in place to avoid a future where workers are subjected to algorithmic decision making that is beyond human control and insight.

Act Now!
Unions across the world must address these fundamental issues. We simply cannot rely on others to do so. Digital technologies are developing at great speed, and our ethical demands of them must be clear. We cannot risk people being prevented from working or thriving in the labour market due to an algorithm that nobody claims to control and nobody can rectify. UNI Global Union believes that collective ownership of data, ethical AI and workers’ data rights are the key issues for unions. We must commit management as well as governments to take responsibility. Only by doing so can we ensure a digital world of work that is empowering, inclusive and open to all. Read the full article here. See the whole magazine here. #AI #algorithms #datarights #inferences
- Economic Rights in a Data-Based Society
Organised by PSI (the global union for public services), IT for Change and FES, this webinar discusses why our collective future depends on whether we can ensure broad sharing of digital data with due individual and group protection. Recorded May 19, 2020.