52 results found

Blog Posts (44)

  • Det handler ikke kun om dig… (It's Not Just About You)

    The newest article in the series on the datafication of our work and workplaces was published today, written for Forsikringsforbundet in Denmark. "This is about how our lives can be shaped and constrained by a mass of data profiles we know nothing about, built by people we have most likely never met. Conversely, profiles built on our own actions affect others. We need rights over these profiles. We must know what they are and what they are used for, and be able to object to them, have them corrected or even deleted. And we must be able to prohibit them from being sold to third parties." In the article I argue why a strong union response is required to protect workers' human rights, their privacy, and their right to shape and create their lives free from algorithmic manipulation. Read the full article here:

  • Audits & Impact Assessments 2.0

    Across the world, a new wave of audits and impact assessments for digital technologies is popping up. None of them include workers in the process. This must change, this blog argues. Jonathan Guy from the Australian Education Union (AEU) explains in this powerful article why unions need to be party to audits concerning the need for, use and impact of digital technologies. Jonathan provides a convincing case: in response to the closure of schools during the pandemic, the Australian Federal Government lowered the price of broadband. But research the AEU had just conducted showed that although only a small proportion of all students in Australia (5%) lack internet access on any device, public school students are overrepresented among those without access: 125,000 of them have no internet access on any device, and they were 2.5 times more likely than private school students to have no internet access at home. Aboriginal and Torres Strait Islander public school students were four times as likely as non-Indigenous students to have no internet access at home (21% vs 5%). The study also revealed that almost a third of students living in very remote areas have no internet access. Students from low-income households, 80% of whom attend public schools, are nine times more likely to lack internet access at home than students from high-income households. Guy rightly remarks that the government initiative is of little use to those without devices or existing broadband connections. He calls for digital equity audits, carried out at a national level together with education unions, in order to provide evidence for comprehensive action plans.
They must also take into account the relationship between COVID-19-related remote learning and ongoing disadvantage due to lack of digital inclusion, and the potential long-term impact on student achievement of home internet access, family income, remoteness, mobility, family type, English proficiency, disability, housing, and Aboriginal and Torres Strait Islander status. The point here is that had the Australian government reached out to the very unions who know what is at stake in the education sector, its federal solution would have been far more nuanced.

Back in Vogue

The call from Jonathan Guy is significant, and unions from all sectors should echo it. However, whilst audits and impact assessments are back in vogue across the world, and many models are being created, none, simply none, include the workers and their unions. The Ada Lovelace Institute's 2020 report "Examining the Black Box" includes an overview of approaches to assessing algorithmic systems; again, none include the workers or the unions. ForHumanity, an all-volunteer organisation, aims to bring together experts who are convinced that mitigating risk from the perspectives of ethics, bias, privacy, trust and cybersecurity in autonomous systems will lead to a better world. They created a taxonomy that distinguishes between third-party independent audits, internal audits, assurance, consulting and more. Again, nothing is said about the role of employees or unions in participating in, or approving, audits.

Unions must respond

Whilst industry in particular is pushing for (albeit slightly improved) audits and impact assessments, we should also expect that it is doing so in the hope of avoiding more intrusive regulation. In my work in the OECD One AI expert group and elsewhere, I have read a growing number of company audits and impact assessments. None of them include the workers' voice.
This is the case even within the EU, where companies are actually obliged to create data protection impact assessments (DPIAs) and to consult with the workers when digital technologies process workers' personal or personally identifiable information.

So what's the problem?

Whilst the majority of the audits I have seen do include articles on human rights, social, fairness and equity impacts, if these audits are conducted by management alone, it is, honestly, hard to take them seriously. We must, for example, ask: fair for whom? For an individual, or for groups, and which groups? What is fair for one group might be very unfair for another. What compromises is the company making, and do the workers agree? If not, what are the remedies? The failure to meet the real challenges in Australia, as portrayed in Jonathan Guy's blog, clearly shows how the government response could have been far better had the AEU been included. By excluding the staff reps/shop stewards, and therefore the voice of the workers, companies risk approving potentially highly discriminatory algorithmic systems. Trade unions have traditionally been the guardians of inclusive and diverse labour markets. The staff reps/shop stewards are also those closest to the workers. They know the sentiments of their colleagues and the lived experiences of discrimination, privacy violations and exclusion. No workplace or labour market audit system or impact assessment is worth the paper it is written on if the voice of the workers is not an equal partner in its formation.

  • Gig Workers Fighting Back

    A new article in Wired by journalist Aarian Marshall, "Gig Workers Gather Their Own Data to Check the Algorithm's Math", features WeClock, our self-tracking app for workers. Read the full article here; excerpt below. Uber Eats delivery worker Armin Samii found that the company might have underpaid him by not considering the route he had to follow. Samii is a software engineer, so he created a Google Chrome extension, UberCheats, that helps workers spot pay discrepancies. The extension automatically extracts the start and end points of each trip and calculates the shortest travel distance between the two. If that distance doesn't match up with what Uber paid for, the extension marks it for closer examination. So far, only a few hundred workers have installed UberCheats on their browsers. Samii doesn't have access to how couriers are using the extension, but he says some have told him they've used it to find pay inconsistencies. Google briefly removed the extension last week when Uber flagged it as a trademark violation, but reversed its decision after Samii appealed. The digital tool joins others popping up to help freelancers wrest back control over work directed by opaque algorithms, with pay structures that might change at any time. The need has only grown during the pandemic, which has seen companies like DoorDash, Amazon, and Instacart hire more contractors to support spikes in demand for deliveries. The expansion of the gig economy might be here to stay: the US Bureau of Labor Statistics projects the "courier and messenger" sector could grow 13 to 30 percent more by 2029 than it would have without a pandemic. Globally, up to 55 million people work as gig workers, according to the research and advocacy group Fairwork.
The projects stem from practical need. In the US, many gig workers keep track of their miles and expenses for tax purposes. But the projects also grow out of workers' growing mistrust of the companies that pay their wages. "I knew about gig companies' business decisions that meant they weren't paying well," says Samii. But he says he hadn't thought the apps might "pay for less work than you actually did." The tools are particularly helpful to gig workers because of their low wages, and because it can be hard for isolated workers to share or find information about how the job pays, says Katie Wells, a research fellow at Georgetown University who studies labor. "Things are changed and hidden behind an algorithm, which makes it harder to figure out what you're earning and spending and whether you're getting screwed," Wells says. [...]

Driver's Seat

But some workers have been drawn to homegrown tools built by other gig workers, and the idea that they might themselves profit off the information that companies collect about them. Driver's Seat Cooperative launched in 2019 to help workers collect and analyze their own data from ride-hail and delivery apps like Uber, Lyft, DoorDash, and Instacart. More than 600 gig workers in 40 cities have pooled their information through the cooperative, which helps them decide when and where to sign on to each app to make the most money, and how much they are making after expenses. In turn, the company hopes to sell the data to transportation agencies interested in learning more about gig work, and pass on the profits to cooperative members. Only one city agency, in San Francisco, has paid for the data thus far, for a local mobility study that sent $45,700 to Driver's Seat. [...]
WeClock

An open source project called WeClock, launched by the UNI Global Union, seeks to help workers collect and then visualize data on their wages and working conditions, tapping into smartphone sensors to determine how often they're sitting, standing, or traveling, and how they feel when they're on the job. Once it's collected, workers control their own information and how it's used, says Christina Colclough, who helped build the app and now runs an organizing consultancy called the Why Not Lab. "We don't want to further the surveillance that workers are already subjected to," she says. "We don't want to be Big Tech with a conscience." Colclough hopes that, eventually, workers might use WeClock to show they're working longer hours than agreed. For now, the app is being used by 15 freelance UK TV production workers, who say that production companies don't always pay fair wages for all the work they do. The participants in the pilot use Apple Watches to track their movements while on set. "I love my job," says one production sound crew worker, who is using WeClock. (The worker asked not to be named, for fear of retaliation in a close-knit industry.) "But I hope this can help expose a little bit of the ridiculous hours we work." Read the full article on Wired: https://www.wired.com/story/gig-workers-gather-data-check-algorithm-math/
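The pay check that UberCheats is described as performing (compare the distance paid for against the shortest travel distance between a trip's start and end points) can be sketched roughly like this. This is a hypothetical illustration, not Samii's actual code: the straight-line haversine distance is used as a conservative lower bound on travel distance (true shortest-route distance would need a routing API), and the function names, tolerance and sample coordinates are all invented for the sketch.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_underpaid_trips(trips, tolerance=0.10):
    """Return the trips where the distance paid for falls more than
    `tolerance` (as a fraction) below the straight-line lower bound
    on the distance actually travelled."""
    flagged = []
    for trip in trips:
        lower_bound = haversine_km(*trip["start"], *trip["end"])
        if trip["paid_km"] < lower_bound * (1 - tolerance):
            flagged.append(trip)
    return flagged

# Invented sample trips: (lat, lon) start/end plus the distance Uber paid for.
trips = [
    {"id": "a1", "start": (40.4406, -79.9959), "end": (40.4570, -79.9320), "paid_km": 2.0},
    {"id": "a2", "start": (40.4406, -79.9959), "end": (40.4446, -79.9900), "paid_km": 0.8},
]
print([t["id"] for t in flag_underpaid_trips(trips)])  # prints ['a1']
```

Since straight-line distance can only understate the route actually driven, any trip flagged this way genuinely deserves the "closer examination" the article describes; the tolerance just absorbs GPS noise and rounding.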

View All

Pages (8)

  • About | The Why Not Lab

    About the Why Not Lab

The Why Not Lab is a boutique, value-driven consultancy that puts workers at the centre of digital change. We offer our expertise exclusively to progressive organisations, trade unions and governments. The Why Not Lab has a two-fold mission to ensure that the digital world of work is empowering rather than exploitative. We:

  • Equip workers and their unions with the right skills, know-how and know-what to ensure collective rights in the digital age;
  • Put workers' interests centre stage in current and future digital policies.

Both are necessary. Politically, even in discussions on the future(s) of work, workers' interests are seldom heard or even considered. This must change. To bridge digital divides and prevent the objectification of workers that is currently underway, workers must be empowered so they can table an alternative digital ethos. The Why Not Lab aims to support exactly this through our training, policy and strategic support.

The Why Not Lab is run by Dr Christina J. Colclough, a fearless optimist who believes that change for good is possible if we put our minds and hearts to it. She works with experts and partners across the world to provide the best advice at all times. Read more about Dr Colclough below.

Please note: We believe all workers have the right to Rewarding Work in the digital age. We have therefore adopted a differential pricing principle so we can support workers and organisations from all regions of the world. Do contact us with any inquiries.

Dr Christina J. Colclough

Regarded as a thought leader on the futures of work(ers) and the politics of digital technology, Christina is an advocate for the workers' voice. She has extensive global labour movement experience, where she led future of work policies, advocacy and strategies for a number of years. She was the author of the union movement's first principles on Workers' Data Rights and the Ethics of AI. A globally sought-after keynote speaker and workshop trainer with over 150 speeches and trainings in the last three years, Christina created the Why Not Lab as a dedication to improving workers' digital rights. She is included in the all-time Hall of Fame of the world's most brilliant women in AI Ethics.

Trusted Positions

Christina is a Member of the Steering Committee of the Global Partnership on AI (GPAI) and an Advisory Board member of the Carnegie Council's new AI and Equality Initiative. She is, furthermore, a member of the OECD One AI Expert Group and the UN Secretary-General's Roadmap for Digital Cooperation, and is affiliated to FAOS, the Employment Relations Research Center at Copenhagen University.

Testimonials

John C. Havens, E.D., IEEE Global Initiative on Ethics of Autonomous & Intelligent Systems & Council on Extended Intelligence: "In an environment where rhetoric often rules all, Christina provides hard-hitting yet pragmatic and solutions-oriented counsel on issues including the future of work, human autonomy, human rights, and technology governance in general. She is my 'go to' person on any issues related to AI and the future of work based on her specialized knowledge of workers' rights and actual global policy and economics relating to these issues versus only aspirational techno-utopian ideals. She is also a gifted and personable speaker, transforming highly nuanced and complex technical and political issues into conversational, story-oriented speeches."

  • Podcasts on work in the digital age | The Why Not Lab

    Podcasts << Jump to Speeches

Disrupted Asia - AI, data rights & futures of work 🎧 Podcast: Jan 26, 2021
Interviewed by FES in Asia for their podcast series Disrupted Asia. This podcast analyses the effects of digitalisation, artificial intelligence and data protection in the workplace. We talk about worker surveillance, the data lifecycle at work, algorithmic decision-making, union responses, and ways forward for the labour movement in the Asia-Pacific region.

A Brave New World of Work 🎧 Podcast: Oct 20, 2020
In this podcast, Dr Christina J. Colclough discusses with Simon Sapper what unions can, should and must do to safeguard their members in the digital world of work, and what's likely to happen if they don't! We tune in on the urgent need for union capacity building and for new negotiation strategies on the governance of AI and worker surveillance.

The Future of Work in Education 🎧 Podcast: Jul 29, 2020
Listen up for why Dr Christina Colclough cautions that the current unfettered digitalisation of education is a fundamental attack on human rights, and what unions across the world should both do and demand to shape a safer, better and more inclusive future of education. Recorded by Martin Henry from Education International, the global union for 32.5 million teachers and other education employees across the world.

Empowering Workers in the Digital Future 🎧 Podcast: May 29, 2020
Companies increasingly use digital systems to hire, fire, and monitor their employees (read the article here). But who is keeping employers in check? Former director at UNI Global Union, and one of the most influential thinkers in the ethics of AI, Dr Christina Colclough joins Azeem Azhar from Exponential View to explore how to ensure that the increasingly digital workplace of the near future protects workers.

Yours, mine and our data 🎧 Podcast: Feb 11, 2020
Recorded at a debate meeting at Kulturhuset in Oslo, Dr Christina Colclough takes the audience on an awareness-raising journey on why we need to push for a new digital ethos that protects human rights, our right to be human, our data rights, democracies and more. This podcast is all about data... (starts in Norwegian but continues in English)

In Data We Trust? 🎧 Podcast: Sep 9, 2019
Element AI's Marc-Etienne Ouimette spoke with some of those leading the charge around taking back control of our data and the notion of data trusts (or, as Dr Christina Colclough also calls them, "workers' data collectives"): Ed Santow, Australia's Human Rights Commissioner, Neil Lawrence, Professor of Machine Learning at the University of Sheffield, and Dr Christina Colclough.

The Robots Are Coming! 🎧 Podcast: Jul 22, 2018
Host Paul Dillon from the Office Block delves into one of the biggest topics facing people working in finance today. What are tech changes doing to our jobs? What will a future finance sector look like, as automation and digitalisation continue apace? What's all this about data and our rights? Featuring Dr Larry Stapleton, Dr Michelle O'Sullivan, Dr Lisa Wilson and Dr Christina Colclough.

Taming The Robots 🎧 Podcast: May 6, 2018
Are the fears for the future justified? How can we use this new technology to our benefit? There is no one better qualified or more articulate on this most pressing of subjects. In a heartfelt defence of the need for "human agency" in the industrial process, Dr Christina Colclough and Jonnie Penn set out not just why this is so important, but how we can make change happen.

Digital Future of Work 🎧 Podcast: Dec 15, 2017
Listen to this podcast where Dr Christina Colclough explains why unions need to fight for workers' data rights and protection in a time where digital change is upon us. We must ask whether data, big data, algorithms and AI are taking the human out of human resource management, and we must know what is needed to keep the balance of power in companies. The podcast is made by Kvinneperspektiv, Nina Hanssen from LO-Norway.

  • Contact | The Why Not Lab

    Let's Connect! Towards a Future of Rewarding Work. We believe in the richness of diversity, equal opportunities and inclusive meetings, panels, speaker line-ups etc. We urge all requestors to diversify their events as much as possible, and will happily recommend excellent folks in our stead.

View All

©2021 The Why Not Lab

Svendborg, Denmark

CVR nr: 42087025