COLOUR – diversity and inclusion workshop
Details
Photography project
COLOUR is a database of detail photographs of human skin. Clothes, jewelry, scars and tattoos play an important role in these images.
The database represents a range of skin colours, ages and cultural backgrounds.
Diversity and inclusion workshop
This database forms the basis of the diversity and inclusion workshop “Algorithmic Thinking with Photography”.
In this workshop, participants label photos from Nieuwenhuize’s photo project in several rounds. Through these successive rounds of labeling, the workshop raises awareness of bias in AI.
Group dialogue
In an open dialogue at the end of the workshop, the group discusses how they were confronted with their own bias. They reflect on the differences between the rounds of labeling, on the role their identities and prejudices played while labeling the photos, and on whether a different perspective emerged from one round to the next. The principles and applications of AI, and the bias within it, also come to the table. Finally, the group discusses how bias in AI impacts diversity and inclusion within organisations.
What do participants get out of it? What is the takeaway?
This workshop is primarily about awareness and offers several takeaways.
Participants of the workshop:
- are confronted with their own prejudices
- experience firsthand how data labeling works
- experience the role their own identity plays while labeling data
- learn about AI technology and “clickwork”
- become aware of the exclusion mechanisms and bias of AI
- gain insight into how AI impacts diversity and inclusion within organisations
- experience AI in a playful, hands-on way
- collaboratively work on a temporary, improvised exhibition
Background
An algorithm is a step-by-step plan: a set of rules applied in a fixed order to reach a solution or achieve an end goal. Advanced self-learning algorithms are part of artificial intelligence (AI).
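As a minimal, purely illustrative sketch in Python (not part of the workshop material), the function below follows a fixed sequence of rules to label an image as “light” or “dark”:

```python
# A minimal illustration of an algorithm: a fixed sequence of rules
# applied to an input to reach an outcome.
def label_brightness(pixel_values):
    """Label an image 'light' or 'dark' based on its pixel values (0-255)."""
    # Rule 1: compute the average brightness of all pixels.
    average = sum(pixel_values) / len(pixel_values)
    # Rule 2: compare that average against a fixed threshold.
    # Rule 3: return the corresponding label.
    return "light" if average >= 128 else "dark"

print(label_brightness([200, 180, 150, 90]))  # prints "light"
```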
AI is a useful tool for both simple and complex problems. For example, AI is used in camera apps on smartphones and in chatbots.
On social media, AI takes your personal preferences into account. It analyses your interactions, such as the posts you like, the accounts you follow and how long you watch each video. The content you are offered is increasingly tailored to your personal preferences. That is convenient and comfortable, but the selection you see also becomes increasingly one-sided. You end up in a so-called filter bubble.
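A deliberately simplified, hypothetical sketch (not any platform’s actual algorithm) of how such preference-based filtering narrows the selection:

```python
# Hypothetical illustration of a filter bubble: content similar to what
# you already engaged with is scored higher, so the selection narrows.
from collections import Counter

def recommend(posts, liked_topics, k=3):
    """Rank posts by how often their topic appears in your past likes."""
    topic_counts = Counter(liked_topics)
    return sorted(posts, key=lambda post: topic_counts[post["topic"]], reverse=True)[:k]

posts = [
    {"title": "Street photography tips", "topic": "photography"},
    {"title": "Budget travel hacks", "topic": "travel"},
    {"title": "Portrait lighting basics", "topic": "photography"},
    {"title": "Local election explainer", "topic": "politics"},
]
liked = ["photography", "photography", "travel"]
for post in recommend(posts, liked):
    print(post["title"])  # photography posts crowd out everything else
```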
Apart from its convenience, AI also has a dark side. Using it can have far-reaching consequences, as the Dutch childcare benefits scandal (de toeslagenaffaire) made painfully clear. The Tax Authorities worked with an automated risk selection system based on AI, which determined which applications required additional checking. The system allegedly used information that is legally irrelevant to the decision, such as gender, religion, ethnicity and address. (1)
Due to this bias, the chance of being singled out by the algorithm was greater for applicants with a second nationality. As a result, many people were wrongly forced to pay back their full benefits.
(1) “The Dutch childcare benefits scandal shows that we need explainable AI rules” (translated from Dutch), www.uva.nl, 13 February 2023, retrieved May 2024. https://www.uva.nl/shared-content/faculteiten/nl/faculteit-der-rechtsgeleerdheid/nieuws/2023/02/de-onthulling-van-het-kinderopvangtoeslagschandaal-kan-betekenen-dat-nederland-vooroploopt.html
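The mechanism behind such bias can be illustrated with a deliberately simplified, hypothetical example in Python (this is not the actual risk model): when a legally irrelevant feature, such as a second nationality, carries weight in a risk score, applicants with that feature are flagged more often even when their applications are otherwise identical.

```python
# Hypothetical illustration only - not the actual risk model.
# If a legally irrelevant feature carries weight in the score,
# applicants with that feature are flagged more often even when
# their applications are otherwise identical.
def risk_score(application):
    score = 0.0
    if application["income_mismatch"]:
        score += 0.5                     # a relevant signal
    if application["second_nationality"]:
        score += 0.4                     # legally irrelevant, biased
    return score

applicant_a = {"income_mismatch": False, "second_nationality": False}
applicant_b = {"income_mismatch": False, "second_nationality": True}

THRESHOLD = 0.3
for name, app in [("A", applicant_a), ("B", applicant_b)]:
    flagged = risk_score(app) >= THRESHOLD
    print(name, "flagged for extra checking:", flagged)  # only B is flagged
```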
“Johan’s workshop made me realize that I am unconsciously biased when categorizing, and that how I think about something is by definition coloured.
Very interesting!
The workshop actively encourages you to reflect on the power and possibilities of AI.
A must for anyone who is curious about something that already has a large impact on our lives.”
– workshop participant, June 15th, 2024
“The workshop made clear to me the arbitrariness and in particular the non-arbitrariness behind the omnipresent algorithms and AI that we encounter in our daily lives.”
– workshop participant, June 15th, 2024
The workshop includes:
- sensitiser
- three rounds of labeling
- group dialogue
The duration of the workshop is 1-2 hours, depending on the group and the nature of the dialogue afterwards.
Customized workshops are available on request.
Participants also receive an afterburner email with links to articles about bias in AI and related topics for further reading.
Investment
The investment for the diversity and inclusion workshop “Algorithmic Thinking with Photography” is €95 per person, with a minimum of eight participants.
This includes a preparatory meeting, materials and the afterburner email. The price excludes VAT and travel costs.
Contact Johan for a quotation and availability at johan@johannieuwenhuize.nl.