Exposing Algorithmic Bias – A ‘Concept’ Game Jam

On 29 November 2023, the Lab hosted its inaugural ‘concept’ game jam, co-organised with the Centre for Creative Technologies and sponsored by MyWorld. The theme? Exposing Algorithmic Bias. The timescale? Four hours.

The idea for this game jam came from Professor Ed King’s project Challenging Algorithmic Racism through Digital Cultures in Brazil. In Ed’s words, “The project interrogates how cultural practices, including video game design, can be used to challenge the ways in which, despite their sheen of neutrality, new technologies often reproduce existing social biases and power hierarchies.”

From education and health to financial services and facial recognition, algorithms have become key components in scaling decision-making. The danger, of course, is that they can embed and amplify existing biases, or even generate new types of bias within complex systems. This danger is only compounded by the application of machine learning and AI.

The aim of this condensed game jam was to think about how the mechanisms of gaming and play can expose these processes. Teams were not expected to create a fully fledged game within the time limit. Rather, the event was about exploring the potential of game design. We had over 60 people sign up to take part, and on the day eight teams worked on six games.

Below, you can enjoy a selection of the games and concepts that the teams worked on, as well as a video interview about the jam. For more about the inspiration behind the theme, check out Professor Ed King’s Intro to the jam or our Tackling algorithmic biases through gaming article.

Investigating Bias in LLM-Based Translation

Our idea was to investigate how large language models (LLMs) may introduce bias as they translate from Chinese to English. The game allows you to translate a story, one page at a time, from Chinese to English using an LLM. You then have to answer questions about the story, considering the bias that the LLM has introduced as you do so. You can play the game on Itch.io.
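
The post leaves the technical details to the team, but the core loop is simple to picture. Below is a minimal sketch of page-by-page LLM translation, assuming the OpenAI Python client as a stand-in; the model name, prompt and story pages are all hypothetical, not the team’s actual choices.

```python
# A minimal sketch of the game's page-by-page translation loop, assuming
# the OpenAI Python client as a stand-in LLM. The team's actual model,
# prompts and story text are not specified in the post.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def translate_page(chinese_text: str) -> str:
    """Ask the LLM to translate one page of the story into English."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice of model
        messages=[
            {"role": "system",
             "content": "Translate the user's text from Chinese into English."},
            {"role": "user", "content": chinese_text},
        ],
    )
    return response.choices[0].message.content


# Hypothetical pages; the real game ships with its own story.
pages = ["第一页的故事文本……", "第二页的故事文本……"]
for number, page in enumerate(pages, start=1):
    print(f"--- Page {number} ---")
    print(translate_page(page))
    # The player then answers questions about the story, watching for
    # gendered pronouns, stereotypes or other bias the model introduced.
```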

Lost in Translation

Our concept explores algorithmic bias in system design and user interfaces. By putting people into a scenario where they all experience algorithmic bias, we try to show those who don’t encounter it in their day-to-day lives how it works and how it feels, thereby improving empathetic understanding. We have also tried to expose the bias blind spot – the tendency to see oneself as less biased than others – so that we can demonstrate how algorithmic bias functions without trivialising the issue by emulating it directly. By forcing players into a position where their assumptions are challenged by the fact that everyone can experience algorithmic bias, we highlight the work that needs to be done to avoid it.

[Image: Lost in Translation design brief]

[Image: AI concept art]

Hire Intelligence

Our game is inspired by Amazon’s failed experiment in using algorithms for recruitment, reported in 2018, which was scrapped after the algorithm consistently prioritized male applicants and deprioritized terms like ‘women’s chess club’. In this game, you play as a job applicant for a randomly generated position – your aim is to ‘game the system’ by convincing the algorithm to give you the job.

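For readers curious about the underlying failure mode, here is a toy Python sketch – using deliberately skewed synthetic data, not anything from the game or from Amazon – of how a screening model trained on historically male-skewed hiring decisions can learn to penalise the word ‘women’s’, even though gender is never an explicit feature.

```python
# Toy sketch (synthetic data) of the failure mode behind the Amazon case:
# a screening model trained on historically male-skewed hiring outcomes
# learns a negative weight for the token "women's".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical past CVs and decisions (1 = hired, 0 = rejected); the
# skew is built in to mirror a biased historical record.
cvs = [
    "captain of chess club, python developer",
    "led robotics society, java engineer",
    "captain of women's chess club, python developer",
    "women's coding society lead, java engineer",
    "chess club member, data analyst",
    "women's robotics society member, data analyst",
]
hired = [1, 1, 0, 0, 1, 0]

vectorizer = CountVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(cvs), hired)

# The learned weight for "women" (the default tokeniser drops the "'s")
# comes out negative, so the word itself costs applicants points.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(weights["women"])

# Two otherwise-identical applicants: the one mentioning "women's"
# gets a lower predicted hiring probability.
test = vectorizer.transform([
    "captain of chess club, python developer",
    "captain of women's chess club, python developer",
])
print(model.predict_proba(test)[:, 1])
```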

Algorithmic Bias in Streaming Preferences

Built in Excel, this game is designed to explore algorithmic bias in the selection of streaming content.

[Image: Streaming bias]
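
The post doesn’t describe the game’s mechanics, so the following is only an assumption: one classic bias in streaming recommendation, and an easy one to reproduce in a spreadsheet, is the popularity feedback loop, where a ‘most watched’ shelf makes the current leader ever more watched. A minimal Python sketch of that loop:

```python
# One mechanism a game like this could model (an assumption - the post
# doesn't describe the mechanics): a popularity feedback loop, where a
# "most watched" shelf makes the current leader ever more watched.
watch_counts = {"blockbuster": 10, "indie film": 9, "documentary": 8}

for _ in range(100):
    # The shelf recommends only the single most-watched title...
    top = max(watch_counts, key=watch_counts.get)
    # ...and viewers mostly watch what is put in front of them.
    watch_counts[top] += 1

print(watch_counts)
# {'blockbuster': 110, 'indie film': 9, 'documentary': 8}: a one-view
# head start locks in the winner and buries the alternatives.
```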

HDWGH?

“HDWGH?” is a text-based adventure game with an ASCII aesthetic. -USER-, a [currently] unidentified consciousness in our post-apocalyptic future, is trying to figure out: HOW DID WE GET HERE?

The game is set in a post-apocalyptic world in which, after a mysterious global disaster around 2110, the majority of humanity has uploaded its consciousness into a huge central dataset located at the North Pole. There are no longer any living witnesses or material evidence to prove what really happened in the 21st century. The uploaded intelligences lived for a short time in the digital paradise they had been promised, before their memories became fragmented, corrupted and erased by mysterious forces and eventually disappeared, leaving only clusters of ‘memory balls’ stored in ignored corners of the dataset.

You, -USER-, are the only remaining coherent consciousness in the dataset. One day you are suddenly woken from slumber and tasked with a mission: collect the fragmented ‘memory balls’ scattered through the system, relive the memories stored in them, solve the embedded mystery with the clues you find, and discover what led humanity to its current pitiable state. In the process, you will also find out who you really are, and eventually decide what your mission truly is – one that may offer a chance of reversing what has happened to humanity.

You travel, via the ‘memory balls’, to the mid-21st-century conurbation of McTownship, exploring its institutions, looking through the eyes of its citizens and feeling their frustrations as the algorithms wreak increasing havoc in their lives. Can you identify the biases? Can you help the citizens? Can you find a way to turn all of this around?

Categorize This!

In this game, you play as a character whose job is to sort unidentifiable objects. The more you sort, the more it transpires that you are affecting the world around you. You realize that, by categorizing data, you have introduced a whole range of biases.

AI concept art
