Contents
editorial
KOFI AGAWU
African Art Music and the Challenge of Postcolonial Composition
PAUL ZILUNGISELE TEMBE
China’s Effective Anti-Corruption Campaign
DILIP M. MENON
Changing Theory: Thinking Concepts from the Global South
BEN WATSON
Talking about music
Theme AI in Africa
blk banaana
An (Other) Intelligence
VULANE MTHEMBU
Umshini Uyakhuluma (The Machine Speaks) – Africa and the AI Revolution: Exploring the Rapid Development of Artificial Intelligence on the Continent.
OLORI LOLADE SIYONBOLA
A Brief History of Artificial Intelligence in Africa
CHRIS EMEZUE & IYANUOLUWA SHODE
AI and African Languages: Empowering Cultures and Communities
NOLAN OSWALD DENNIS
Toward Misrecognition. | Project notes for a haunting-ting
SLINDILE MTHEMBU
AI and documenting black women's lived experiences: Creating future awareness through AI-generated sonics and interpretive movement for the future of freeing suffering caused on black bodies.
ALEXANDRA STANG
Artificially Correct? How to combat bias and inequality in language use with AI
BAKARY DIARRASSOUBA
Bambara: The Jeli (Griot) Project
ROY BLUMENTHAL
Artificial Intelligence and the Arcane Art of the Prompt
AI GENERATED
"AI on Artificial Intelligence in Africa" and "Exploring its impact on Art and Creativity"
JULIA SCHNEIDER
AI in a biased world
MBANGISO MABASO
Bana Ba Dinaledi: Telling African Stories using Generative AI Art.
ALEX TSADO & BETTY WAIREGI
African AI today
BOBBY SHABANGU
Using Artificial Intelligence to expand coverage of African content on Wikipedia
DARRYL ACCONE
Welcome to The End of Beauty: AI Rips the Soul Out of Chess
VULANE MTHEMBU & ChatGPT
Hello ChatGPT - A conversation with OpenAI's Assistant
DIMITRI VOUDOURIS
Evolution of Sιήκ
STEFANIE KASTNER
Beyond the fact that most robots are white: Challenges of AI in Africa
MARTIJN PANTLIN
Some notes from herri’s full stack web developer on the AI phenomenon
galleri
THANDIWE MURIU
4 Universal Truths and selected Camo
ZENZI MDA
Four Portals
TIISETSO CLIFFORD MPHUTHI
Litema
NESA FRÖHLICH
Agapanthus artificialis: Biodiversität im digitalen Raum. Vierteilige Serie, Johannesburg 2022.
STEVEN J FOWLER
2 AI collaborations and 9 asemic scribbles
PATRICIA ANN REPAR
Integrating Healing Arts and Health Care
SHERRY MILNER
Fetus & Host
borborygmus
JANNIKE BERGH
BCUC = BANTU CONTINUA UHURU CONSCIOUSNESS
GWEN ANSELL
Jill Richards: Try, try, try...
VULANE MTHEMBU & HEIKKI SOINI
Nguni Machina remixed
AFRICAN NOISE FOUNDATION
Perennial fashion – noise (After Adorno).
RAJAT NEOGY
Do Magazines Culture?
NDUMISO MDAYI
Biko and the Hegelian dialectic
LEHLOHONOLO MAKHELE
The Big Other
frictions
KHAHLISO MATELA
At Virtue’s Zone
DIANA FERRUS
In memory of “Lily” who will never be nameless again
VUYOKAZI NGEMNTU
Six Poems from the Shadows
SIHLE NTULI
3 Durban Poems
SIBONELO SOLWAZI KA NDLOVU
I’m Writing You A Letter You Will Never read
OMOSEYE BOLAJI
People of the Townships episode 3
claque
SIMON GIKANDI
Introducing Pelong Ya Ka (excerpt)
UNATHI SLASHA
"TO WALK IS TO SEE": Looking Inside the Heart - Sophonia Machabe Mofokeng’s Pelong ya Ka
VANGILE GANTSHO
Ilifa lothando – a Review of Ilifa by Athambile Masola
ZIZIPHO BAM
Barbara Boswell found in The Art of Waiting for Tales
WAMUWI MBAO
Hauntings: the public appearance of what is hidden
CHARL-PIERRE NAUDÉ
Dekonstruksie as gebundelde terrorisme
VUYOKAZI NGEMNTU
Ibuzwa Kwabaphambili - A Review
MPHUTLANE WA BOFELO
Taking radical optimism beyond hope - Amakomiti: Grassroots Democracy in South Africa’s Shack Settlements
PATRIC TARIQ MELLET
WHITE MISCHIEF – Our past (again) filtered through the lens of coloniality: Andrew Smith’s First People – The lost history of the Khoisan
CHANTAL WILLIE-PETERSEN
BHEKI MSELEKU: an infinite source of knowledge to draw from
JEAN MEIRING
SULKE VRIENDE IS SKAARS - a clarion call for the importance of the old and out-of-fashion
GEORGE KING
Kristian Blak String Quartets Neoquartet
ekaya
PAKAMA NCUME
A Conversation with Mantombi Matotiyana 9 April 2019
KYLE SHEPHERD
An Auto-Ethnographic Reflection on Process
PAULA FOURIE
Ghoema
DENIS-CONSTANT MARTIN
The Art of Cape Town Singing: Anwar Gambeno (1949-2022)
ESTHER MARIE PAUW
Something in Return, Act II: The Blavet-Varèse project
STEPHANUS MULLER
Afrikosmos: the keyboard as a Turing machine
MKHULU MNGOMEZULU
Ubizo and Mental Illness: A Personal Reflection
off the record
FRANK MEINTJIES
James Matthews: dissident writer
SABATA-MPHO MOKAE
Platfontein, a place the !Xun and Khwe call home
NEO LEKGOTLA LAGA RAMOUPI
A Culture of Black Consciousness on Robben Island, 1970 - 1980
NELSON MALDONADO-TORRES
Outline of Ten Theses on Coloniality and Decoloniality*
ARYAN KAGANOF
An interview with Don Laka: Monday 10 February 2003
JONATHAN EATO
Recording and Listening to Jazz and Improvised Music in South Africa
MARKO PHIRI
Bulawayo’s movement of Jah People
STEVEN BROWN
Anger and me
feedback
MUSA NGQUNGWANA
15 May 2020
ARYAN KAGANOF / PONE MASHIANGWAKO
Tuesday 21 July 2020, Monday 27 July, 2020
MARIA HELLSTRÖM REIMER
Monday 26 July 2021
SHANNON LANDERS
22 December 2022
FACEBOOK FEEDBACK
Facebook
the selektah
CHRIS ALBERTYN
Lost, unknown and forgotten: 24 classic South African 78rpm discs from 1951-1965.
hotlynx
shopping
contributors
the back page
CHRIS BRINK
Reflections on Transformation at Stellenbosch University
MARK WIGLEY
Discursive versus Immersive: The Museum is the Massage
© 2024
Archive About Contact Africa Open Institute

#08
Theme AI in Africa

ALEXANDRA STANG

Artificially Correct? How to combat bias and inequality in language use with AI

Language defines the world and how we see it. It is never neutral and can be a powerful tool for generating inequality. With the project Artificially Correct, the Goethe-Institut has, since the end of 2020, been working to make language use more inclusive by both questioning and using AI. By developing an AI-based tool together with experts, and by building a network of translators, activists, developers and scientists, the project aims to minimize biases in texts and to strengthen a conscious approach to language.

As an organization present in 98 countries, the Goethe-Institut deals with language and translation as a crucial part of everyday working life. Not only are its websites at least bilingual, but communication with partners, artists, students and visitors also happens in many different languages, so translation and cooperation with translators and interpreters are often essential. While working on topics like identity and anti-racism, we realized that we need to educate both ourselves and our audiences about how we use language and how we can avoid biases in both our mother tongues and foreign languages. Building on these questions: how can we identify, tackle and minimize biases in texts, translations and translation tools?

First, we saw the need to raise awareness of biases in language. In cooperation with the Berlin-based macht.sprache, a project that fosters politically sensitive translation between English and German, we published articles about terms related to race, identity, gender and sexuality that require sensitivity in translation. We also organized events that gave translators the opportunity to work together on their use of language, since translators, too, are not and cannot always be aware of their own biases. They appreciate the help of colleagues and experts in finding or creating suitable terms in their own languages.

Widely used translation tools are of no help here either: when language is biased, so are translation tools, since they are trained on data sets that contain human bias.

Artificially Correct therefore focused on AI-based translation tools and the biases their translations reproduce. But instead of condemning translation tools based on systems of artificial intelligence, the project aims to investigate the potential of AI to minimize bias. To this end, we initiated an online hackathon in 2021 to find solutions that help make AI-based language tools more inclusive. The results of the hackathon, with 12 teams contributing different kinds of expertise from all over the world, were stunning, and it was amazing to see how many people are willing to spend precious time tackling issues of inequality. The two winning projects selected by a jury have since developed their ideas and solutions:

The team behind DeBiasByUs aims to raise awareness of gender bias in machine translation and to build a database of it through a public platform that provides information on the topic, on the impact of biases on society, and on the latest research. On their website, users can submit biased sentences and either receive new solutions or add their own unbiased versions. Beyond the platform, a plug-in linked to the website may be developed in the future.
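A platform like this essentially collects pairs of biased translations and crowd-sourced corrections. As a rough sketch only, with entirely hypothetical field names (the article does not describe DeBiasByUs's actual data model), such a record might look like:

```python
# Hypothetical sketch of a DeBiasByUs-style record: a machine translation
# flagged as biased, paired with user-contributed unbiased alternatives.
records = []

def add_record(source, biased_translation):
    """Register a sentence whose machine translation shows bias."""
    record = {
        "source": source,
        "biased_translation": biased_translation,
        "unbiased_alternatives": [],
    }
    records.append(record)
    return record

def suggest_alternative(record, alternative):
    """Add a user-proposed unbiased version, skipping duplicates."""
    if alternative not in record["unbiased_alternatives"]:
        record["unbiased_alternatives"].append(alternative)

# Example: a feminine-gendered English sentence rendered with a
# masculine default in German.
r = add_record("The doctor finished her shift.",
               "Der Arzt beendete seine Schicht.")
suggest_alternative(r, "Die Ärztin beendete ihre Schicht.")
```

The point of the sketch is only the shape of the data: one biased translation can accumulate many community-supplied alternatives, which is what makes such a database useful for research and for retraining translation tools.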

The second winning team uses a Word2Vec-based approach to fight bias, sexism and racism. They have trained a machine learning model that scans texts and highlights words that might be biased. Based on this model, they have created a web portal where users can enter English texts and see biased words highlighted. The flagged words form the basis of a data set for minimizing bias in translation tools.
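The underlying idea can be illustrated in miniature. In a Word2Vec model, every word is a vector, and words used in similar contexts sit close together; a word can then be flagged when it lies near known biased seed terms. The sketch below, a loose illustration of that principle and not the team's actual model, uses a toy hand-made embedding table and plain cosine similarity:

```python
# Minimal sketch: flag words whose vectors sit close (cosine similarity)
# to known biased seed terms. The tiny embedding table stands in for a
# trained Word2Vec model; all values here are invented for illustration.
import math

EMBEDDINGS = {
    "bossy":     [0.9, 0.1, 0.0],  # toy "biased" region of the space
    "assertive": [0.8, 0.2, 0.1],
    "table":     [0.0, 0.1, 0.9],  # toy "neutral" word, far away
}

SEED_BIAS_TERMS = ["bossy"]  # illustrative seed list
THRESHOLD = 0.8              # illustrative similarity cut-off

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def flag_biased_words(text):
    """Return the words in `text` that sit near a biased seed term."""
    flagged = []
    for word in text.lower().split():
        vec = EMBEDDINGS.get(word)
        if vec is None:
            continue  # out-of-vocabulary words are skipped
        if any(cosine(vec, EMBEDDINGS[s]) >= THRESHOLD
               for s in SEED_BIAS_TERMS):
            flagged.append(word)
    return flagged

print(flag_biased_words("She is bossy and assertive at the table"))
# → ['bossy', 'assertive']
```

Note that "assertive" is flagged purely because its toy vector resembles "bossy"'s: proximity in embedding space, not a hand-written word list, is what drives the highlighting, which is what distinguishes this approach from simple keyword filtering.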

Both projects will be developed further, and the results will be updated and linked on the project website.

AI can be a useful tool for inclusion if it is used with the right intention, as many of the hackathon teams and their ideas have shown. Yet, as one of the project leaders, I have also realized that the deeper you dive into the topic, the more clearly you see the variety of challenges still ahead. The journey toward any kind of "unbiased" machine learning is long, and many issues need to be tackled in the near future. Questions such as which data is used for machine learning, and who provides that data, are increasingly coming to the fore in international discussions and projects. Most of the data used for machine learning comes from the global North and therefore represents only a small part of the global reality. And while most of the people developing AI are still white men, the sourcing of raw materials, for example, happens in poorer countries and under bad working conditions: diversity and equality in AI are a big issue that everyone working in the field needs to be made aware of. Projects like Artificially Correct can hopefully contribute to solving this issue, since equality also starts with a conscious and considerate use of language and of the language tools that are available.

If you want to dive deeper into the topic of bias, language and artificial intelligence, you can find Artificially Correct's online dossier, which compiles different perspectives on how artificial intelligence affects language and text production, and on why discriminatory language must already be addressed in schools.

All images on this page are by EL BOUM / Goethe-Institut (Copyright: Goethe-Institut).
