
Brandwatch Bulletin 152: The Fight for Inclusive Tech

Not the usual data story but a data scientist story.

17 February 2023

From safety equipment and crash-test dummies modeled after the “average” man to algorithms that unfairly punish marginalized groups, the way the world is designed puts so many people at risk and disadvantage.

When models are built around one group of people, the experiences of other groups can be severely impacted. The trouble is, these models are shaping nearly all facets of our lives – think cybersecurity, AI, data privacy – and are being designed with the “average”, often white, man in mind.

Today, we dedicate this bulletin to three incredible data scientists committed to algorithmic fairness and ethics in the tech world.

Let’s get to it.

Subscribe to the Brandwatch Bulletin

Timnit Gebru

AI systems, like language models, are trained on historical data that often contains biases. The model then reproduces those biases as it learns, which can create huge problems.
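As a toy illustration of that point (the scenario and all numbers here are hypothetical, not from any real system), a model that simply learns hiring rates from skewed historical records will faithfully reproduce the skew:

```python
from collections import defaultdict

# Hypothetical historical hiring records: (group, hired).
# Candidates are equally qualified, but group B was hired less often.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

# A naive "model" that learns per-group hire rates from the data.
counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
for group, hired in history:
    counts[group][0] += int(hired)
    counts[group][1] += 1

rates = {g: hired / total for g, (hired, total) in counts.items()}
print(rates)  # group B inherits a lower predicted hire rate
```

Nothing in the code is "prejudiced"; the disparity comes entirely from the data it was given, which is exactly the problem these researchers highlight.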

One person who’s done great work here is Timnit Gebru, an AI researcher specializing in algorithmic bias and data mining. She is the founder of the Distributed Artificial Intelligence Research Institute (DAIR), a community-based space for AI research.

She advocates for greater diversity in technology, and has shared how the underrepresentation of Black people in the industry leads to discriminatory technology with gender and racial biases. In this TED Talk, she offers insight into how AI technology can discriminate against communities and what it will take for Black people to be better represented.

Dr. Joy Buolamwini

It started as an art project and a dream of becoming Serena Williams.

As a grad student at MIT, Joy Buolamwini wanted to make a mirror that would project a digital image of one of her heroes onto her face. But the facial detection software didn’t work when Joy – who is Black – looked in the mirror…not unless she wore a white mask.

That experience sparked research that is now challenging the tech world. Joy discovered the bias baked into artificial intelligence systems: because they are trained on datasets of mostly white men’s faces, they can recognize Caucasian males easily, but not faces like hers.
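That kind of gap is easy to see once you measure accuracy per demographic group rather than overall. A minimal sketch of such an audit, using made-up outcomes (the group labels echo the framing of this research, but the numbers are purely illustrative):

```python
# Hypothetical face-recognition audit results: (group, per-face outcomes),
# where 1 = correctly recognized, 0 = missed. Numbers are illustrative only.
results = [
    ("lighter-skinned men", [1] * 19 + [0] * 1),    # 19 of 20 correct
    ("darker-skinned women", [1] * 13 + [0] * 7),   # 13 of 20 correct
]

for group, outcomes in results:
    accuracy = sum(outcomes) / len(outcomes)
    print(f"{group}: {accuracy:.0%} accuracy")
```

A single headline accuracy figure would average these groups together and hide the disparity, which is why per-group evaluation matters.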

Her research and discoveries led to a greater awareness of bias in AI technology and a demand to address it. In this short TED Radio Hour Comic, you can find out more about her fight against algorithmic bias.

Realizing that the problem went beyond art projects, Joy founded the organization Algorithmic Justice League, which is dedicated to educating people about AI bias and how to reduce it. In the documentary Coded Bias, she exposes the biases in AI technology, particularly facial recognition technology.

Jordan Harrod

“I’m Jordan, and I like to look at all the different ways that we interact with artificial intelligence on a daily basis.” This is the message you’re met with visiting Jordan Harrod’s YouTube channel.

Jordan Harrod is a Ph.D. candidate in medical engineering and medical physics in the Harvard-MIT Health Sciences and Technology Program, where she studies machine learning and neuro-engineering.

She uses her YouTube channel to talk about emerging technologies and, among other things, issues of fairness, bias, and racism in artificial intelligence and algorithms more generally.

In this video, she discusses the ethics of AI art using Lensa’s viral AI as an example.

While the idea is that you’d upload your headshot and get an illustrative rendering in return, Jordan points out that for people of color the avatars often just look like “generic light-skinned black people,” and that the app made her “paler than she is.”

In short, as Jordan sums up, if you were planning on spending money on those AI avatar images…don’t. Spend your money on something that’s going to make you happier.

This is just one of the cans of worms Jordan talks about on her YouTube channel. She also likes to playfully point out the limitations of current AI technology. In one funny video, she tries to make crafts using instructions for 5-Minute Crafts generated by ChatGPT.

Recommendations to nourish your mind

Throughout this bulletin, we’ve been linking to articles, YouTube videos, books, comics, and research papers. We recommend adding them all to your reading and watching list.

What should we cover next?

Is there a topic, trend, or industry you’d like us to feature in the Brandwatch Bulletin? We want to hear your ideas to make sure our readers are getting what they want. We may even ask to interview you if you’re involved with the topic.

Send any and all ideas to [email protected] and let’s talk.

Thanks for reading

If you were forwarded today’s bulletin and want to receive future editions yourself, you can subscribe to the Brandwatch Bulletin here.

See you next time,

The Brandwatch Bulletin team


Runtime Collective Limited (trading as Brandwatch). English company number 3898053
New York | Boston | Chicago | Austin | Toronto | Brighton | London | Copenhagen | Berlin | Stuttgart | Frankfurt | Paris | Madrid | Budapest | Sofia | Chennai | Singapore | Sydney | Melbourne

Privacy Policy

Update subscription preferences

Unsubscribe
