Dr. Joy Buolamwini is a computer scientist and a poet of code who uses art and research to illuminate the social implications of artificial intelligence. She joins to discuss her career as the founder of the Algorithmic Justice League, her best-selling book ‘Unmasking AI: My Mission to Protect What is Human in a World of Machines,’ and her featured role in the acclaimed Netflix documentary ‘Coded Bias.’
Dr. Joy Buolamwini:
It wasn’t as easy as saying, “Okay. Let’s make more inclusive data sets, and when we have more inclusive data sets, we’ll have more accurate facial recognition.” But accurate systems can be abused, and so the analysis had to be not just how well does the technology work, but what kind of technologies do we want in society in the first place?
Curtis Fox:
From the TED Audio Collective, this is Design Matters with Debbie Millman. On Design Matters, Debbie talks with some of the most creative people in the world about what they do, how they got to be who they are, and what they’re thinking about and working on. On this episode, computer scientist and digital activist Joy Buolamwini talks about her career and about facial recognition technology in airports.
Dr. Joy Buolamwini:
You actually have the right to opt out, but most people don’t know and it’s not surprising because you go there and they say, “Step up to the camera.”
Debbie Millman:
Dr. Joy Buolamwini is a computer scientist and a poet of code who uses art and research to illuminate the social implications of artificial intelligence. She founded the Algorithmic Justice League to create a world with more equitable and accountable technology. Her MIT thesis methodology uncovered large racial and gender bias in AI services from some of the world’s largest and most powerful technology companies. Dr. Joy’s journey is depicted in the critically acclaimed documentary Coded Bias, which is now on a five-year anniversary world tour, and the documentary sheds light on threats artificial intelligence poses to civil rights and democracy. She’s also the author of the bestselling book, Unmasking AI: My Mission to Protect What Is Human in a World of Machines. Dr. Joy Buolamwini, welcome to this very special live episode of Design Matters at the WBUR Festival.
Dr. Joy Buolamwini:
Sometimes dreams come true and I’m living one right now, so I’m so happy to be here.
Debbie Millman:
Thank you. Thank you. Let’s begin with Baby Joy.
Dr. Joy Buolamwini:
Baby Joy. Okay.
Debbie Millman:
You were born in Edmonton, Alberta, just as your father was finishing his PhD, and you’ve described yourself as a daughter of art and science. What do you remember most from that unique collision of your mother’s paints and your father’s pipettes?
Dr. Joy Buolamwini:
That’s a great question. In the book, I say how my mother asked questions of colors, my dad asked questions of cells, and in that exploration I started asking questions of computers. So for me, I literally grew up with art and science as companions through my parents. In Oxford, Mississippi, that meant going to my dad’s lab and feeding cancer cells, looking at squiggles on computers, learning later these were graphs and so forth, flow cytometry and that kind of thing. Then my mom, I just thought every weekend you go to art galleries and pitch paintings. I didn’t realize that’s what really was going on. So I grew up with those worlds and it felt very much like an invitation to be creative, whether through scientific inquiry or artistic inquiry or for me, playtime, right? So I think that was a true gift that I see now in the way that I do my work as a poet of code.
Debbie Millman:
You spent your early years in Ghana before moving to Oxford, Mississippi. What were the values that shaped you most in those formative years of in-betweenness?
Dr. Joy Buolamwini:
I think the first thing is my first language being Twi. I have all kinds of accents I can switch between, but when I first came to the United States and I was in Oxford, Mississippi, they actually put me in speech therapy.
Debbie Millman:
Why?
Dr. Joy Buolamwini:
Because they didn’t realize English was my second… They should have asked more questions, I think, and maybe I really did need to be in there, but we can deconstruct that later. I returned to Ghana last year after 30 years. I don’t look that old, but it was three decades since I’d been there, and it was the best homecoming I could have imagined.
Debbie Millman:
In what way?
Dr. Joy Buolamwini:
One of the things that I noticed is all of my relatives were looking at me and I was looking at them, but none of us wanted to be rude or stare, so we would look at each other through reflections and just catch an eye. So it felt like I was time-traveling when I would see my aunts and uncles. I was like, “Oh, that’s what my brother’s going to look like in a few years,” or, “Oh, that thing I thought was just my dad, I see it in all of his siblings.” So that kind of way. Then also connecting with my young cousins. I think they range from about age eight to 26. So I’m on the older end of the cousins within a more immediate family. So it felt like a very warm embrace, long-awaited embrace.
Debbie Millman:
Given you were so young when you left Ghana to go to Mississippi, did your family share stories about Ghanaian excellence or legacy that helped shape your own sense of possibility?
Dr. Joy Buolamwini:
Yes, I’m third generation PhD. My grandfather was a dean of the school of pharmacy at Kwame Nkrumah University of Science and Technology. We have some Ghanaians in the audience here as well. They went to good schools too, even if it wasn’t science and tech, which is all solid. So growing up it wasn’t like, “Oh, be excellent or that,” it was just around us as this is our family legacy. There’s this value of education. There’s this curiosity. I got a library card very early, and we used to live within walking distance from the Oxford Public Library. So my brother would roll me there in a wagon, and he introduced me to The Boxcar Children and the Hardy Boys and that kind of thing.
Debbie Millman:
I have this vision of him pulling you back in the wagon with stacks of books around you.
Dr. Joy Buolamwini:
Well, it was so fun because right now, with the documentary Coded Bias, we’re on a world tour, and we just did a screening in Oxford, Mississippi at that same public library. On that stage, I used to watch magic shows and puppet shows, and now they were showing Coded Bias. My nieces were there. My brother was there. One of my childhood neighborhood friends was there as well, so kids I grew up playing with and eating honeysuckles and that kind of deal. So it was really a nice full circle moment. Then a friend I had in Oxford, England, where I later studied, she had now become a professor at Ole Miss, and she was there with her baby and her dog and her husband. So it was this kind of culmination of many strands, different elements of my life, there in Oxford, Mississippi. So a Southern twang had come out just a little bit to remind me where I came from, but it was all good. It was a good time.
Debbie Millman:
There’s a lot of kismet in your life, and we’ll get to that. As a teenager in Memphis, you were building websites to cover your basketball team’s uniform costs, writing Java games, pole vaulting and skateboarding. What did technology represent to you then as you were forging a path as an athlete?
Dr. Joy Buolamwini:
Yeah. Well, it was a means to cover my dues. So instead of paying the sporting dues, it was, “Okay. Let me make a social network for the track team. Let me make a website for the basketball team,” because I’m going to be warming the bench, right? I might as well contribute in some other kind of way. My first website was really for my Latin club.
Debbie Millman:
Of course, it was.
Dr. Joy Buolamwini:
It was part of the National Junior Classic League.
Debbie Millman:
As one does.
Dr. Joy Buolamwini:
I spent so much time on that website. It had animations, and I was coding in Flash. I used to always say it was top 10 in the country, so it probably meant it was number nine, but whatever. So that was a little bit of encouragement. Then, I was so fortunate, now that I look back at it. At the time, when I was in Memphis, Tennessee, I had the opportunity to take three different computer science classes as a high schooler, and part of the reason was because of my teacher, Ms. Jill Connell. She had this one-room kind of schoolhouse where she taught a different class along each of three walls, with her desk on the main wall. She would teach the first version of computer science, the first class. You’d do the second one, and then you’d have the AP class. So I circled those walls for three years, and that really gave me a great basis.
I was looking back at some of the data, and I think at the time I took the advanced placement computer science test, only five kids in the state took it. So that I was in that specific school with a young teacher eager to make it work, even if it meant having to come up with three different lessons in that same period, was really, really fortunate. Nerd that I am, I would spend my lunch period there coding Lego robots, and later, when I came to MIT, I would work in the Lifelong Kindergarten group that helped create and establish the Lego Mindstorms as well. So I always shout out to Jill Connell and follow her Facebook posts and her kids now and all of that.
Debbie Millman:
Well, you just said that you were a tech nerd or a computer nerd.
Dr. Joy Buolamwini:
Definitely.
Debbie Millman:
But you were also a coder, a creative and an athlete.
Dr. Joy Buolamwini:
Oh, yeah.
Debbie Millman:
How did your friend group view you?
Dr. Joy Buolamwini:
I was always in between spaces. So for my parents, as long as I got straight As, I could do what I wanted, so I was like, “Cool. I’ll get the grades and let me try basketball. Let me try track and field.” I didn’t actually want to do cross-country. I was recruited to cross-country, and then I remember running my first mile, throwing up, and the coach said, “I think you got this.” “Do I though?” So that’s how I ended up doing three different sports in high school, but I really fell in love with pole vaulting-
Debbie Millman:
I know.
Dr. Joy Buolamwini:
… most of all of those things.
Debbie Millman:
Didn’t you have aspirations to go to the Olympics as a goal?
Dr. Joy Buolamwini:
I mean, if you don’t do it, you got to do it, right?
Debbie Millman:
Aim high, no pun intended.
Dr. Joy Buolamwini:
I couldn’t have vaulted higher. You don’t start doing a full pole vault, you start in the grass, then the sand, then you move to the pit. So before I’d even cleared anything in the grass, I was already saying Olympics, right? So that just tends to be my mentality to aim high.
Debbie Millman:
You attended the Georgia Institute of Technology for your undergrad degree?
Dr. Joy Buolamwini:
Yellow Jackets.
Debbie Millman:
But you first considered majoring in international affairs and biomedical engineering.
Dr. Joy Buolamwini:
Yes.
Debbie Millman:
You eventually transitioned to computer science. What did you envision doing professionally at that time?
Dr. Joy Buolamwini:
Oh, I mean, there were so many things I wanted to do. My first career aspiration was to be a professional skateboarder.
Debbie Millman:
Did that come before or after pole vaulting?
Dr. Joy Buolamwini:
Before pole vaulting. So I started skateboarding when I was around 11 years old. I watched A Goofy Movie, and they made skateboarding look so cool. I got the cheapest skateboard you could ever get from Walmart. It had this purple cobra on the back. Later I got a much better skateboard, but that’s how I started. So when we moved from Oxford, Mississippi to Memphis, Tennessee, that summer I started learning how to skateboard, which meant scraping my ankles a lot, mainly.
Debbie Millman:
So you wanted to be a professional skateboarder, but you were majoring in computer science.
Dr. Joy Buolamwini:
This was when I was 12. Then the reason I decided not to pursue that career path is I was introduced to the gender pay gap. I was looking at skateboarding competitions, and I was looking at what the men got paid and what the women got paid. The women’s first-place prize was like what 13th place got on the men’s side. I was like, “You know what, I might need to try something else.” So that’s when I got off the skateboard path for a little bit. I returned to it later in grad school.
Debbie Millman:
And do you still?
Dr. Joy Buolamwini:
Every so often. These aren’t the right shoes, but if you have the right shoes, I’ll try to pull out a 180 boneless. I can still do that from time to time.
Debbie Millman:
In your third year at Georgia Tech, you were working with a social robot named Simon.
Dr. Joy Buolamwini:
Yes.
Debbie Millman:
Your assignment with Simon was to see if you could have the robot engage in a social interaction with a human. You created a project called Peekaboo Simon. Can you talk about that project?
Dr. Joy Buolamwini:
Yes. So the idea with Peekaboo Simon was to program the robot to do a simple turn-taking game, right? You cover your face, you uncover your face, you say peekaboo. So that’s what I was doing. The problem is peekaboo doesn’t work if your partner can’t see you, and my robot was not detecting my dark-skinned face. I’m looking at my code. I’m like, “I think the code is right, so what’s wrong?” I didn’t have that much time, and I had a light-skinned roommate, red hair, green eyes, pale skin. “Oh, you’re perfect.” So I used her to test it, and it was working on her. So when it came time to do the demo, I just made sure somebody with light skin tested it. This was around 2011, when I was a junior at Georgia Tech.
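For readers curious about the mechanics, face detection is the step that failed here. Below is a minimal sketch of a webcam face-detection loop using OpenCV’s stock Haar cascade; the transcript doesn’t say which software Simon’s pipeline used, so the library and parameters are illustrative assumptions rather than the actual project code. A detector like this can only find faces that resemble its training data, which is how a system can see one face and miss another.

```python
# Illustrative sketch only -- not the Peekaboo Simon code. Shows a basic
# webcam face-detection loop with OpenCV's bundled Haar cascade.
import cv2

# Load OpenCV's stock frontal-face detector (trained on its own dataset;
# whatever that data under-represents, the detector will tend to miss).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # An empty `faces` list is the failure mode described above:
    # peekaboo can't start if the partner is never detected.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("peekaboo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```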
Debbie Millman:
Your senior capstone project at Georgia Tech was an experiment with the Carter Center, and you traveled to Ethiopia to do that work. There you piloted a health data system that eventually reached 17 million people. 17 million people. What type of data system was it, and how did you do that?
Dr. Joy Buolamwini:
Yes. That year, I think I kept going around saying, “I want to make an impact in African nations,” as my little tagline when people would ask what I wanted to do. “I’m Joy. I want to make an impact in African nations.” “Oh, that’s cute.” So I was doing that, and there was an invitation at Georgia Tech to come to some… I think it was a tech and health summit at 8:00 a.m. I’m like, “8:00 a.m., I don’t know about this,” right? But that year, I had my show up, speak up, stand up. It was just my little mantra for the year. So I said, “Okay. I’m going to show up.” At 8:00 a.m., I show up. I’m going to speak up.
I speak up. I talk about my little African nations thing, and it turned out there was an epidemiologist from the Carter Center, and they were looking for a software engineer to help them with some of their global health programs. In particular, they ran a program called MALTRA in Ethiopia, their malaria and trachoma program. They’d been doing it for a few years, and now it was time for monitoring and evaluation, so they wanted to go in and take surveys and things like that. The problem is their paper-based surveys had some limitations, especially when you’re trying to put in GPS coordinates. If you get one of those digits wrong, you might be in a lake instead of the place you’re supposed to be.
At that time, open source was very helpful because Android had been released, and you now had these Android tablets we could program. So the opportunity was to move their paper-based way of assessing the effectiveness of these programs onto a tablet, into this digital form. That’s where I came in with my computer science skills. I was like, “Hmm, surveys.” You could view it that way, or: we’re transforming data collection so we can have real-time insights on the trachoma program. So that’s what ended up becoming my senior capstone. We looked at their current system and transformed it so that what had taken about 30 to 90 days we could do in one day.
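One concrete advantage of the tablet form over paper, implied by the lake example above, is that entries can be validated the moment they are made. The sketch below is a hypothetical illustration of that idea; the bounding-box values and function name are assumptions for the example, not the Carter Center system.

```python
# Hypothetical sketch: reject an implausible GPS entry at capture time
# instead of discovering a mis-keyed digit months later.

# Rough bounding box around Ethiopia (approximate, for illustration only).
LAT_RANGE = (3.4, 14.9)
LON_RANGE = (33.0, 48.0)

def plausible_location(lat: float, lon: float) -> bool:
    """Return True if the point could lie inside the survey area."""
    return LAT_RANGE[0] <= lat <= LAT_RANGE[1] and LON_RANGE[0] <= lon <= LON_RANGE[1]

assert plausible_location(9.03, 38.74)       # Addis Ababa: accepted
assert not plausible_location(90.3, 38.74)   # transposed digits: flagged at entry
```

In practice a tablet can also read coordinates directly from the device’s GPS, removing manual transcription entirely.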
So that’s why they wanted to continue to adopt it, but I realized global health wasn’t for me, because while I was in the field, I was making up songs about pit latrines and hygiene and things like that, and that just was not the culture of the global health people at all. I might need to try something else. But yeah, that was one of the formative experiences for me, because at that time I was at Georgia Tech and I was gaining all of these technical skills, and yes, I was working on robots and so forth, but I also knew those robots weren’t going to be in people’s lives in a major way anytime soon. It might be sooner now, but a decade ago it wasn’t happening, and I had this desire to make an impact with the tech skills I was gaining at Georgia Tech. That’s why that year I kept saying, “Okay. I’m going to speak up, show up-
Debbie Millman:
Stand up.
Dr. Joy Buolamwini:
… stand up, all of that,” so stood up in Ethiopia.
Debbie Millman:
You then applied for and were awarded a Fulbright fellowship, which took you to Lusaka, Zambia. There you taught young Zambians to make mobile apps.
Dr. Joy Buolamwini:
Yes.
Debbie Millman:
You partnered with iSchool Zambia and worked on the ZEduPads.
Dr. Joy Buolamwini:
Yeah, the ZEduPads.
Debbie Millman:
Did I pronounce that correct?
Dr. Joy Buolamwini:
Yeah. They have been-
Debbie Millman:
Tell us about that.
Dr. Joy Buolamwini:
Yeah. After the experience in Ethiopia, I started asking some questions. Sure, I came in with this technology that I had actually coded most of it in my childhood bedroom in Memphis, Tennessee, so it was during summer break, and because of that, I made all of these assumptions about the technology and the context that weren’t true, so I was-
Debbie Millman:
For what, for example?
Dr. Joy Buolamwini:
For example, I assumed that the internet speeds would be comparable. One of the features of that data collection process was that you could upload the data to the cloud, and then the people, wherever they needed to be, could get it. But because it was very slow, we ended up realizing we needed to download the data onto SD cards instead. Here I am under a mosquito net in Ethiopia changing the code, because we hadn’t accounted for the difference in internet speeds and things like that.
Debbie Millman:
Is it true that the students thought that you weren’t an instructor, they never thought of you as an instructor?
Dr. Joy Buolamwini:
They still don’t think I’m an… I take it as a compliment now, right? Given what had happened in Ethiopia, when I had the opportunity to do a Fulbright Fellowship, I was thinking, “Okay. It’s one thing for me to come in. Yes, I am of the continent, but I’m also in some ways a Westerner kind of parachuting in. What would it look like to actually equip people to create the systems, the tools, that address their own local context?” So that’s where the Zamrise initiative came to be, and that was my Fulbright project.
Then, along the way, we met local skateboarders and poets, so there were a lot of side projects. I also couldn’t be directly paid due to Fulbright rules, so we would barter certain things. With the founder of iSchool Zambia, I bartered a house for the time I was in Zambia to then do training and be compensated for the apps that the students made through the training, and then they provided office space for that. So it was this whole creative bartering because of the constraint of not being able to be directly paid. We did an Indiegogo campaign and raised money to get the laptops for the students to learn how to code, because I asked-
Debbie Millman:
What’s an Indiegogo campaign?
Dr. Joy Buolamwini:
Oh, does anyone… Good question. It was an alternative to Kickstarter, a crowdfunding platform, and I wanted to raise… I think we needed $10,000 or something like that, and that allowed us to get enough laptops to then work with the students. What I had noticed with the ZEduPads, when I was talking to the founder of iSchool Zambia, I said, “Well, who developed the software? Who developed the apps?” They’re like, “Oh, well, we used Eastern Europeans.” I asked, “Why?” They’re like, “Oh, we’re not able to find the talent we need here.” So I asked, “Okay. If I did a program to train people, would we then be able to have apps on the ZEduPad?” So that’s what the Zamrise program ended up doing, so that when it actually came out, they could say it was also built by Zambians as well.
Debbie Millman:
So you’ve talked about the poetry scene and the skateboarding community you found in Lusaka. What was the connection for you between creation, culture and code?
Dr. Joy Buolamwini:
For me, it was like a way of life, and even now I ask myself, “How can I live poetically and how can I be expressive?” So while I was living in Zambia and I was making friends and so forth, I was like, “Okay. Yeah. What do people do for fun? Where do you go?” I remember going to a slam poetry session about subsidies. It was really interesting for me just to see how people were exploring their creativity and things like that. None of my friends were surprised when they saw the skateboard shots while I was doing the Fulbright and we were also delivering for iSchool Zambia and things like that. So I never have felt that I’ve had to separate those worlds. You can have the impact. You can do it with a little flair, hopefully a little bit of style, and connect with the artistic communities that are there, and skateboarding is another type of art.
Debbie Millman:
Did you teach anyone in Zambia to pole vault?
Dr. Joy Buolamwini:
No. See, pole vaulting requires some equipment. You need the pit. You need the poles. The poles are very expensive. Even getting skateboards, I learned, was tricky. They didn’t actually have a skate park at the time, and I was asking them where they were getting their supplies from, and they were getting it from South Africa.
Debbie Millman:
You then went to Oxford where you made history by proposing the first Rhodes Service Year. What gave you the audacity to reimagine what the Rhodes experience could be?
Dr. Joy Buolamwini:
All the work I did in Ethiopia meant that I had the qualifications to get into a global health program, so I applied, and the Rhodes Scholarship covers two to three years. I’d used my first year to do a Master’s in Learning and Technology. The second year, I wasn’t exactly sure what I wanted to do, so I applied to global health, and I got in, and I realized I really didn’t want to do global… I told you, the culture, my pit latrine raps were not hitting. The situation had not improved since then.
I wrote a forty-page proposal for this year of service, shared the type of visa I would need to have, really went in on how it would build on the work I had done in Zambia and so forth. Basically, when I sent it to the Rhodes Trust, they said, “Don’t get your hopes up, kid. We don’t change much here.” I was like, “Okay. I’m going to do it regardless, and you guys should get the credit, right?” Anyway, they decided to agree to it. So that became the first official Rhodes Scholar year of service, but many senior scholars told me they’d been doing years of service unofficially well before me.
Debbie Millman:
Have there been official years of service that have come after you based on that change?
Dr. Joy Buolamwini:
There have been, actually, and they tended to be other scholars who are really entrepreneurial. I remember talking to some of the members of the trust, and they were curious why not as many people had applied for it afterwards. I was thinking, I mean, the profile of a Rhodes Scholar tends to be those who have mastered the academic game but aren’t taking many risks. So I’m not so surprised that if you had the option to get an MPP or an MBA from Oxford pretty much guaranteed, or do a year of service, most people would go for the guaranteed option. So it tended to be the crazy entrepreneurs who were open to that kind of risk anyway who were doing it at the time.
Debbie Millman:
What did you do during that service year?
Dr. Joy Buolamwini:
Oh, so that service year, I started something called Code4Rights. Code4Rights was thinking through the curriculum that had been created in Zambia and actually adapting it to an Oxford context, and looking at what kinds of problems or issues we might apply that process to. We ended up focusing on sexual assault on campus and creating this first responders app so that survivors would know what their options were.
Debbie Millman:
Thank you for doing that work. It’s an area that I have particular interest in. You were then awarded a research assistantship at the MIT Media Lab with Ethan Zuckerman, the director of the Center for Civic Media. Now, is it true that he told you that if what you’re thinking of making already exists, go elsewhere?
Dr. Joy Buolamwini:
Oh, so that was one of the Media Lab professors. That wasn’t Ethan. Our group was an everybody-comes sort of group. We had a big wide-open table, and on Thursdays when we met, we would always have random guests and some regulars, retirees who would come to our table as well. Ethan was very open, but the summer before, I was in the area for my friend’s wedding and I thought, “Let me go check out the Media Lab in case I apply there.” So I met another professor who was like, “Yeah. If what you’re thinking of already exists, this isn’t the place for you.”
What was really interesting to me about Ethan’s group was we were in the Future Factory, but we were a bit of an oddball because we wanted to talk about problems now, the Center for Civic Media. So we were doing things like creating systems that would allow kids to track school lunches to see if the government was delivering on what it had promised, the Promise Tracker app, which was deployed all over Brazil. So it was a group that was a bit out of the mainstream. But then, in the time I was there, 2015 to about 2022, there was also quite a bit of change happening in the US, where people were saying, “Wait a minute, maybe we should be thinking about the immediate real-world impact of the research we’re doing,” with the different elections and so forth.
I remember in November 2016, after Donald Trump was elected, it was really interesting in my own circles, because I’m from the South, so some people were really rejoicing and other people were devastated, and I’m seeing all of that happening. At the Media Lab, a lot more people were asking questions like, “Okay. If this is happening, should I really be painting walls with my smile as my project, or maybe there’s something else I should be doing?” So I remember, the day after the election, people were suddenly looking to our group. The oddball group that talks about problems now became the mainstream, and that’s around the time I was also starting to explore the work that became the Algorithmic Justice League. So that was a transformation from being at the margins of the Future Factory to being a group that was viewed as a leader within it.
Debbie Millman:
You signed up for a course that changed the trajectory of your life?
Dr. Joy Buolamwini:
Yes.
Debbie Millman:
Science Fabrication. This was specifically focused on building fantastical, futuristic technologies. What did you begin to build?
Dr. Joy Buolamwini:
Yes, this is where I started to explore this idea of shape-shifting, right? We mentioned earlier, I’m from Ghana. I was inspired by stories of Anansi, the trickster spider that could also shape-shift. I wanted to do the same, but I had a six-week deadline. So instead of shifting my shape, I decided to shift my reflection in a mirror. In that process of figuring out how to make it look like there’s a mask on my face through a mirror, that’s when I started to experiment with computer vision. First, you have to detect the face.
Once I got that going, I started putting different people’s faces on my own. Serena Williams was one, so it looked like the greatest of all time. Then I thought, “Okay. This is kind of giving me theme park, where you have the cutout hole and you put your face through. Wouldn’t it be cool if, when I moved, it moved just like that?” So when I’m moving, it’s moving. To do that, I needed a camera. I put a camera on top of the mirror, and then I needed software that would actually follow my face in the mirror. So I went online and downloaded some software that was supposed to help me do that, but it wasn’t working, just like my peekaboo robot hadn’t worked. I was like, “Huh, I think something might be off.”
At the time I was experimenting, it was around Halloween, and I had a white mask for a Halloween party. When it wasn’t working, I started just experimenting with things in my office. I even drew a face on my hand, held it up to the camera, and it detected the face on my hand. So that’s when I was like, “Okay. Anything is possible.” So I reached for the white mask, and before I even had it all the way over my face, it was already detecting the white mask. So here I was at MIT, this epicenter of innovation that I had dreamed of coming to since I was a little girl, and I’m in whiteface, coding in a white mask to be seen at MIT. So that was when it switched from, “Can I shape-shift like Anansi, the spider?” to, “Hold up, side quest, what is going on here?”
Debbie Millman:
You’ve said that while the white mask episode was disheartening, you didn’t want people at the time to think you were making everything about race or being ungrateful for rare and hard-won opportunities, and you felt that speaking up had consequences. What kind of consequences did you fear?
Dr. Joy Buolamwini:
Retaliation, being blacklisted, being marginalized in the way my research was perceived. I remember, even when I started exploring the research, grad students warned me. They’re like, “You know X, Y, Z? They tried to study bias. It didn’t go well.” It’s like, these are the bones of grad students past, and that shadowy place, that’s what studying discrimination looks like, the place the light doesn’t touch for a career. So I was highly discouraged from doing this sort of work because it touched on bias and discrimination. That was one thread of being discouraged. But another thread of being discouraged was: this involves AI, AI involves a lot of math, and I’m like, “What? Engineering background, all of that.” So the math wasn’t an impediment to me, but others’ perception of the type of work I was capable of, I was also seeing that come out. So for me to actually pursue this research was going against all of the wisdom that was being passed down.
My supervisor, he wasn’t against the exploration, but he was really practical. “You spent a year working on a completely different project. This is a two-year program. You want to do a new project halfway through. That might be difficult. Maybe this is a side project.” I think all of the advice was well-meaning, and everything they warned me about did happen, right? So I guess this is where it helps to be stubborn. But also, I think the other thing for me was even though the people around me at the time didn’t quite see the vision, it still felt important to me, and I have to commend the Media Lab for creating a space where I could explore, a sandbox, even if they didn’t understand the shapes I was building. They’re like, “You do you, right? We don’t know what you’re doing, but figure it out.” There are many spaces where I wouldn’t have been able to explore that at all.
Debbie Millman:
You stated that in some ways you went into computer science to escape the messiness of the multi-headed isms of racism, sexism, classism and more, but the signs indicated otherwise. You wanted to believe that technology could be apolitical.
Dr. Joy Buolamwini:
Oh, 100%.
Debbie Millman:
What changed your mind about speaking up?
Dr. Joy Buolamwini:
When I saw the reaction to the 2016 election, it was kind of an all-hands-on-deck moment and this questioning of, what can I do to make a difference in the world? So when I got to the Future Factory, there were two missions. One, get the PhD: family legacy, third generation. I didn’t really want to go to grad school again by this time, so I did the, “Well, I’ll apply to one place, and if it doesn’t work out, I did try.” So I applied to one grad school. It worked out. So this was the price for getting it.
Debbie Millman:
Did you really think you weren’t going to get in?
Dr. Joy Buolamwini:
To the Media Lab?
Debbie Millman:
Yeah.
Dr. Joy Buolamwini:
You never know for sure. I thought I had a good chance, but it wasn’t 100%. If it didn’t work out, I could continue my entrepreneurial dreams. So I was not heavily invested in getting in. In fact, I was going to be working on an entrepreneurial project until I got a call from my father. It’s like, “Remember who you are?” “Oh, no.” I was like, “Rhodes Scholarship, Fulbright fellow. Daddy, this has to count for something, right?” “It is not a terminal degree.” “Okay. Okay. All right. All right. That is true.” So I applied to that one place and got in. Yeah.
Debbie Millman:
You think you’ll ever go back to that entrepreneurial project? It’s a whole other area of questions that I’d love to ask you about, but again, the time is slipping away.
Dr. Joy Buolamwini:
Yeah. I mean, I was working on a hair care technology company, and I was a CTO of an ed tech company. The hair technology company, we were about a decade early on hair analysis, but now, if you go to Myavana, you can actually do a hair analysis where a camera looks at your hair strands. It can tell you so many things about your hair, but also give you unique product recommendations. So I was a CTO for that before I went to Zambia. So I had a lot of side projects.
Debbie Millman:
Your thesis, which you titled the Gender Shades Project, demonstrates the priorities, preferences, and biases of those who create code, and exposed algorithmic bias at companies including IBM, Amazon, Microsoft, and more. With over 3,400 citations, you exposed how AI facial recognition systems had 100% accuracy for white male faces and near coin-flip results for dark-skinned women. Some systems labeled Michelle Obama as male. Others labeled people as gorillas. What were some of your other findings?
Dr. Joy Buolamwini:
Yeah. There are a whole combination of findings when it comes to the mislabeling of faces. With the Gender Shades Project, as I started looking at how computers read faces, I was asking three different kinds of questions when it comes to facial recognition technologies. The first question is: is there a face? This is face detection. When I’m putting on a white mask, that’s a face detection fail, right? Another kind of question you might ask is: what kind of face? What’s the gender of the face? What’s the age of the face? Maybe, what’s the emotional expression on the face? That’s what I focused on for my master’s work with Gender Shades. I was looking at companies guessing binary gender.
For that, we tested a number of different companies. In the case of Microsoft, there was perfection for one group, the lighter males, the pale males as I affectionately called them, and it wasn’t so great for other groups like darker females. But it was also interesting because we tested a company from China, and we found that it actually had the best performance on darker male faces. This was really important because in that research we were doing intersectional analysis, borrowing from Kimberlé Crenshaw’s work on discrimination along multiple axes. I was like, “Oh, this is being applied in the legal space, but maybe there’s something for computer scientists to learn.”
So instead of just looking at gender in this case, I also started looking at skin type as well. So it wasn’t just the story of, “Okay, it works better on male faces than female faces,” which was the overall trend, or it works better on lighter faces than darker-skinned faces. But when we did that subgroup analysis, lighter males, lighter females, darker males, darker females, that’s where we got the stark contrast you were just mentioning. Microsoft was the one with the good results, right? With IBM, the error gap between their best-performing group, lighter males, and their worst-performing group, darker females, the highly melanated like myself, was around 34%.
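To make the methodology concrete, here is a minimal sketch of that intersectional subgroup analysis: rather than one overall accuracy number, predictions are scored per skin-type and gender subgroup, and the headline figure is the gap between the best- and worst-served groups. The records and numbers below are made up for illustration; this is not the Gender Shades codebase.

```python
# Illustrative subgroup accuracy analysis (fabricated toy data, not the
# Gender Shades results). Each record: (true_gender, predicted_gender, skin_type).
from collections import defaultdict

records = [
    ("female", "male",   "darker"),
    ("female", "female", "darker"),
    ("male",   "male",   "darker"),
    ("female", "female", "lighter"),
    ("male",   "male",   "lighter"),
    ("male",   "male",   "lighter"),
]

totals = defaultdict(int)
correct = defaultdict(int)
for true_g, pred_g, skin in records:
    group = (skin, true_g)            # e.g. ("darker", "female")
    totals[group] += 1
    correct[group] += int(pred_g == true_g)

accuracy = {g: correct[g] / totals[g] for g in totals}
for group in sorted(accuracy):
    print(f"{group}: {accuracy[group]:.0%} on {totals[group]} images")

# The "error gap" is the spread between best- and worst-served subgroups;
# a single overall accuracy number would hide it.
gap = max(accuracy.values()) - min(accuracy.values())
print(f"gap between best and worst subgroup: {gap:.0%}")
```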
Those were the findings that really got me to start exploring other ways in which computer vision has failed, right? So you have the gorilla-gate example that you just brought up with Google Photos, where, in what I now call an evocative audit, regular people in the wild were interacting with these systems and seeing issues. Google fixed the problem not by making a better system, but just by removing any label of gorilla. So actual gorillas were also not labeled gorillas, just to be extra safe. I don’t know if that’s quite the solution. So we’ve seen an evolution of different approaches to addressing some of these misclassifications and mislabelings. But as I was doing the work, I realized these questions matter even if the systems were perfectly accurate, whether we’re going from guessing the gender of a face, which is difficult (how does a person identify?), to figuring out unique identity with facial identification.
If you have perfect facial identification and cameras everywhere, we have the infrastructure for a surveillance state that tracks your every move: where you go to worship, who you see at night or in the morning, whatever your preferences are, where you go and protest. So that was a very interesting tension for me to hold as I was doing this research, because it wasn’t as easy as saying, “Okay. Let’s make more inclusive data sets, and when we have more inclusive data sets, we’ll have more accurate facial recognition.” Accurate systems can be abused, and so the analysis had to be not just how well does the technology work, but what kind of technologies do we want in society in the first place?
Debbie Millman:
Amazon dismissed your findings. How did you find the confidence to stand by your data?
Dr. Joy Buolamwini:
It was interesting, because we were talking about my parents earlier, right? We did our first research paper, Gender Shades, and when it came out, we had tested IBM, Microsoft, and Face++, this company in China. Our second paper tested those companies but also included Amazon and another company. When the first paper came out, it was actually fairly well received: “Okay. Bias is an issue. We’re working on it,” or some companies were like, “We’ve known and we’ve been working on it.” I was like, “Okay. Okay. But this was the approach.”
When the second paper came out, the reaction was surprising to me and to my research team, because imagine having a test where the answers have been known for a year and still failing, and you’re a company as big as Amazon and you are going for a billion-dollar contract with the Pentagon, which was happening at the time this research came out. We’re showing that their competitors that are also bidding are performing better than they are on this computer vision task. So I was surprised at the Amazon attack, because the pushback I had expected from the first paper didn’t come, but this is what the grad students of the past had warned about, the places you shouldn’t go.
Debbie Millman:
The bones-
Dr. Joy Buolamwini:
And the bones, right? This is what they had warned of. Oh, yeah, it came down hard. I was spending my birthday in Switzerland, but I was at a conference, the World Economic Forum, and the paper was going to be released about a week later. I remember flying from Zurich to Honolulu. I’m in this big red jacket and I’m like, “Well, okay, I need some Hawaiian shorts.” I had sent the results in advance to all of these reporters, and Amazon was claiming they had never seen the paper, even though I had the email evidence. Maybe it hit spam. I don’t know, benefit of the doubt, maybe. So they did all of these delay tactics.
Then, when the paper came out, they had a corporate vice president, Dr. Matt Wood, come out and try to discredit the paper, so much so that the research community rallied around us and said, “This paper is valid and important, because the types of issues it addresses we all have to be thinking about as AI becomes more entrenched in our lives. We don’t get better as a field by ignoring the problems; we have to actually confront them.” So this coming from a Turing Award winner, our Nobel Prize of computer science, and a former principal AI researcher at Amazon really had a major impact. What I found interesting, though, is that when the book Unmasking AI came out, it was an Amazon best books pick after all that time. Even as this was going on, and I separate individuals from institutions, there were people inside the company sharing what they were able to about the opposition that actually existed inside the company to the use of facial recognition. I am happy to say that all of the US-based companies we audited stopped selling facial recognition to law enforcement.
Debbie Millman:
Thank you. Thank you, and very important work. Let’s talk about the other side of facial recognition that you mentioned and the whole way in which we’re now being monitored. What’s your feeling on all of this?
Dr. Joy Buolamwini:
I think we have to be very careful now. I was just looking at the recent announcement with Sam Altman and Jony Ive and thinking about what future computing devices might look like. If you look at companies like Limitless AI, which makes a pendant that’s always on, always recording, I’m thinking, “Where’s the consent piece?” Some years ago, Google experimented with Google Glass, where you could record and so forth, and people didn’t like it. They were calling users glassholes and things like that.
I was reading on LinkedIn about an experience from Allie Miller, who’s a top voice in AI, and she shared how she was backstage having a conversation with somebody and said, “Oh, we said so many good things. I wish we had written it down.” The person said, “Oh, don’t worry, I have an AI recording it right now.” So that level of violation of privacy and consent is only going to increase if we don’t push back. One of the ways we’ve been pushing back with the Algorithmic Justice League is our Freedom Flyers campaign. The TSA has been putting facial recognition at the checkpoints in airports. They have plans to expand it to 400 airports, and you actually have the right to opt out, but most people don’t know, and it’s not surprising, because you go there and they say, “Step up to the camera.”
Debbie Millman:
There’s no signage that says you have the right to opt out.
Dr. Joy Buolamwini:
It’s usually very difficult to see. So we’ve been doing surveys to see if people are actually finding the signs, and oftentimes people don’t even see them. Last year, we talked to the Department of Homeland Security. We showed our data, and they committed, by the spring, to actually adding language on the kiosk telling people they can opt out, which is a good step. But we really have to think about the power dynamics. You’re in line. I don’t know if you’re early or late to the airport, but usually I have time pressure. You might not have time pressure, but I have time pressure, right? There’s social pressure, people behind you in the line. There’s financial pressure. I don’t know how much your tickets cost, but the prices seem to keep going up, right?
So you’re doing all of that. You finally made it to the airport. Now they’ve got the scan thing going on, and a TSA officer tells you to step up to the camera. They don’t tell you you have the right to opt out. You’ve seen everybody else in front of you step up. The signs are there, but they’re usually turned around or in a different language. We’ve documented all the things. They managed to put the part that says you can opt out where the pullout piece goes. It might’ve just been a coincidence.
Debbie Millman:
Oh, yeah, I’m sure.
Dr. Joy Buolamwini:
We’ve been documenting some of these things. So it’s not designed so that you’re actually aware. One thing they know how to do at airports is design signs, right? No weapons formed against TSA shall prosper: that one, about the close to $15,000 fine, you’ll see. Real ID, you need to have that. The signs they want you to see, like TSA is hiring, you will be able to see those. There’s all of that going on. The big thing we’ve been doing is, one, letting people know they have the right to opt out, because we keep getting the story that, well, no one refuses it. They don’t know they can.
But the other thing we’ve been doing is also collecting travelers’ experiences, and some of them have been really disheartening. I remember reading one from a man whose son has autism, and his son was scanned against his will. I told my parents to opt out because of the work I do, and I remember them being humiliated in line, right? When I sat down with Secretary Mayorkas at the time, I shared that story, but also the stories of many other people who talked about intimidation tactics or just being dismissed, like, “Well, we have cameras all over the airport anyways.” I just came back from Greece, and this was the very experience I had when I was crossing over, when I said, “Oh, I want to opt out.” Smug look, they’re like, “We have cameras everywhere. This doesn’t matter.” But this actually does matter, and why it matters is that if we don’t opt out and if we don’t resist, then that narrative persists that people want this, which they’ll use.
They claim right now that they only put the facial recognition on some of these checkpoints and that they don’t have it on the overall system, but all of this is just one line of code away from being mass surveillance if we don’t actually resist, and so that’s why it’s important, even if you haven’t opted out in the past, that you continue to do it in the future. So then, going back to that situation where you’re backstage and someone has their AI device: if we want a consent culture, we have to practice a consent culture and also demand a consent culture, because what we will see if that doesn’t happen is listening devices everywhere, which we are already set up for with our cell phones, but even more ubiquitous and even more intimate.
Debbie Millman:
Aside from not providing consent, what else can we do to combat mass surveillance in our society?
Dr. Joy Buolamwini:
Yeah, I think this is where there’s individual action and collective action that’s needed as well. At the end of the day, it does come down to laws, right? Seeing, for example, the EU AI Act, where they have a restriction on the use of live facial recognition, shows that there are alternative ways. Now, it does depend on the administration. With facial recognition, we still don’t have federal laws around it, but in 2019 and 2020 we were able to get quite a bit of traction at the municipal level and also at the state level. What’s going on in Congress right now, and now in the Senate, is there’s a push to say any kind of regulation of AI at the state level is going to be on pause for a decade. That’s huge. That’s saying the protections that people fought for, because at the federal level it’s a bit harder but at the state level we can get traction, all of that gets rolled away. So that’s what we should be putting our eyes on right now.
Debbie Millman:
Given the current administration, what are your hopes and fears about what is possible from a collective public response?
Dr. Joy Buolamwini:
I think my fears right now, especially with the gutting of people in the administration, people at the IRS, people at different federal agencies, is this assumption that humans are so easily replaced. This is why stories matter so much. It’s not even the power of AI, but the power of the stories we tell about AI. So I think about NEDA, the National Eating Disorders Association. They bought into the hype, right? “AI is going to be better than humans. It can do call center work, et cetera.” At the time, their workers were trying to unionize for the reasons you do, better pay, better conditions, all of that. They decided to fire all the workers and replace them with AI. So you had a headline, right? “NEDA fires staff, replaces them with AI.” I don’t think it was even a week before you had the next headline, which was that the chatbot the people were replaced with had to be shut down, because that chatbot was actually giving people with eating disorders advice that’s known to make eating disorders worse.
Here we’re looking at the actions of FOMO, fear of missing out, this sense that we have to adopt these AI solutions, and also how easy it is to dismiss the work you’re not proximate to, to think it’s so easily automated. Now, coming back to the government, we’re seeing this clear-out of humans, and then we’re going to see this rollout of AI systems as a replacement for the humans, and then we’re going to find out, “Oh, we did need humans after all,” right? And we’re going to be-
Debbie Millman:
Surprised.
Dr. Joy Buolamwini:
… a backtrack, surprise, surprise.
Debbie Millman:
Tell us about what you’re currently doing, because you are on the front lines of helping society manage all of these issues and protecting society from them. Talk about what you’re currently doing. I’m going to ask you one last question and then I’m going to have to close out.
Dr. Joy Buolamwini:
Oh, it’s been so much fun. I know. Okay. So what I’m currently doing is going global. I mean, not that the US isn’t a great place to be right now.
Debbie Millman:
Sorry, allergies.
Dr. Joy Buolamwini:
Last year, Oxford University, where I did the Rhodes Scholarship and a Master’s in Learning and Technology, and didn’t do that Master’s in Global Health, they reached-
Debbie Millman:
Two master’s are enough, right?
Dr. Joy Buolamwini:
Not for my dad. You had me out here working. So they reached out; they had started a new institute focused on ethics and AI, and they had a fellows program. They asked if I’d be an inaugural fellow. I was like, “Oh, what does it entail?” “Well, tell us what you want to do, when you want to do it, and what resources you need.” “Is that a blank check? Sounds like a blank check. I’ll take the blank check.” So, long story short, what ended up happening is I proposed doing a five-year anniversary world tour of the documentary Coded Bias.
Coded Bias premiered at Sundance in 2020. When I watched it in 2025, it was as relevant as ever. We’re talking about issues of bias and discrimination in AI that are still persistent, and people see it more now, because if you are applying for a job, you might not get it; if you’re applying for a loan; all of these areas, AI is now becoming a part of the decision-making system. If you’re applying for college, if you get the right medical diagnosis, if the transcript of what happened when you were talking to your doctor is even accurate, all of this is now being mediated by AI systems.
Debbie Millman:
I mean, there’s so much to talk about regarding the current situation of the administration potentially looking at social media to decide whether to allow you into the country or not.
Dr. Joy Buolamwini:
Absolutely. You’ve already had issues in the past where context collapse happens. Someone might say, “She’s the bomb.” “Was that a bomb? Was that a bomb threat?” This is real. There are two parts: one when it’s mistaken, and one when it’s intentional as well.
Debbie Millman:
We’ve talked so much about your science, your innovations, the ways in which you’re working to protect society. We haven’t talked very much about your poetry. For those who are interested in seeing more about Dr. Joy, you can go to Poetofcode.com. Before we close out the show: you have some really beautiful poetry in your book, Unmasking AI, and you’ve done extraordinary performance art. I’m wondering if you can share one poem from your book with our audience today.
Dr. Joy Buolamwini:
Only one? We don’t have time for more?
Debbie Millman:
I’m up if you are. I don’t know about everybody else in the production team.
Dr. Joy Buolamwini:
Okay. I’ll do two if they permit me, because I want to end on a happy note. But before I end on the happy note, this is the note I think we should all hear at the moment. This poem is called “Unstable Desire.” It’s in the epilogue where I talk about my roundtable discussion with President Biden.
“Unstable Desire”
Prompted to competition
Where be the guardrails now?
Threat in sight
Will might make right?
Hallucinations
Taken as prophecy.
Destabilized
On a middling journey
To outpace
To open chase
To claim supremacy
To reign indefinitely
Haste and paste
Control altering deletion
Unstable desire
Remains undefeated
The fate of AI
Still uncompleted
Responding with fear
Responsible AI, beware
Profits do snare
People still dare
To believe our humanity
Is more than neural nets
And transformations of
Collected muses
More than data and errata
More than transactional diffusions
Are we not transcendent beings
Bound in transient forms?
Can this power be guided with care?
Augmenting delight alongside
Economic destitution?
Temporary bandaids cannot
Hold the wind when the task
Ahead is to transform the
Atmosphere of innovation.
The android dreams entice
The nightmare schemes of vice.
Poet of Code, Certified Human Made.
Debbie Millman:
Thank you. You said you’re going to read another one.
Dr. Joy Buolamwini:
So this is one I wrote in the chapter called “Intrepid Poet,” and this is when I started flexing the storytelling muscles, because for so long I felt that if I let my artistry out, my research might not be taken as seriously. It was such a relief to find that it was moving from performance metrics to performance arts that really moved this message around the world.
So to the Brooklyn tenants and the ex-coded
Resisting and revealing the lie
That we must accept the surrender of our faces
The harvesting of our data
The plunder of our traces
We celebrate your courage
No silence
No consent
You show the path to algorithmic justice requires a league
A sisterhood
A neighborhood
A podcast
Hallway gatherings
Sharpies and posters
Coalitions, petitions, testimonies, letters
Research and potlucks
Dancing and music
Everyone playing a role to orchestrate change
To the Brooklyn tenants and freedom fighters around the world
Persisting and prevailing against
Algorithms of oppression
Automating inequality
Through weapons of math destruction
We stand with you in gratitude
You demonstrate the people have a voice and a choice.
When defiant melodies harmonize to elevate
Human life, dignity, and rights
The victory is ours.
Debbie Millman:
Dr. Joy Buolamwini, thank you so much for making so much work that matters and thank you for joining me for this very special live episode of Design Matters at the WBUR Festival. To know more about Dr. Joy, you can read her book, Unmasking AI, and see more of what she’s doing on her website Poetofcode.com. This is the 20th year we’ve been podcasting Design Matters, and I’d like to thank you all for listening. And remember, we can talk about making a difference, we can make a difference, or we can do both. I’m Debbie Millman and I look forward to talking with you again soon.
Curtis Fox:
Design Matters is produced by the TED Audio Collective by Curtis Fox Productions. The interviews are usually recorded at the Master’s in Branding Program at the School of Visual Arts in New York City, the first and longest-running branding program in the world. The editor-in-chief of Design Matters Media is Emily Weiland.