In a companywide memo to employees Wednesday, Google CEO Sundar Pichai said he’s sorry about how the company fired prominent AI ethics co-lead Timnit Gebru. He said he accepts “the responsibility of working to restore your trust” and that the company is now considering the implementation of “de-escalation strategies.” He concluded his comments about why Google fired one of the best-known Black female AI researchers in the world by stating that the company treats diversity as a priority.
In response to the memo, Gebru said Pichai’s comments play to the trope that she’s an angry Black woman and called the content “dehumanizing.”
In an interview with VentureBeat on Wednesday, Gebru talked about toxic workplace culture in Big Tech, ethics washing, keeping AI research free from corporate influence, and the two things she wants young women and people of color interested in entering the field of machine learning to know. Since Google fired Gebru, many of her colleagues on the Google AI ethics team, the rest of the company, and the wider tech community have demanded accountability and change. In the past week, more than 2,000 Googlers and over 3,700 supporters in academia and industry have signed a petition supporting Gebru and calling what happened to her “a retaliatory firing” and a case of “unprecedented research censorship.” Gebru no longer works at Google following a series of emails and a demand that she retract a research paper about the risks of deploying large language models, particularly for the environment and marginalized communities.
Before she was fired last week, Gebru was co-lead of Google’s AI ethics team and helped assemble one of the most diverse teams within the Google Brain research division. She’s also a cofounder of Black in AI and the Fairness, Accountability, and Transparency (FAccT) AI research conference. Gebru is part of a chorus of voices in the algorithmic fairness research community who say the makers of AI must center marginalized communities in their work and address risks and harms associated with deploying AI models.
This interview has been edited for brevity and clarity.
VentureBeat: What are your thoughts on Sundar’s apparent commitment to investigate what happened?
Gebru: I think that [Google] just saw all the outpouring and that they had to say something, and so they’re saying something. But given the gaslighting and the fact that he’s talking about a de-escalation strategy, making me sound like an angry Black woman, I feel that this is just a strategy to quell what is happening right now. I just don’t have faith in it.
They paint me as this angry Black woman because they put you in this terrible workplace, and if you speak up about it, then you become a problem, and then they start talking about de-escalation strategies.
You write emails, and they get ignored. You write documents, and they get ignored. Then you talk about how things are being done, and then they talk about you as if you’re some angry Black woman who needs to be contained.
VentureBeat: Are there any particular policy changes that you hope come about as a result of what happened?
Timnit Gebru: Yeah, I think there are a number, but I want to think about this more before I say anything because I want to be intentional and careful when suggesting policy changes.
VentureBeat: How should what happened to you shape how people feel about corporate influence over research?
Gebru: A lot of people have been talking about that. All of these research conferences are heavily funded by industry, so right now what is computer science research? It’s like you’ve got the military and you’ve got corporations. What are our other options? Sometimes there’s the NIH [National Institutes of Health], but there just needs to be funding that’s not associated with the military or corporate interests, because inherently there is a conflict of interest. I’m not saying that there shouldn’t be research at corporations. I think there should be, but when you have the kinds of things you’re seeing with me, especially the [research] censorship, and then you see the types of influence they have in these conferences, I think it’s something that people really need to think about.
The problem is that when you do research that requires a lot of resources, this becomes even more of a problem. That’s a little bit of what we talked about in the paper too.
VentureBeat: One of the first tweets I read after you were fired was from someone who said there are a lot of young women and people of color interested in entering the field of machine learning who are watching this episode happen right now. I’m curious: What do you want them to know?
Gebru: Young women of color, I want them to know that these moments are necessary. I think these moments are really hard, and what’s painful to me is the message that this sends. The biggest story to me is if this is happening to me, what’s happening to other people? A lot of people don’t have the platform and visibility and grassroots support that I have, so just imagine: What’s happening to other Black women? And how are we still imagining that corporations of their own volition with their diversity initiatives are going to do the right thing?
You have harassers walking out with millions of dollars. You have all of these people with so much toxic behavior, where people say, “But they’re too valuable to the company,” or “Oh, but they’re socially awkward,” or whatever. And you have a Black woman who has had to prove herself over and over again. I’ve finally gotten to a point where my expertise is valued by people, but let me tell you: not inside Google. There were many times when my expertise was completely dismissed. I wrote about that in my email.
You have someone whose immediate manager and team members have been taking risks to stand by her publicly. You have someone whose entire community is standing by her publicly because I cofounded Black in AI with Rediet Abebe. And all of that is not enough to stop them, not only from deciding so fast that I’ve got to go, but from doing it in the most disrespectful way and then playing into the angry Black woman narrative.
What I want these women to know is that it’s not in your head. It’s not your fault. You are amazing, and do not let the gaslighting stop you. I think with gaslighting the hardest thing is that there are repercussions for speaking up, but there’s also shame. A lot of times people feel shame because they feel like they brought it upon themselves somehow.
So I know it’s difficult to feel that you did not bring this upon yourself and that you are in the right, but you have to feel that way. And the way to do it is to be around people and to have that grassroots support, to have people around you who affirm that. Because if you’re only around people like the VP who just did this to me and gaslit me, you would constantly feel that you brought this upon yourself. That’s the first thing I would say.
The second thing I would say is this is why it’s important for you to participate in our technological future and shape it in your own imagination. Dr. Ramon Amaro said this in our Black in AI workshop. Ruha Benjamin also talks about this.
This is why it’s really important for us to think about what scientific education is doing to us, whose paradigms it’s teaching us, and why the only option we’re given right now is to assimilate into this racist and sexist structure, where if you dare step out of your place, then you are pushed out like the most discardable object. It doesn’t matter how much expertise you have. It doesn’t matter how much support you have.
So it’s not only important for you to participate in this technological future, but to think about an alternative future where your imagination gets to shape what kind of technology we’re building. Because right now, your imagination is not shaping what kind of technology we’re building. We’re trying to do cleanup after all the white men who put us in this mess. We’re not creating technology in our own imagination. They create technology in their imagination to serve their interests, it harms our communities, and then we have to perform cleanup. And while we’re performing cleanup, we get retaliated against.
VentureBeat: As it relates to people doing things without consequences, what kind of accountability mechanisms do you think are necessary for leadership at Big Tech companies to address long-standing issues?
Gebru: So you’re in a toxic work environment, and what happens? None of the leaders have advanced because they created a good environment for Black women. Why? Because they don’t have any Black women under them. Zero. So all of these leaders, the ones who have advanced, got into their positions without having to create a livable, workable environment for people like me. And then who are the people who determine whether the leaders are doing well? The other leaders. Who are the people determining whether people like my [former] manager Samy [Bengio] or others get good ratings and calibration, etc.? It’s those leaders who have already failed to create any sort of livable, breathable environment for people like me.
So how do you expect them to then hold other people accountable when they themselves have created hostile environments for people like me? And you’ve seen what just happened. They just pushed me out as fast as possible. They didn’t even take the time to get their story straight.
They don’t care that … like this corporation doesn’t care about diversity or inclusion. So I’m trying to think of what could be done from the outside because nothing else is working, and in fact the lip service makes it a lot worse because it’s gaslighting.
VentureBeat: Do you think pay for executives at Big Tech companies should be tied to diversity goals so that there’s a stronger incentive than “We’ll do better next time”?
Gebru: I mean, there certainly has to be something better than “We’ll do better next time.” I’m wondering if their pay … I mean, yes, I think their pay should probably be directly tied to it or something real. Something real needs to be directly tied to outcomes.
The other thing I worry about is if this kind of stuff is in policy, like let’s say it’s tied to their pay then there can be a backlash from white supremacist groups about giving Black people a handout and all that stuff.
VentureBeat: A few researchers told me they don’t understand why the paper you wrote attracted such a strong response from Google. How much do you think what happened had to do with the contents of the paper, and how much with something else?
Gebru: What I’m thinking is there are many components to this. Maybe the paper had something to do with it. Maybe the paper was an excuse and they were looking for a reason. I mean, we talk about large language models [in the paper], but we’re not specifically talking about Google. BERT is used at Google in Search, so it could be that they’re a bit sensitive to that and want to censor anything coming out of Google that has anything to do with these types of things. But the thing is, it’s not just the censorship; it was so disrespectful, the way it was done. On top of all of that, they wanted to censor my work without any sort of discussion.
VentureBeat: What do you want to do next?
Gebru: I’m not sure yet. They literally, they resignated me — I like this word that my team uses — they resignated me in the middle of my vacation, and I said I wanted to come back and discuss this after vacation. So I just want to catch my breath. I don’t want to be in these institutions that are hostile to Black women. I think I tried everything I could to make a little livable nugget of these institutions for people like the ethical AI team. I want to make sure that whatever I do next, I’m in a safe environment that doesn’t gaslight me, where I don’t have to fight so hard to do something that is important to my community.