Google AI ethics co-lead Timnit Gebru says she was fired over an email

Timnit Gebru, one of the best-known AI researchers today and co-lead of an AI ethics team at Google, no longer works at the company. Details are still being gathered, but according to Gebru, she was fired Wednesday for sending an email to “non-management employees that is inconsistent with the expectations of a Google manager.” VentureBeat reached out to Gebru, a Google spokesperson, and Google AI chief Jeff Dean for comment. This story will be updated if we hear back.

In a tweet published days before leaving Google, Gebru asked whether there is any regulation protecting AI ethics researchers, similar to whistleblower protections.

Gebru leaves Google the same day that the National Labor Relations Board (NLRB) filed a complaint alleging that the company spied on and illegally fired two employees involved in labor organizing.

Tawana Petty is a data justice and privacy advocate in Detroit who this week was named national organizing director of Data for Black Lives. This morning she gave a talk about the legacy of surveillance of Black communities through technology like facial recognition and the toxic effects of white supremacy on people’s lives. She dedicated her keynote at the 100 Brilliant Women in AI Ethics conference to Timnit Gebru.

“She was terminated for what we all aspire to do and be,” Petty said.

Mia Shah-Dand, who organized the conference, called Gebru’s dismissal a reflection of the toxic culture in tech and a sign that women, particularly Black women, need support.

Timnit Gebru is known for some of the most influential work in algorithmic fairness research and for combating forms of algorithmic bias that have the potential to automate oppression. Gebru is a cofounder of the Fairness, Accountability, and Transparency (FAccT) conference and of Black in AI, a group that hosts community gatherings and mentors young people of African descent. Black in AI holds its annual workshop Monday at NeurIPS, the largest AI research conference in the world.

Before coming to Google, Gebru joined Algorithmic Justice League founder Joy Buolamwini to create the Gender Shades project, which assessed the performance of facial recognition systems from major vendors like IBM and Microsoft. That work concluded that facial recognition tends to work best on white men and worst on women with dark skin tones. That research and subsequent work by Buolamwini and Deborah Raji in 2019 have been highly influential among lawmakers deciding how to regulate the technology and in shaping public attitudes about the threat posed by algorithmic bias.

While working at Microsoft Research, she was lead author of “Datasheets for Datasets,” a paper that recommends including a set of standard information with datasets in order to give data scientists context before they decide to use that data to train an AI model. “Datasheets for Datasets” would later serve as motivation for the creation of model cards.

As a Google employee, Gebru joined Margaret Mitchell, Raji, and others in writing a 2019 paper about model cards, a framework for reporting a model’s benchmark performance so machine learning practitioners can evaluate it before putting the model to use. Google Cloud began providing model cards for some of its AI services last year, and this summer it introduced the Model Card Toolkit so developers can make their own model cards.

This summer, Gebru and her former colleague Emily Denton led a tutorial about fairness and ethics in computer vision at the Computer Vision and Pattern Recognition (CVPR) conference, and shortly afterward Gebru got into a public spat with Facebook chief AI scientist Yann LeCun about AI bias.

Members of the AI community and others have at times referred to Gebru as someone actively trying to save the world. Earlier this year, Gebru was included in “Good Night Stories for Rebel Girls: 100 Immigrant Women Who Changed the World,” a book released in October.

