Black and Queer AI Groups Say They’ll Spurn Google Funding

Three groups focused on increasing diversity in artificial intelligence say they will no longer take funding from Google. In a joint statement released Monday, Black in AI, Queer in AI, and Widening NLP said they acted to protest Google’s treatment of its former ethical AI team leaders Timnit Gebru and Margaret Mitchell, as well as former recruiter April Christina Curley, a Black queer woman.

“The potential for AI technologies to cause particular harm to members of our communities weighs heavily on our organizations,” the statement reads. “Google’s actions in the last few months have inflicted tremendous harms that have reverberated throughout our entire community. They not only have caused damage but set a dangerous precedent for what type of research, advocacy, and retaliation is permissible in our community.”

In the statement, the groups endorse calls made in March by current and former Google employees for academic conferences to reject Google funding and for policymakers to enact stronger whistleblower protections for AI researchers.

For all three organizations, it is the first time in their short histories that they have turned down funding from a sponsor.

Monday’s announcement is the latest fallout from Google’s treatment of Black people and women and from accusations that the company interfered with AI research papers slated for publication at academic conferences.

In March, organizers of the Fairness, Accountability, and Transparency (FAccT) conference declined Google sponsorship, and researcher Luke Stark turned down $60,000 in Google funding. Queer in AI organizer Luca Soldaini told WIRED the organization received $20,000 from Google in the past year; Widening NLP received $15,000.

Cochair Xandra Schofield said Widening NLP, founded in 2017 with the goal of bringing more women into the field, felt compelled to sign the joint statement because Google’s actions were inconsistent with its mission of supporting underrepresented researchers. Mitchell was a cofounder of the organization. Widening NLP cochair Haley Lepp added that “by supporting these scholars, we also want to support their research, and their ability to do research that might be critical of the effects of AI.”

Affinity groups like Black in AI, Queer in AI, and Widening NLP are nonprofit organizations formed to protect and represent people who have been historically underrepresented in the machine learning community. They operate separately from machine learning conferences but can attract hundreds of attendees to workshops or social events colocated with the most widely attended conferences. In recent years, affinity groups have formed for people with disabilities and for Jews and Muslims.

Queer in AI has also objected to Google Scholar’s approach to trans and non-binary authors who want to update publications after changing their names, Soldaini said.

“We’ve had great to very bad experiences with that, and Google has been on the very bad side,” he said. Name change requests to Google often get no response, he said.

Gebru is a cofounder of Black in AI. The paper at the center of the dispute when she says she was fired, which examined the dangers large language models pose to marginalized communities, was ultimately published with Gebru listed as an author affiliated with Black in AI. In a talk last week at the International Conference on Learning Representations, which lists Google as a platinum sponsor, Gebru encouraged academics to refuse to review papers that had been edited by lawyers before submission to machine learning conferences.

“Academics should not hedge their bets but take a stand,” Gebru said. “This is not about intentions. It’s about power, and multinational corporations need to be held accountable.”

Black in AI cofounder Rediet Abebe, who will become the first Black faculty member in the University of California, Berkeley’s Department of Electrical Engineering and Computer Science, committed last year to refusing Google money in order to diminish the company’s sway over AI research.
