AI Ethics Weekly – June 8: Failure of Leadership

June 7, 2020 LH3_Admin

“The revolution will not be diversity and inclusion trainings”
Lisa Ko @iamlisako

Diversity and inclusion initiatives are coming under fire because marginalized communities are being co-opted to build technologies that uphold structures of injustice and perpetuate inequality. As a woman of color, I was horrified when my work on diversity and ethics in AI was appropriated without consent to promote an all-white-female panel at a tech conference. “100 Brilliant Women in AI Ethics” is an annual list published by the Women in AI Ethics™ initiative, which I founded to recognize talented and underrepresented women in this space, because you can’t have ethical AI without diversity.


Start off your week right with our AI Ethics Weekly newsletter. Sign up to get the full-length version of this content delivered to your inbox every Monday!

Diversity does NOT mean replacing all-white-male panels with all-white-female panels. Research shows that all-male lineups don’t happen by coincidence, despite what conference organizers might claim, and the same is true of all-white panels. It’s critical that we are intentional in championing representation and recognizing traditionally marginalized voices in AI/tech. It is on each one of us to speak up when we see something that excludes the voices of the Black community and people of color, or else we are all complicit in allowing the status quo to persist.

Stay safe and healthy! <3

~Mia Shah-Dand @MiaD

PS: If you want to support Diversity & Ethics in AI, you can now fund our work directly at this link.

Racism, Sexism, and Toxic Feminism


Of course technology perpetuates racism. It was designed that way.
h/t Mike Ananny @ananny
“We often call on technology to help solve problems. But when society defines, frames, and represents people of color as “the problem,” those solutions often do more harm than good.”

Amazon “Stands in Solidarity” but Sells Racist Tech to Police
h/t Liz O’Sullivan @lizjosullivan
“In their rush to appear sympathetic to the rough contours of social justice — while keeping their legal, public relations, and social media teams in agreement — some companies seem to be forgetting what it is they actually do.”

Microsoft researchers say NLP bias studies must consider role of social hierarchies like racism
“A team of AI researchers wants the NLP bias research community to more closely examine and explore relationships between language, power, and social hierarchies like racism in their work. That’s one of three major recommendations for NLP bias researchers a recent study makes.”

Racial Profiling Via Nextdoor.com
h/t Andrew Rosenblum @rosenblumandrew
“White Oakland residents are increasingly using the popular social networking site to report “suspicious activity” about their Black neighbors — and families of color fear the consequences could be fatal.”

“The New Jim Code” – Ruha Benjamin on racial discrimination by algorithm
h/t Máirín Murray #FanSlán @mairinmurray
“The Princeton sociologist and author of Race After Technology on how new technologies encode old forms of segregation – and how we might build something better.”

If you liked what you read, sign up to get the full-length version of this content delivered to your inbox every Monday!

Gender Bias In Predictive Algorithms: How Applied AI Research Can Help Us Build A More Equitable Future
h/t Rachel Payne @rachelpayne
“While Benevolent Sexism may not appear to be as harmful as Hostile Sexism, it is insidious and can be quite damaging to both sexes and to gender equality overall. A comment that might seem positive on the surface might be reinforcing false stereotypes about women.”

‘Hey Beeb’: new BBC digital assistant gets northern male accent
h/t Rachel Coldicutt @rachelcoldicutt
“A Unesco report last year claimed that the often submissive and flirty responses offered by female-voiced digital assistants to many queries – including abusive ones – reinforced ideas of women as subservient.”

Maybe Sheryl Sandberg Should Be Leaning Out
h/t Cathy O’Neil @mathbabedotorg
“‘Leaning in’ seems premised on the idea that, if women can simply buy into the sanctity of the profit motive, they will be amply rewarded in time. It leaves out important things like having genuine human reactions to bad ideas, overruling idiots and being moral.”

Making the Case for Human-Centric AI

AI is transforming society. Here’s what we can do to make sure it prioritizes human needs.
h/t Ian Moura @more_ian_moura
“The ubiquity of artificial intelligence — coupled with the lack of clarity regarding where it is and is not in use, and the near impossibility of removing algorithms from one’s life — has ramifications to both individuals and to society as a whole.”

Unlocking the value of values with AI
h/t Shea Brown @sheabrownethics
“The potential of AI is massive, but a successful AI strategy must be considerate of the increasingly privacy-conscious consumer.”

Surveillance Tech by Another Name

Under pressure, UK government releases NHS COVID data deals with big tech
h/t Mia Shah-Dand @MiaD
“Hours before openDemocracy was due to sue, the government released massive data-sharing contracts with Amazon, Microsoft, Google, Faculty and Palantir. The contracts, released to openDemocracy and tech justice firm Foxglove today, reveal details of what has been described as an ‘unprecedented’ transfer of personal health information of millions of NHS users to these private tech firms.”

This startup is using AI to give workers a “productivity score”
h/t MIT Technology Review @techreview
“Is it okay to be spied on by managers because they pay you? Do you owe it to your employer to be as productive as possible, above all else?”

Tech Could Be Used to Track Employees—in the Name of Health
“Now the makers of technology for tracking objects and people are hoping to seize on the opportunity. They are repurposing and repackaging their beacons and software as tools to track workers’ movements inside workplaces.”

If you liked what you read, you can now fund our work directly at this link!