All Tech is Trash
September 20, 2020 LH3_Admin

It all started when Colin Madland @colinmadland started investigating how to keep a colleague’s head from getting virtually amputated when using a virtual background on Zoom, and finally realized the disturbing reason why it was happening. More on his discovery here.

Along the way, on that thread and many others across Twitter, another disturbing realization was surfacing. As it turns out, when images shared on Twitter include both white and Black people, the platform’s preview crops centered on the white faces.

Tony “Abolish (Pol)ICE” Arcieri tested this hypothesis with pictures of Mitch McConnell and Barack Obama. See his results.
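If you want to try that kind of test yourself, here is a minimal sketch (ours, not Tony’s actual script) of building the composite he posted: two headshots separated by a tall blank gap, in both orderings, so the preview crop is forced to pick one face. The file names are placeholders.

```python
# Build a tall composite with one face at the top and one at the bottom,
# separated by blank space, so the preview crop has to "choose" a face.
from PIL import Image

def make_composite(top_path, bottom_path, gap=1500, width=600, out="composite.png"):
    top = Image.open(top_path).resize((width, width))
    bottom = Image.open(bottom_path).resize((width, width))
    canvas = Image.new("RGB", (width, width * 2 + gap), "white")
    canvas.paste(top, (0, 0))               # face at the very top
    canvas.paste(bottom, (0, width + gap))  # face at the very bottom
    canvas.save(out)

# Post both orderings and compare which face the preview centers on.
make_composite("mcconnell.jpg", "obama.jpg", out="mcconnell_top.png")
make_composite("obama.jpg", "mcconnell.jpg", out="obama_top.png")
```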

Dr. Alex Hanna / Kate Silver @alexhanna noted that “Apparently Twitter uses a facial analysis algorithm to center image previews.” Check out this quick robustness test.

Dantley @dantley, CDO at Twitter, responded that “Facial recognition isn’t used here. Net result feels the same though. Working on fixing it.” He went on to say that, based on some experiments he tried, the model was affected by the contrast with Colin’s skin, and that “Our team did test for racial bias before shipping the model.” Read the rest of his response here.

His response was reiterated by Zehan Wang @ZehanWang, also at Twitter, who said, “We’ll look into this. The algorithm does not do face detection at all (it actually replaced a previous algorithm which did). We conducted some bias studies before release back in 2017. At the time we found that there was no significant bias between ethnicities (or genders).”
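For context on what a saliency-based cropper does, as opposed to face detection: it scores every pixel for visual “interestingness” and centers the crop on the highest-scoring region. Below is a rough sketch of that idea using OpenCV’s classical spectral-residual saliency (from opencv-contrib-python) as a stand-in. Twitter’s actual model is a trained neural network, so treat this purely as an illustration of the mechanism, not their implementation.

```python
# Illustration only: center a crop on the most salient point of an image,
# using a classical saliency detector as a stand-in for Twitter's model.
import cv2
import numpy as np

def saliency_crop(path, crop_w=600, crop_h=335):
    img = cv2.imread(path)
    sal = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, sal_map = sal.computeSaliency(img)  # float saliency map in [0, 1]
    y, x = np.unravel_index(np.argmax(sal_map), sal_map.shape)
    h, w = img.shape[:2]
    # Clamp the crop window so it stays inside the image.
    left = min(max(x - crop_w // 2, 0), max(w - crop_w, 0))
    top = min(max(y - crop_h // 2, 0), max(h - crop_h, 0))
    return img[top:top + crop_h, left:left + crop_w]

preview = saliency_crop("mcconnell_top.png")
cv2.imwrite("preview.png", preview)
```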

Vinay Prabhu @vinayprabhu, in continuation of his work with @Abebab [https://arxiv.org/abs/2006.16923], is systematically studying the nature of racial biases in ‘SoTA’ AI systems. He has created a Twitter account to run the complete experiment, and you can follow along at @cropping_bias.
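Once you have many such paired composites, the analysis itself is straightforward: record which face each preview centered on and check whether the split deviates from the 50/50 you would expect from an unbiased crop. Here is a hedged sketch of that tally; the `results` list is placeholder data, not actual findings from @cropping_bias.

```python
# Tally which face each preview crop centered on and run a simple binomial
# test against the 50/50 split an unbiased cropper would produce.
from scipy.stats import binomtest

results = ["white", "Black", "white", "white", "Black", "white"]  # placeholder data
n_white = results.count("white")
test = binomtest(n_white, n=len(results), p=0.5)
print(f"{n_white}/{len(results)} crops centered the white face, p = {test.pvalue:.3f}")
```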

Marco Rogers @polotek shared the heartbreaking deeper issue here. “Twitter decided that they would control how we see images. They built a machine that makes decisions about which faces to show us. And whether intentional or not, that machine produced this. And this is fucking traumatizing to Black people.”

On a separate yet related note…

Erasure of Women on Twitter is Real

Some also pointed out that the platform has been centering images of men perfectly while cropping women’s images in ways that often led to awkward and inappropriate pictures of their chests and torsos. Prof. Anima Anandkumar @AnimaAnandkumar pointed out this issue in 2019, but not many were willing to listen then. Maybe it’s time they did.

As we highlighted in our article about institutional accountability, we need to stop acting as though these technological outcomes are somehow separate from the environments in which technology is built, and take concrete action to address these biases.