The Twitterati discover that biased algorithms are everywhere, while tech bros pretend manipulative tech was unpredictable. This weekend, Theo @psb_dc and Dr. ir Johannes Drooghaag (JD) @DrJDrooghaag observed that Twitter was buzzing with the collective discovery that Zoom and Twitter may be using racially biased algorithms. Here are some of the key highlights.
Start off your week right with our AI Ethics Weekly newsletter. Sign up to get the full-length version of this content delivered to your inbox every Monday!
More Diversity in Datasets, You Say? Easier Said Than Done.
Biased datasets are proliferating, often created by the same elite institutions and experts who vocally advocate against bias in AI. Intentional or not, these biased datasets have far-reaching consequences: they are used to train AI systems deployed in the real world, including but not limited to facial recognition tech.
MIT Takes Down Popular AI Dataset Due to Racist, Misogynistic Content
“According to the creators, they directly copied over 53,000 nouns from WordNet, and then automatically downloaded images that corresponded to those nouns from various search engines. Except WordNet contains derogatory terms, and so you end up with results that inadvertently confirm and reinforce stereotypes and harmful biases.” Read more.
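The failure mode described here is easy to reproduce. Below is a minimal Python sketch of that kind of scraping pipeline, using NLTK's real WordNet interface; `download_images_for` is a hypothetical stand-in for a search-engine scraper, not the dataset creators' actual code. The point is what is missing: nothing screens the harvested terms before they become image queries.

```python
# Minimal sketch of a WordNet-driven image scraping pipeline.
# NLTK's WordNet corpus is real; download_images_for() is a
# hypothetical placeholder for a search-engine scraper.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)

def collect_noun_queries(limit=53000):
    """Harvest noun lemmas from WordNet to use as search queries."""
    queries = []
    for synset in wn.all_synsets(pos=wn.NOUN):
        for lemma in synset.lemma_names():
            queries.append(lemma.replace("_", " "))
            if len(queries) >= limit:
                return queries
    return queries

def download_images_for(query, n=100):
    """Hypothetical: fetch n images from a search engine for `query`."""
    ...

for query in collect_noun_queries():
    # The crucial absent step: no blocklist or human review filters
    # out slurs and derogatory labels, so they flow straight from
    # WordNet into the image dataset.
    download_images_for(query)
```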
ImageNet, a popular online database of images, will remove 600,000 pictures after an art project revealed the system’s racist bias.
“Workers at Amazon Mechanical Turk, a marketplace that allows businesses to outsource labor for extremely low wages, then categorized the photos. By defining photos with terms ranging from “cheerleader” to “failure, loser, non-starter, unsuccessful person,” misogynistic, racial, and other biases were formalized in the system. ImageNet was frequently used to train AI systems and judge their accuracy, meaning the implications of these categorizations go far beyond the database itself.” Read more.
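It is worth spelling out the mechanics of that formalization. The sketch below uses hypothetical data structures, not ImageNet's or Mechanical Turk's actual tooling, to show how a simple majority vote over worker judgments turns whatever terms appear in the category list, derogatory ones included, into fixed ground-truth labels.

```python
# Sketch of how crowdsourced annotations harden into dataset labels.
# Hypothetical data; the category strings echo the quote above.
from collections import Counter

# Each photo receives several worker judgments drawn from the
# WordNet-derived category list, derogatory entries included.
annotations = {
    "photo_001.jpg": ["cheerleader", "cheerleader", "athlete"],
    "photo_002.jpg": ["failure, loser, non-starter, unsuccessful person",
                      "failure, loser, non-starter, unsuccessful person",
                      "office worker"],
}

def finalize_labels(annotations):
    """Majority vote: the term most workers chose becomes the
    'ground truth' label, with no check on whether the term is a
    reasonable thing to call a person at all."""
    return {photo: Counter(votes).most_common(1)[0][0]
            for photo, votes in annotations.items()}

labels = finalize_labels(annotations)
# Any model trained against these labels now treats the derogatory
# category as an objective class to predict.
```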
Did you like what you read? Sign up to get the full-length version of this content delivered to your inbox every Monday!