Future Tense: AI from the margins

Description (in English)

In the video, Nakeema Stefflbauer describes the experience of exclusion and misclassification by AI technologies: how it feels to live everyday life with technologies that discriminate when you do not fit the norm. As Nakeema speaks, and we see the video of her on the desktop, multiple other windows open, illustrating examples of discriminating and oppressive AI. These include machine vision examples such as the Google algorithm labelling people of color as gorillas, smart cameras failing to recognize that East Asian eyes are open in photos and asking "Did someone blink?", facial detection failing to recognize the faces of black people, and emotion detection classifying black basketball players as showing negative emotions such as anger more often than white players, even when they were smiling. The video also brings forth examples from analog technologies, such as how photographic film was calibrated to work best on fair skin.

The video portrays AI in everyday life as a "grossly unfair system" (1), exemplifying how AI appears "as seen from the margins" (1).


(1) https://nushinyazdani.com/Future-Tense-AI-from-the-margins

Situation machine vision is used in

Authored by

UUID
570b3a10-fc71-4a53-8168-fc0b9e0843a3