Artificial intelligence is quickly becoming a primary governing resource for online platforms, many of which now regulate themselves using AI. But because these systems are built by human beings, they often inadvertently absorb the biases, and at times even the bigotry, of the people who made them, and recent research has shown that many AIs do carry a bigoted slant.
A selfie tool called ImageNet Roulette recently went viral on Twitter. It was essentially an AI that added captions to your picture, summarizing the kind of person you supposedly are. These captions were often wildly random, which is what led to the tool's viral success.
One thing that was quickly noticed was that many people of color who uploaded their pictures to the platform received captions that were racist in tone. This fueled the tool's virality even further, but it also stirred up a fair amount of controversy among users who felt the racist captions reflected the bigotry of the designers.
It turns out that the creators of ImageNet Roulette built the platform as a kind of social experiment, one intended to expose the biases that can arise within AI. Many people see AI as an objective monolith, a perception that becomes dangerous as we hand over more and more aspects of our lives to these systems.
People should treat this as a learning opportunity, a demonstration of how serious a problem AI bias can become if it is left unchecked for too long.
"ImageNet is one of the most significant training sets in the history of AI. A major achievement. The labels come from WordNet, the images were scraped from search engines. The 'Person' category was rarely used or talked about. But it's strange, fascinating, and often offensive." (Kate Crawford, @katecrawford, September 16, 2019)
"Fascinating insight into the classification system and categories used by Stanford and Princeton, in the software that acts as the baseline for most image identification algorithms. pic.twitter.com/QWGvVhMcE4" (Stephen Bush, @stephenkb, September 16, 2019)
"Well, thankfully not pic.twitter.com/NYOpbAhq79" (David Meyer, @superglaze, September 17, 2019)
"How do I say 'wash your mouth out with soap' in AI pic.twitter.com/leklS76snm" (Shona Ghosh, @shonaghosh, September 17, 2019)
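For readers curious how labels like these get attached to a photo at all, here is a minimal sketch of an ImageNet-trained classifier in Python. This is not ImageNet Roulette's actual code; the model choice (ResNet-18 via torchvision) and the filename selfie.jpg are assumptions for illustration only. What it shows is the article's underlying point: the model can only ever answer with labels drawn from its training taxonomy, so whatever bias is baked into those categories passes straight through to the caption.

import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet preprocessing: resize, center-crop, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A pretrained ImageNet classifier. ResNet-18 is chosen here purely as
# an example; ImageNet Roulette used its own model and label set.
weights = models.ResNet18_Weights.IMAGENET1K_V1
model = models.resnet18(weights=weights)
model.eval()

image = Image.open("selfie.jpg").convert("RGB")  # hypothetical input file
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)

# The "caption" is simply the name of the highest-scoring class. The
# human-readable label comes from the training taxonomy (for ImageNet,
# WordNet synsets), which is exactly where bias creeps in.
labels = weights.meta["categories"]
print(labels[logits.argmax().item()])

Swapping in a different label set would change the outputs entirely, which is why the contents of ImageNet's 'Person' category mattered so much here.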
Read next: Elon Musk just warned about the swarms of bots on social media platforms, and it's something we surely need to take a look at