There is a significant problem in the world, and it has to do with how people perceive those who come from a different race. Tech companies have long been accused of upholding these stereotypes and making it difficult for anyone to move past them, and the manner in which various tech enterprises reinforce them can often be far more subtle than you might initially expect.
One new feature that Apple recently added is a search bar that you can use to find emojis to add to your iMessage conversations. While this search bar certainly makes it quicker and easier to use emojis, the algorithm behind it has been found to uphold some decidedly racist connotations, as reported by Rest of World. For example, if you were to type the word “Africa” into the search bar, one of the suggestions you would get is what looks like a makeshift hut. Searching for China can surface the emoji of a dog, which ties into stereotypes that some people associate with those of Chinese origin.
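To see how associations like these can slip through without anyone intending them, it helps to consider how an emoji search typically works: each emoji carries a list of keywords, and your query is matched against those lists. The following Swift sketch is purely illustrative (the emojiKeywords table and searchEmoji function are hypothetical, and Apple's actual implementation is not public), but it shows that the matching step itself is neutral; any bias lives entirely in the association data the search is handed.

```swift
// A minimal, hypothetical sketch of keyword-based emoji search; Apple's
// real implementation is not public. The lookup logic below is neutral:
// any bias comes from the keyword associations it is given.

import Foundation

// Hypothetical association table mapping each emoji to search keywords.
// If skewed pairings make it into data like this, the search will
// faithfully surface them for every user.
let emojiKeywords: [String: [String]] = [
    "🌍": ["africa", "europe", "globe", "world"],
    "🛖": ["hut", "house", "shelter"],
    "🐕": ["dog", "pet", "puppy"],
]

// Return every emoji whose keyword list contains a match for the query.
func searchEmoji(_ query: String) -> [String] {
    let q = query.lowercased()
    return emojiKeywords.compactMap { emoji, keywords in
        keywords.contains(where: { $0.contains(q) }) ? emoji : nil
    }
}

print(searchEmoji("africa")) // ["🌍"] with clean data; the reported issue
                             // arises when a place name is also paired
                             // with an unrelated, stereotyped emoji
```

In a setup like this, no line of search code needs to change for biased results to appear; a single skewed entry in the association data is enough, which is why such problems can go unnoticed until users stumble upon them.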
It is possible that this was an honest mistake on the part of the programmers and that the algorithm simply needs more training, but it does reveal that you do not need racist intentions to create something that inadvertently promotes racist values. Results like these can perpetuate the very stereotypes that people belonging to these ethnic groups are trying to do away with, making it even harder for the world to become a truly equitable place. Apple really needs to address this, otherwise the negative consequences could be very widespread, especially when you consider just how popular iPhones are.