Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech
Back in 2015, software engineer Jacky Alciné pointed out that the image recognition algorithms in Google Photos were classifying his black friends as “gorillas.” Google said it was “appalled” at the mistake, apologized to Alciné, and promised to fix the problem.
But, as a new report from Wired shows, nearly three years on, Google hasn’t really fixed anything. The company has simply blocked its image recognition algorithms from identifying gorillas altogether, presumably preferring to limit the service rather than risk another miscategorization.
Wired says it performed a number of tests on Google Photos’ algorithm, uploading tens of thousands of pictures of various primates to the service. Baboons, gibbons, and marmosets were all correctly identified, but gorillas and chimpanzees were not.
The publication also found that Google had restricted its AI recognition for other racial categories. Searching for “black man” or “black woman,” for example, returned only black-and-white pictures of people, sorted by gender but not by race.
A spokesperson for Google confirmed to Wired that the image categories “gorilla,” “chimp,” “chimpanzee,” and “monkey” remained blocked on Google Photos after Alciné’s tweet in 2015. “Image labeling technology is still early and unfortunately it’s nowhere near perfect,” said the rep.
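In practice, a fix like this amounts to suppressing a handful of label names after the classifier has already run. Here is a minimal, purely hypothetical sketch of such a post-hoc blocklist; the function and data structure are illustrative assumptions, not Google’s actual implementation:

```python
# Hypothetical post-hoc label blocklist, for illustration only; this is
# not Google's actual implementation. The blocked terms are the four
# categories the spokesperson confirmed to Wired.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predicted_labels):
    """Drop any predicted label whose name appears on the blocklist."""
    return [label for label in predicted_labels
            if label.lower() not in BLOCKED_LABELS]

# The model may still recognize the concept internally; the label is
# simply never surfaced to the user.
print(filter_labels(["Baboon", "Gorilla", "Gibbon"]))  # ['Baboon', 'Gibbon']
```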
The categories are still available on other Google services, though, including Google Assistant and the Cloud Vision API that Google sells to other companies.
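For reference, this is roughly how a developer would request labels from the Cloud Vision API, sketched with the google-cloud-vision Python client; the image path is a placeholder, and the snippet assumes valid application credentials:

```python
# Minimal label-detection request against the Cloud Vision API, the kind
# of query one could use to probe which labels the service returns.
# Assumes the google-cloud-vision client library and configured credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("primate.jpg", "rb") as f:  # placeholder image path
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```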
It may seem strange that Google, a company that’s generally seen as a front-runner in commercial AI, was not able to come up with a more complete solution to this error. But it’s a good reminder of how difficult it can be to train AI software to be consistent and robust, especially when, as one might suppose happened in the case of the Google Photos mistake, that software is not trained and tested by a diverse group of people.
It’s not clear in this case whether the Google Photos algorithm remains restricted in this way because Google couldn’t fix the problem, didn’t want to dedicate the resources to do so, or is simply showing an overabundance of caution. But it’s clear that incidents like this, which reveal the often insular Silicon Valley culture that has tasked itself with building world-spanning algorithms, need more than quick fixes.