Sony just found additional layers of internet skin tone bias

More than a year after Google introduced the Monk Skin Tone (MST) Scale to tame internet bias, online platforms are still awash with skin color bias, according to new research by Sony AI.

The MST, the result of a collaboration between Google and Harvard sociologist Ellis Monk, improved on the unidimensional six-type Fitzpatrick scale and offered a more inclusive representation of skin color through an array of ten standard shades. But the scale still leaves much to be desired when it comes to the discrimination baked into the skin tones on which most AI algorithms are trained.

Sony’s new research paper (pdf) points out additional layers of bias related to apparent skin color within computer vision datasets and models, noting that AI datasheets and model cards still leave a lot of room for discrimination against under-represented groups. Computer vision is a subset of AI that focuses on enabling computers to identify and interpret the world around them in images and videos.

Minimizing bias in AI algorithms

The study found that apparent skin color affects the way AI classifies people and emotions. Manipulating an image to give a lighter skin tone or a redder skin hue, for instance, makes it more likely that AI classifies non-smiling individuals as smiling. Conversely, a darker skin tone or a yellower skin hue makes it more likely that smiling individuals are classified as non-smiling.

Such AI classifiers, the study says, also tend to predict people with lighter skin tones as more feminine, and people with redder skin hues as more smiley. “That means there’s a lot of bias that’s not actually being detected,” said Alice Xiang, Sony’s global head of AI ethics.

This means that AI model bias exists not only for skin tone, but also for skin hue. “We found quite plainly that people with a light red skin tone will tend to have a higher performance or be favored in the algorithm and people with a dark yellow skin will have a lower performance,” said William Thong, a computer vision research scientist at Sony AI.

The researchers believe that by collecting skin tone annotations, they could help AI developers identify bias in facial recognition, image captioning, person detection, skin image analysis in dermatology, face reconstruction, and even deep fake detection—and eliminate it.

While Google is already using the MST in its products, search results on Google Images are still filtered with light-to-dark options based on the scale’s 10 shades. Google Photos, meanwhile, has reportedly mislabeled pictures of Black people and generated stereotyped images.

What is Sony’s ‘Hue Angle’ in skin color classification?

To help solve the problem, Sony is introducing what it calls the ‘Hue Angle’, a new dimension of skin color measurement that spans the range from red to yellow, rather than classifying skin color into discrete types as the Fitzpatrick scale does.

This, the researchers say, yields a multidimensional measure of skin color: it quantifies the extent to which common image datasets are skewed toward light-red skin color and under-represent dark-yellow skin color, and how generative models trained on these datasets reproduce similar biases.
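To make the two dimensions concrete, here is a minimal sketch of how a hue angle can be computed for a pixel. It converts an sRGB value to CIELAB, where perceptual lightness (L*) captures the light-to-dark tone axis and the angle formed by a* and b* captures the red-to-yellow hue axis. The sRGB-to-CIELAB conversion is standard colorimetry (D65 white point); the sample pixel value is made up for illustration and is not drawn from Sony's paper.

```python
import math

def srgb_to_linear(c):
    """Invert the sRGB gamma curve (c in [0, 1])."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB (D65 reference white)."""
    r, g, b = (srgb_to_linear(v / 255.0) for v in (r, g, b))
    # Linear RGB -> XYZ (standard sRGB matrix, D65)
    x = 0.4124564 * r + 0.3575761 * g + 0.1804375 * b
    y = 0.2126729 * r + 0.7151522 * g + 0.0721750 * b
    z = 0.0193339 * r + 0.1191920 * g + 0.9503041 * b
    # Normalize by the D65 white point
    x, y, z = x / 0.95047, y / 1.00000, z / 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x), f(y), f(z)
    L = 116 * fy - 16          # perceptual lightness: the "tone" axis
    a_star = 500 * (fx - fy)   # green-red component
    b_star = 200 * (fy - fz)   # blue-yellow component
    return L, a_star, b_star

def hue_angle(a_star, b_star):
    """Hue angle in degrees; for skin pixels it spans roughly red (low)
    to yellow (high), the axis Sony's measure adds on top of tone."""
    return math.degrees(math.atan2(b_star, a_star))

# Example: a skin-like RGB pixel (illustrative value only)
L, a_star, b_star = rgb_to_lab(224, 172, 140)
h = hue_angle(a_star, b_star)
print(f"L* = {L:.1f} (tone), hue angle = {h:.1f} deg (red-yellow axis)")
```

Averaging these two quantities over a face's skin pixels gives one point in a tone-hue plane, which is how a dataset's skew toward, say, light-red skin can be visualized and quantified.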

The tool aims to provide a scale for assessing the diversity of existing datasets, offer a key to mitigating skin tone bias in AI datasets and models, and potentially integrate with new AI projects so that developers can monitor skin color diversity during data collection and in group representation.

The researchers say they drew motivation from personal experiences buying cosmetics, as well as from the work of Brazilian photographer Angelica Dass illustrating the diversity of human skin color.

In her Humanae project, Dass took photos of 4,000 people in London and matched their skin colors with the Pantone printing color chart to codify a unique chromatic inventory. “Using this scale, I am sure that nobody is ‘black,’ and absolutely nobody is ‘white’,” she told Newsweek.

Though Dass cannot take portraits of everyone in the world for analysis, her work, together with that of Google and Sony, could one day prove critical to achieving an internet that is inclusive.
