Google Searching For New Skin Tone Measures To Reduce Bias

Google is developing an alternative to the industry standard for classifying skin tones. Technology researchers and dermatologists say the current standard is inadequate for assessing whether products are biased against people of color.

Tech companies currently rely on the Fitzpatrick Skin Type (FST), a six-color scale that dermatologists have used since the 1970s. It categorizes people by skin tone and is used to measure whether products such as facial-recognition systems or smartwatch heart-rate sensors perform equally well across skin tones.

Google has been “quietly” working on better measures. "We are working on alternative, more inclusive, measures that could be useful in the development of our products, and will collaborate with scientific and medical experts, as well as groups working with communities of color," Google told Reuters.

Critics of FST point out that it includes four categories for white skin but only one each for black and brown skin; Reuters reports that this disregards diversity among people of color. At a conference in October, the U.S. Department of Homeland Security recommended abandoning FST for evaluating facial recognition, saying it poorly represents the range of colors found in diverse populations.

Reuters points to a February video from The Checkup for Google Health, in which Google announced that cameras on some Android phones could measure pulse rates via a fingertip, but said readings would err by 1.8% on average, regardless of whether users had light or dark skin.

Earlier this month, a group of Senate Democrats urged Alphabet, Google’s parent company, to investigate how its products and policies may be harming Black people.

In a letter to Alphabet and Google CEO Sundar Pichai and other executives, Senators Cory Booker of New Jersey, Ron Wyden of Oregon, Mark Warner of Virginia, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut expressed concern about bias and discrimination in Google products and about the way the company has handled workplace diversity.

“Google Search, its ad algorithm, and YouTube have all been found to perpetuate racist stereotypes and white nationalist viewpoints,” they wrote in the letter.

They cited this example: “last year it was found Google’s Cloud Vision image recognition tool was labeling images of a thermometer held by light skinned people as ‘electronic device’ while labeling them as ‘gun’ when held by dark skinned people.”

The letter goes on to say that the Senators were also troubled to learn that Google developers had not heeded internal and external advocates who warned that a new app for identifying skin conditions was not trained on a sufficiently diverse dataset and would not be effective on people with dark skin.

Microsoft acknowledged FST's imperfections, while Apple said it tests on humans across skin tones using a variety of measures. Garmin said that, based on wide-ranging testing, it believes its readings are reliable, according to Reuters.

2 comments about "Google Searching For New Skin Tone Measures To Reduce Bias".
  1. Craig Mcdaniel from Sweepstakes Today LLC, June 21, 2021 at 2:06 p.m.

    What is sad is people with disabilities are not even mentioned. If a person is in a wheelchair or has a prosthesis, well, Google doesn’t care about you. Millions of people have some form of actual and physical disability. Isn’t it time to address this group with the same respect that it deserves?

  2. John Grono from GAP Research, June 21, 2021 at 6:19 p.m.

    Totally agree Craig.

    Our state government appointed a 'disability advocate' (my term - not the official term) who is blind. That gentleman is an inspiration and kudos to the government.
