Google wants to reduce product bias

Google told Reuters this week that it is developing an alternative to the industry's standard way of classifying skin tone.

A growing number of technology researchers and dermatologists argue that the current standard is inadequate for assessing whether a product is biased against people of color.

That standard is the Fitzpatrick Skin Type (FST) scale, a six-category classification that dermatologists have used since the 1970s.

Tech companies rely on it to assess whether products such as facial recognition systems or the heart rate sensors in smartwatches work well across different skin tones.

Critics say the FST ignores the diversity of darker skin: the scale devotes four categories to white skin but only one each to brown and to black.

Last October, researchers at the US Department of Homeland Security recommended abandoning the FST for evaluating facial recognition, because it poorly represents the range of skin colors found in diverse populations.

Google said it is pursuing better measures, exploring alternative and more inclusive ways to develop its products, and doing so in collaboration with scientific and medical experts as well as groups that work with communities of color.

The debate is part of a larger reckoning with racism and diversity in the tech industry, whose workforce is whiter than that of most other sectors.

Google sets new standards for skin tone:

Making sure technology works for all skin tones, ages, and genders matters all the more as new products, often powered by artificial intelligence, extend into sensitive areas such as healthcare and law enforcement.

The company acknowledges that products may be flawed for groups that are underrepresented in its research and test data.

When Google announced in February that cameras on some Android phones could measure pulse via a fingertip, it said the average reading error was 1.8%, regardless of whether users had light or dark skin.

The company made similar promises that skin type would not noticeably affect the results of the background filter feature in Google Meet, or of its upcoming web-based tool for identifying skin conditions, informally known as Derm Assist.

Until recently, however, tech companies paid little attention to the FST's shortcomings.

The Unicode Consortium, which oversees emoji, cited the FST in 2014 as the basis for adopting five skin tones beyond the default yellow.

A 2018 study titled Gender Shades found that facial analysis systems misclassified people with darker skin far more often.

In an April study testing artificial intelligence for detecting deepfakes, Facebook researchers wrote that the FST fails to capture the diversity of brown and black skin tones.
