Artificial Intelligence (AI) is rapidly infiltrating every aspect of society. From determining who is hired, fired, or granted a loan to how long an individual spends in prison, decisions that have traditionally been made by humans are increasingly being made by algorithms.


THE FOLLOWING PUBLISHED PAPER – “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification”

Authored by:
Joy Buolamwini – joyab@mit.edu
MIT Media Lab, 75 Amherst St., Cambridge, MA 02139
Timnit Gebru – timnit.gebru@microsoft.com
Microsoft Research, 641 Avenue of the Americas, New York, NY 10011

Link to read/download: http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf


(Note: the following paper is not associated with any work/project by Persons Asset Class; it is shared only for research and knowledge sharing in support of more equitable systems.)


Abstract: Recent studies demonstrate that machine learning algorithms can discriminate based on classes like race and gender. In this work, we present an approach to evaluate bias present in automated facial analysis algorithms and datasets with respect to phenotypic subgroups. Using the dermatologist approved Fitzpatrick Skin Type classification system, we characterize the gender and skin type distribution of two facial analysis benchmarks, IJB-A and Adience. We find that these datasets are overwhelmingly composed of lighter-skinned subjects (79.6% for IJB-A and 86.2% for Adience) and introduce a new facial analysis dataset which is balanced by gender and skin type. We evaluate 3 commercial gender classification systems using our dataset and show that darker-skinned females are the most misclassified group (with error rates of up to 34.7%). The maximum error rate for lighter-skinned males is 0.8%. The substantial disparities in the accuracy of classifying darker females, lighter females, darker males, and lighter males in gender classification systems require urgent attention if commercial companies are to build genuinely fair, transparent and accountable facial analysis algorithms.
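To make the kind of intersectional breakdown reported in the abstract concrete, below is a minimal Python sketch that computes an error rate separately for each gender x skin-type subgroup. This is an illustration only, not the authors' code: the record structure, field names, and subgroup labels ("darker"/"lighter", "female"/"male") are assumptions made here for demonstration.

# Sketch of an intersectional error analysis: error rate per
# gender x skin-type subgroup. The record fields (skin_type,
# true_gender, predicted_gender) are hypothetical.
from collections import defaultdict

def subgroup_error_rates(records):
    """records: iterable of dicts with 'skin_type', 'true_gender',
    and 'predicted_gender' keys. Returns {(skin_type, gender): error rate}."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for r in records:
        group = (r["skin_type"], r["true_gender"])  # e.g. ('darker', 'female')
        totals[group] += 1
        if r["predicted_gender"] != r["true_gender"]:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Example with toy data:
sample = [
    {"skin_type": "darker", "true_gender": "female", "predicted_gender": "male"},
    {"skin_type": "darker", "true_gender": "female", "predicted_gender": "female"},
    {"skin_type": "lighter", "true_gender": "male", "predicted_gender": "male"},
]
print(subgroup_error_rates(sample))
# {('darker', 'female'): 0.5, ('lighter', 'male'): 0.0}

Running such a breakdown per commercial classifier is what surfaces the disparity the paper highlights: a single aggregate accuracy number can look strong while one subgroup bears most of the errors.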
