
Math Isn’t Pure: Algorithms Discriminate Against Women And POC

Algorithms — mathematical programs that create sets of rules to process and sort large amounts of information — are being used to scan applications for housing, employment, insurance and credit.  But those algorithms discriminate, or at least they can.*

Cathy O’Neil — whose blog name mathbabe inspires instant fangirl love — recently published Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, a book that explores the implications of our increasing acceptance of and reliance on algorithms.  Cathy has lots of interesting things to say.  Before you read the book, you should also check out this video of her Ford Foundation presentation.

Algorithms Discriminate Because Of Their Design
But back to how algorithms discriminate.  In her well-reviewed and enthusiastically received book, Cathy describes how the Big Data industry sits too far apart from the real world and the lived experiences of people in it, substituting “data trails” for the nuance of life:

More and more I worried about the separation between technical models and real people, and about the moral repercussions of that separation. In fact, I saw the same pattern emerging that I’d witnessed in finance: a false sense of security was leading to widespread use of imperfect models, self-serving definitions of success, and growing feedback loops.

This, alone, should be ringing alarm bells.  In the law, for example, the separation of law from real life means that the experiences of women and people of color are particularly overlooked because this “separation” is based on normed assumptions derived from white maleness.  See, for example, Sonia Sotomayor’s excellent dissent about how the majority’s treatment of policing ignored the realities of “black and brown parents” and their children.

But we don’t need to make that academic leap.  Cathy provides powerful examples that illustrate just how concretely this is working to the disadvantage of women and people of color. For example, an algorithm predicting who will do well in engineering, if based on data from historically successful employees, will draw on data almost exclusively about men.  So the algorithm will replicate a preference for men.  The input is flawed, so the result is flawed.
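To see how this plays out mechanically, here is a minimal sketch — not any real hiring system, with invented trait names and data — of a scoring rule that rewards similarity to past “successful” employees. Because the historical pool skews male, proxy traits associated with men end up earning points:

```python
# Toy illustration (hypothetical data): score candidates by how many
# traits they share with historically "successful" employees.
from collections import Counter

# Historical successes: almost entirely men, so their incidental
# traits (clubs, networks) dominate the data.
past_hires = [
    {"fraternity_member", "mens_chess_club", "cs_degree"},
    {"fraternity_member", "cs_degree"},
    {"mens_chess_club", "cs_degree"},
]

# How often each trait appears among past successes.
trait_counts = Counter(t for hire in past_hires for t in hire)

def score(candidate_traits):
    """Sum of historical frequencies of the candidate's traits."""
    return sum(trait_counts[t] for t in candidate_traits)

# Two equally qualified candidates; only gendered proxy traits differ.
man = {"cs_degree", "fraternity_member"}
woman = {"cs_degree", "womens_coding_circle"}

print(score(man), score(woman))  # → 5 3: the man wins on proxies alone
```

Nothing in the rule mentions gender, yet the flawed input reproduces the flawed preference — the pattern the book describes.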


The use of big data and algorithms isn’t inherently nefarious.  In some cases, it can be driven by altruistic goals; for example, see this interesting discussion of how data is being used to try to combat persistent homelessness.  But if algorithms aren’t built with a nuanced and careful view (and the one discussed in the homelessness context may be), then the answers they provide will be no more nuanced or careful.  They are only as reliable as their design.

Old Boys Clubs and Algorithms
In other cases, algorithms might be used in the hopes of simply saving time.  I’m reminded of a recent conversation with a friend working for a department of the federal government.  She was explaining that the algorithm used to sort job applications was very crude — its most preferred matches for a job are the applications that most closely mirror the language of the job posting.  As a result, people in the know — those who have connections to others already working for this department — submit resumes that are pages-long, wordy messes that clunkily reproduce each word and phrase in the posting somewhere in the resume.
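The kind of crude matcher she describes can be sketched in a few lines — this is an assumption about the design, not the actual government system, and the posting and resumes are invented — ranking each resume by what fraction of the posting’s words it contains:

```python
# Hypothetical sketch of a crude resume sorter: score = fraction of
# the posting's words that appear anywhere in the resume.
import re

def tokens(text):
    """Lowercased word set, ignoring punctuation and numbers."""
    return set(re.findall(r"[a-z]+", text.lower()))

def match_score(posting, resume):
    posting_words = tokens(posting)
    return len(posting_words & tokens(resume)) / len(posting_words)

posting = ("Program analyst experienced in budget forecasting "
           "and stakeholder outreach")

# A pithy, well-written resume vs. a keyword-stuffed one.
concise = "Analyst. Led budget team; improved forecasts 20%."
stuffed = ("Program analyst experienced in budget forecasting and "
           "stakeholder outreach. Program analyst duties: budget "
           "forecasting, stakeholder outreach.")

print(match_score(posting, concise))  # low: shares few exact words
print(match_score(posting, stuffed))  # 1.0: every posting word echoed
```

Under a rule like this, the stuffed resume wins every time, which is exactly why insiders pad theirs with the posting’s language.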

Imagine resumes full of pages of redundant, awkward text.  That isn’t what we’re taught a resume should look like.  In fact, we know that resumes like that would be thrown away by most hiring departments on sight.  So in the context of this federal government department, only those with inside connections know how to game the algorithm.  Everyone else, who is following the rules as they understand them and is submitting pithy, bullet-pointed resumes, is actually handicapping themselves in the process.  Where women and people of color have historically had fewer numbers (if any) on the inside, and where old boys clubs have historically protected the old (white) boys, who is getting left behind?


*  Yes, this is the second post in as many weeks about technology, but if we feminists don’t pay attention to the invisible forces shaping our world, we’re doing ourselves a disservice.

Katherine Kimpel


Kate Kimpel is the Senior Editor of Shattering the Ceiling and is also an accomplished civil rights lawyer. She represents women and people of color in discrimination cases (and other kinds of employment and civil rights matters).  When not lawyering, she likely is bragging about her hound dog Ulysses, inventing cocktails to serve at her next dinner party, or convincing her husband to watch reruns of a Joss Whedon television show (any of them will do). 
