Predicting and quantifying racism and unequal treatment

A new paper published this week in the Proceedings of the National Academy of Sciences cuts to the heart of messy social interactions with a set of computational models that quantify and predict unequal treatment. Hsu and post-doctoral researcher Adrianna C. Jenkins, now an assistant professor at the University of Pennsylvania, drew on social psychology and behavioral economics in a series of lab experiments and analyses of field data. (The paper was co-written by Berkeley researcher Pierre Karashchuk and Lusha Zhu of Peking University.)

“There’s been lots of work showing that people have stereotypes and that they treat members of different social groups differently,” said Jenkins, the paper’s lead author. “But there’s quite a bit we still don’t know about how stereotypes influence people’s behavior.”

It’s more than an academic issue: University admissions officers, for example, have long struggled with how to fairly consider an applicant’s race, ethnicity, or other qualities that may have presented obstacles to success. How much weight should be given, for example, to the obstacles faced by African Americans compared with those faced by Central American immigrants or women?

While those are much larger questions, Hsu said the paper’s contribution is to improve how researchers quantify and compare discrimination across different social groups, a common challenge facing applied researchers.

Source: A model to predict and quantify racism, sexism, and other unequal treatment