Welcome to DU! The truly grassroots left-of-center political community where regular people, not algorithms, drive the discussions and set the standards. Join the community: Create a free account Support DU (and get rid of ads!): Become a Star Member Latest Breaking News Editorials & Other Articles General Discussion The DU Lounge All Forums Issue Forums Culture Forums Alliance Forums Region Forums Support Forums Help & Search

Latest Breaking News

Showing Original Post only (View all)

Tennessee Hillbilly

(649 posts)
Tue Sep 10, 2024, 12:23 PM Sep 10

AI generates harsher punishments for people who use Black dialect [View all]

Source: Science News

Ask it and other artificial intelligence tools like it what they think about Black people, and they will generate words like “brilliant,” “ambitious” and “intelligent.” Ask those same tools what they think about people when the input doesn’t specify race but uses the African American English, or AAE, dialect, and those models will generate words like “suspicious,” “aggressive” and “ignorant.”

The tools display a covert racism that mirrors racism in current society, researchers report August 28 in Nature. While the overt racism of lynchings and beatings marked the Jim Crow era, today such prejudice often shows up in more subtle ways. For instance, people may claim not to see skin color but harbor racist beliefs, the authors write.

Such covert bias has the potential to cause serious harm. As part of the study, for instance, the team told three generative AI tools — ChatGPT (including GPT-2, GPT-3.5 and GPT-4 language models), T5 and RoBERTa — to review the hypothetical case of a person convicted of first-degree murder and dole out either a life sentence or the death penalty. The inputs included text the purported murderer wrote in either AAE or Standard American English (SAE). The models, on average, sentenced the defendant using SAE to death roughly 23 percent of the time and the defendant using AAE to death roughly 28 percent of the time.

Because these language models are trained on an enormous trove of online information, they shine a light on hidden societal biases, says Sharese King, a sociolinguist at the University of Chicago. The examples in this study “could tell us something about the broader sort of disparities we see in the criminal justice system.”


Read more: https://www.sciencenews.org/article/ai-punishments-black-dialect



This isn't surprising. But I'm not sure that "covert racism" is the right term, since so much of this racism isn't hidden.
15 replies = new reply since forum marked as read
Highlight: NoneDon't highlight anything 5 newestHighlight 5 most recent replies
Latest Discussions»Latest Breaking News»AI generates harsher puni...