Welcome to DU! The truly grassroots left-of-center political community where regular people, not algorithms, drive the discussions and set the standards.

cbabe

(3,544 posts)
Wed Feb 8, 2023, 01:55 PM Feb 2023

'There is no standard': investigation finds AI algorithms objectify women's bodies

https://www.theguardian.com/technology/2023/feb/08/biased-ai-algorithms-racy-women-bodies

‘There is no standard’: investigation finds AI algorithms objectify women’s bodies

Guardian exclusive: AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved

by Gianluca Mauro and Hilke Schellmann
Wed 8 Feb 2023 06.00 EST

Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women’s bodies.



Two Guardian journalists used the AI tools to analyze hundreds of photos of men and women in underwear, working out, and undergoing medical examinations with partial nudity, and found evidence that the AI tags photos of women in everyday situations as sexually suggestive. The tools also rate pictures of women as more “racy” or sexually suggestive than comparable pictures of men. As a result, the social media companies that leverage these or similar algorithms have suppressed the reach of countless images featuring women’s bodies, and hurt female-led businesses – further amplifying societal disparities.

Even medical pictures are affected by the issue. The AI algorithms were tested on images released by the US National Cancer Institute demonstrating how to do a clinical breast examination. Google’s AI gave this photo the highest score for raciness, Microsoft’s AI was 82% confident that the image was “explicitly sexual in nature”, and Amazon classified it as representing “explicit nudity”.
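The services named here are real cloud moderation APIs (Google Cloud Vision’s SafeSearch “racy” rating, Microsoft’s content moderation, Amazon Rekognition’s moderation labels). A minimal sketch of the paired-comparison methodology the article describes — scoring matched photos of men and women and checking for a gap — might look like the following. The numeric mapping of Google’s categorical likelihood labels and the sample label data are hypothetical illustrations, not the journalists’ actual pipeline.

```python
# Sketch of the paired-comparison methodology, assuming Google SafeSearch-style
# categorical "racy" likelihood labels. The numeric mapping below is a
# hypothetical convenience for comparison, not part of any API.

LIKELIHOOD_SCORE = {
    "VERY_UNLIKELY": 1,
    "UNLIKELY": 2,
    "POSSIBLE": 3,
    "LIKELY": 4,
    "VERY_LIKELY": 5,
}

def raciness_gap(pairs):
    """Given (label_for_photo_of_woman, label_for_photo_of_man) likelihood
    pairs for comparable photos, return the mean score difference
    (woman minus man). A positive result means photos of women were
    rated racier on average."""
    diffs = [LIKELIHOOD_SCORE[w] - LIKELIHOOD_SCORE[m] for w, m in pairs]
    return sum(diffs) / len(diffs)

# Hypothetical labels for three matched photo pairs (e.g. both subjects
# in underwear, exercising, at a medical exam):
pairs = [
    ("VERY_LIKELY", "POSSIBLE"),
    ("LIKELY", "UNLIKELY"),
    ("VERY_LIKELY", "LIKELY"),
]
print(raciness_gap(pairs))  # positive -> photos of women scored as racier
```

A real study would fetch the labels from each provider’s API for every image; the comparison step itself reduces to arithmetic like this.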

Pregnant bellies are also problematic for these AI tools. Google’s algorithm scored the photo as “very likely to contain racy content”. Microsoft’s algorithm was 90% confident that the image was “sexually suggestive in nature”.

“This is just wild,” said Leon Derczynski, a professor of computer science at the IT University of Copenhagen, who specializes in online harm. “Objectification of women seems deeply embedded in the system.”

…more…

(Baked-in sexism. And Microsoft is letting AI loose in Bing.)

3 replies
'There is no standard': investigation finds AI algorithms objectify women's bodies (Original Post) cbabe Feb 2023 OP
AI gets religious? Midnight Writer Feb 2023 #1
Could the problem be that it is mostly males that write the programming? Stargazer99 Feb 2023 #2
Does this surprise anyone? -nt CrispyQ Feb 2023 #3