General Discussion
'There is no standard': investigation finds AI algorithms objectify women's bodies
https://www.theguardian.com/technology/2023/feb/08/biased-ai-algorithms-racy-women-bodies
Guardian exclusive: AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved
by Gianluca Mauro and Hilke Schellmann
Wed 8 Feb 2023 06.00 EST
Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women's bodies.
Two Guardian journalists used the AI tools to analyze hundreds of photos of men and women in underwear, working out and undergoing medical tests with partial nudity, and found evidence that the AI tags photos of women in everyday situations as sexually suggestive. The tools also rate pictures of women as more "racy" or sexually suggestive than comparable pictures of men. As a result, the social media companies that leverage these or similar algorithms have suppressed the reach of countless images featuring women's bodies and hurt female-led businesses, further amplifying societal disparities.
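(The article doesn't publish the journalists' code, but the kind of query they describe can be run against the same public APIs. Here is a minimal sketch using Google Cloud Vision's SafeSearch annotation, which produces the "racy" ratings mentioned above. It assumes Google Cloud credentials are already configured, and "photo.jpg" is a hypothetical filename:)

    # Sketch: rate one image for "raciness" with Google Cloud Vision SafeSearch.
    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "photo.jpg" is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # SafeSearch returns likelihood buckets (VERY_UNLIKELY .. VERY_LIKELY),
    # not percentages.
    annotation = client.safe_search_detection(image=image).safe_search_annotation

    print("racy:   ", vision.Likelihood(annotation.racy).name)
    print("adult:  ", vision.Likelihood(annotation.adult).name)
    print("medical:", vision.Likelihood(annotation.medical).name)

(Repeating that call for matched photos of men and women and comparing the "racy" likelihoods is essentially the experiment the reporters describe.)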
Even medical pictures are affected by the issue. The AI algorithms were tested on images released by the US National Cancer Institute demonstrating how to do a clinical breast examination. Google's AI gave this photo the highest score for raciness, Microsoft's AI was 82% confident that the image was "explicitly sexual in nature," and Amazon classified it as representing "explicit nudity."
Pregnant bellies are also problematic for these AI tools. Google's algorithm scored a photo of a pregnant belly as "very likely" to contain racy content. Microsoft's algorithm was 90% confident that the image was "sexually suggestive in nature."
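(For comparison, a minimal sketch of the same kind of query against Amazon Rekognition, whose moderation labels include the "Explicit Nudity" category cited above. AWS credentials and the "photo.jpg" filename are again assumptions:)

    # Sketch: request content-moderation labels for one image from Amazon Rekognition.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_moderation_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # only return labels at >= 50% confidence
    )

    # Each label has a name, a parent category, and a confidence percentage,
    # e.g. "Explicit Nudity" at some confidence score.
    for label in response["ModerationLabels"]:
        print(label["Name"], label.get("ParentName", ""),
              f'{label["Confidence"]:.1f}%')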
"This is just wild," said Leon Derczynski, a professor of computer science at the IT University of Copenhagen, who specializes in online harm. "Objectification of women seems deeply embedded in the system."
more
(Baked-in sexism. And Microsoft is letting AI loose in Bing.)
'There is no standard': investigation finds AI algorithms objectify women's bodies (Original Post) by cbabe, Feb 2023
Midnight Writer (21,768 posts)
1. AI gets religious?
Stargazer99 (2,585 posts)
2. Could the problem be that it is mostly males who write the programming?
CrispyQ (36,470 posts)
3. Does this surprise anyone? -nt