FBI warns that criminals can use AI to create fake sexual images for blackmail
The FBI has warned that artificial intelligence (AI) technology has enabled malicious actors to take non-consenting people's personal photos and videos and insert their likenesses into pornographic deepfake images in order to harass or sextort them.
Minor children and adults have had their images taken from their video chats, social media accounts, or the open internet and then incorporated into sexually explicit content through the aid of AI technology, the FBI wrote in a Monday announcement.
These images are then shared on social media and pornographic websites or are sometimes sent directly to victims' families or social media contacts, unless the victims pay a ransom or agree to other actions, like supplying personal information or other real-life explicit images.
Some victims may not realize that they appear in these digitally manipulated images until someone else first sees them online. Even if victims comply with an extortionist's demands, it can be difficult to completely remove these images from the internet. In fact, the images may be continually re-shared for years to come, newly embarrassing and re-victimizing people again and again, something that can result in post-traumatic stress disorder (PTSD) and continued financial exploitation.
MORE
https://www.lgbtqnation.com/2023/06/fbi-warns-that-criminals-can-use-ai-to-create-fake-sexual-images-for-blackmail/
FBI PSA HERE
https://democraticunderground.com/100217995442
2 replies
FBI warns that criminals can use AI to create fake sexual images for blackmail (Original Post)
icymist
Jun 2023
OP
Phil01
(189 posts)
1. Only criminals do this, not national governments? Also you don't need AI for this.
C_U_L8R
(49,112 posts)
2. Have they ever done an image search?
Who needs AI? Poor celebrities have had their likenesses pasted into sex scenes for years.
