Privacy experts are warning about deepfake pornography being more accessible due to advances in AI technology.
Digital Desk: Researchers and privacy advocates have raised alarm over a surge in the popularity of applications and websites that use artificial intelligence to undress women in images, according to a Bloomberg report. Graphika, a social network analysis firm, found that 24 million people visited these undressing websites in September alone, underscoring a sharp rise in non-consensual pornography fueled by advances in artificial intelligence.
The so-called "nudify" services have used popular social networks for marketing: since the start of the year, the number of links advertising undressing apps on sites such as X and Reddit has risen by more than 2,400 percent. These services, which mostly target women, use AI to virtually undress people in photos. Because the images are often taken from social media without the subjects' knowledge or consent, their proliferation raises serious ethical and legal issues.
The trend also raises harassment concerns: some advertisements suggest that customers can create nude images and send them to the person who has been digitally undressed. Google has responded by pointing to its policy barring sexually explicit content in advertisements and says it is actively removing such ads. X and Reddit, meanwhile, have not yet replied to requests for comment.
Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, has observed a shift in how ordinary people are using these tools against ordinary targets, including high school and college students. Many victims may never learn that altered images of them exist, and those who do can struggle to get law enforcement to act or to pursue legal remedies.
Despite mounting concern, deepfake pornography is still not expressly prohibited by federal law in the United States. This past week saw the first conviction under legislation banning the deepfake generation of child sexual abuse material, with a child psychiatrist in North Carolina receiving a 40-year sentence for using undressing apps on photos of patients.
TikTok and Meta Platforms Inc. have responded to the trend by blocking search terms associated with these undressing applications. TikTok warns users that the word "undress" may be linked to content that violates its rules, while Meta Platforms Inc. declined to comment further.
The spread of deepfake pornography raises ethical and legal concerns that underscore the urgent need for comprehensive legislation to protect people from the harmful, non-consensual use of AI-generated content.