Deepfake K-Pop porn, woke Grok, ‘OpenAI has a problem,’ Fetch.AI: AI Eye


Source: Cointelegraph

AI Eye: 98% of deepfakes are porn — mostly K-pop stars — Grok is no edgelord, and Fetch.AI boss says OpenAI needs to “drastically change.”

AI image generation has become outrageously good in the past 12 months … and some people (mostly men) are increasingly using the tech to create homemade deepfake porn of people they fantasize about, based on pics culled from social media.

The subjects hate it, of course, and the practice has been banned in the United Kingdom. However, there is no federal law that outlaws creating deepfakes without consent in the United States.

Face-swapping mobile apps like Reface make it simple to graft a picture of someone's face onto existing porn images and videos. AI tools like DeepNude and Nudeify create a realistic rendering of what the AI tool thinks someone looks like nude. The NSFW AI art generator can even crank out anime porn deepfakes for $9.99 a month.

According to social network analytics company Graphika, there were 24 million visits to websites in this genre in September alone. "You can create something that actually looks realistic," analyst Santiago Lakatos explains.

Such apps and sites are mainly advertised on social media platforms, which are slowly starting to take action. Reddit prohibits the nonconsensual sharing of faked explicit images and has banned several domains, while TikTok and Meta have banned searches for keywords relating to "undress."

Around 98% of all deepfake vids are porn, according to a report by Home Security Heroes. We can't show you any of them, so here's one of Biden, Boris Johnson and Macron krumping.

Author: Andrew Fenton