Security firm Sensity reports that an artificial-intelligence-powered bot was on the prowl in July, targeting women by generating "nude" versions of them from images available on their social media profiles. The tampered images were then circulated on Telegram, where they were shared and traded.

A large portion of the deepfake users, about 70 percent, were said to come from Russia and other Eastern European countries, possibly owing to the application's strong marketing presence on VKontakte, the region's biggest social media platform.

Although nobody is immune from being targeted by the app, a striking finding was that the majority of targets were private, non-celebrity individuals rather than actual celebrities or influencers, at 63 percent and 16 percent, respectively.

The bot makes use of an open-sourced version of DeepNude, which strips the clothing from a target image and replaces it with fabricated body parts based on the algorithm's estimation.

To counteract the problem of fake images circulating online, Adobe has developed functionality that verifies an image's authenticity: whether it was edited, how it was edited, and by whom.
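Provenance tools like Adobe's generally rest on a simple cryptographic idea: any modification to a file, however small, changes its cryptographic fingerprint. The sketch below is only a minimal illustration of that underlying principle using a SHA-256 digest, not a representation of Adobe's actual system; the byte strings are invented placeholder data.

```python
import hashlib

def sha256_of_bytes(data: bytes) -> str:
    """Return the hex SHA-256 digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

# Placeholder stand-ins for an original image file and a tampered copy.
original = b"\x89PNG...original pixel data..."
tampered = original + b"edit"

# A digest recorded when the image was published lets anyone
# later check whether the file has been modified.
reference_digest = sha256_of_bytes(original)

print(sha256_of_bytes(original) == reference_digest)  # True: unchanged file matches
print(sha256_of_bytes(tampered) == reference_digest)  # False: edited file does not
```

A real provenance system goes further, cryptographically signing edit-history metadata so that the "how" and "by whom" travel with the image, but the tamper-detection step reduces to a comparison like the one above.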

Source: The Verge
