Instagram will start testing new tools in a few weeks to combat "sextortion", a form of blackmail involving intimate photographs sent online, according to the BBC.
One of the tools to be tested is "nudity protection", which blurs nude images in direct messages. The protection will be enabled by default for users under 18.
Pop-up windows directing potential victims to support resources will also be tested.
What does “sextortion” mean?
Governments around the world have warned of the growing threat posed to young people by so-called “sextortion”.
For victims, it typically starts with receiving a nude picture. The targeted individual is invited to send a picture in return, then threatened with having that image made public unless they comply with the blackmailer's demands.
Two Nigerian men on trial in one such case pleaded guilty Wednesday to sexually blackmailing US teenagers and young men, one of whom committed suicide.
The suicide of a teenager in Australia has also been linked to this phenomenon.
There are horrifying cases of pedophiles using intimate photos sent online to coerce and abuse children, reports the BBC.
How Instagram is trying to fight a 'horrible' crime
Many of the "sextortion" cases that Instagram has identified on its platform are initiated by sophisticated criminal gangs seeking to make money through blackmail.
It's a “horrific” crime, as the platform has described it. Instagram's nudity protection system, first revealed in January, uses artificial intelligence (AI) that runs entirely on the user's device to detect nude images in direct messages and to give users a choice whether to view them or not.
Instagram stated that this measure is designed “not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images instead”.
The system is optional for adults. It does not automatically report nude images to the company, although users will be reminded that they can block and report accounts if they wish, Instagram points out.
When the system detects that nude images are being sent, the sender will be directed to safety tips, including reminders that recipients may take screenshots or forward images without the sender's knowledge.
The platform has now announced that it will also introduce measures that detect signs that a user might be a potential sex blackmailer and make it harder for them to interact with others.
Message requests from potential "sextortion" accounts will also be routed directly to the recipient's hidden requests folder.
Notices will also be shown to people who are already messaging potential "sextortion" accounts, encouraging them to report any threats to share their intimate photos.
Another measure is to hide the "Message" button when a teen views a potential sextortion account, even if the two accounts are already connected.