Instagram Launches Process To Protect Teens From Sextortion

  • Posted by: Evans Asare

Instagram is rolling out additional protection measures to combat sextortion scams in the app, along with informational notes to help teens understand the risks of sharing intimate images online.

First off, Instagram is launching a new process that will blur images in DMs that its systems detect as likely containing nudity.

Potential nudes will now be blurred by default for users under the age of 18. The process will not only shield users from exposure to such content, but will also include warnings about replying, and about sharing their own nude images.
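As a rough illustration of the policy described above, the logic can be sketched as a simple rule: if a classifier flags an incoming image as likely nudity and the recipient is under 18, blur it by default. This is a minimal hypothetical sketch; the classifier, threshold, and field names are assumptions, not Instagram's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class IncomingImage:
    sender_id: str
    nudity_score: float  # 0.0-1.0 from a hypothetical nudity classifier


def should_blur(image: IncomingImage, recipient_age: int,
                threshold: float = 0.8) -> bool:
    """Blur by default for under-18 recipients when the classifier
    flags the image as likely containing nudity (assumed threshold)."""
    return recipient_age < 18 and image.nudity_score >= threshold


# A flagged image sent to a 16-year-old is blurred by default;
# the same image sent to an adult is not.
img = IncomingImage(sender_id="abc123", nudity_score=0.93)
print(should_blur(img, recipient_age=16))  # True
print(should_blur(img, recipient_age=25))  # False
```

In practice the real system would pair the blur with the in-app warnings Instagram describes, but the rule above captures the default-on behavior for teen accounts.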

Which may seem like a no-brainer: if you don't want your nudes to be seen by others, don't share them on Instagram, or better still, don't take them at all. But for younger generations, nudes are, for better or worse, part of how they communicate.

Yeah, I’m old, and it makes no sense to me either. But given that this is now an accepted and even expected sharing process in some circles, it makes sense for IG to add more warnings to help protect youngsters, in particular, from exposure.

And as noted, it will also help in sextortion cases.

“This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return.”

In addition, Instagram says that it's also developing new technology to help identify accounts that may be engaging in sextortion scams, "based on a range of signals that could indicate sextortion behavior." In such cases, Instagram will take action, including reporting accounts to the National Center for Missing & Exploited Children (NCMEC) where deemed necessary.

Instagram will also display warnings when people share nude images on the app.

Instagram is also testing pop-up messages for people who may have interacted with an account that's been removed for sextortion. It's also expanding its partnership with Lantern, a program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors that violate their child safety policies.

