Instagram Launches Process To Protect Teens From Sextortion Scams
Instagram's rolling out some additional protection measures to combat sextortion scams in the app, while also providing more informational notes to help teens understand the implications of intimate sharing online.
First off, Instagram's launching a new process that will blur DMs which are likely to contain nude images, as detected by its systems.
As you can see in this example, potential nudes will now be blurred by default for users under the age of 18. The process will not only shield users from exposure to such images, but will also include warnings about replying to them, and about sharing nude images of their own.
Which may seem like a no-brainer: if you don't want your nudes to be seen by others, don't share them on IG, or better yet, don't take them at all. But for younger generations, nudes are, for better or worse, a part of how they communicate.
Yeah, I'm old, and it makes no sense to me either. But given that this is now an accepted, and even expected, sharing process in some circles, it makes sense for IG to add more warnings to help protect youngsters, in particular, from exposure.
And as noted, it will also help in sextortion cases:
"This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return."
In addition, Instagram says that it's developing new technology to help identify accounts that may be engaging in sextortion scams, "based on a range of signals that could indicate sextortion behavior". In such cases, Instagram will take action, including reporting users to NCMEC (the National Center for Missing & Exploited Children) where deemed necessary.
Instagram will also display warnings when people go to share nude images in the app.
Instagram's also testing pop-up messages for people who may have interacted with an account that it's removed for sextortion. It's also expanding its partnership with Lantern, a program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors which violate their child safety policies.
The updates build on Instagram's already extensive child protection tools, including its recently added processes to limit exposure to self-harm related content. Teens can, of course, opt out of some of these measures, and Instagram can't be solely responsible for every element of protection and safety in this respect.
Instagram also has its "Family Center" oversight option, so parents can keep tabs on their kids' activity. In combination, there's now a range of options to help keep younger users safe in the app.
You can read more about Instagram's new sextortion protection measures here.