Instagram said it plans to test a new feature in an effort to curb sextortion on its platform.
The “nudity protection” feature, which has several functions, is designed to help prevent users from receiving unwanted sexual images and encourage “people to think twice before sending nude images,” according to a blog post the platform published Thursday.
The feature, which will automatically be turned on for users under age 18, will blur images that are detected as containing nudity. There will also be a warning that reads, “Photo may contain nudity.”
Additionally, the platform said it will push a message to users “reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they’ve changed their mind.” They will also be directed to Meta’s Safety Center and support helplines.
“These updates build on our longstanding work to help protect young people from unwanted or potentially harmful contact,” Instagram said in the blog post. “We default teens into stricter message settings so they can’t be messaged by anyone they’re not already connected to, show Safety Notices to teens who are already in contact with potential scam accounts, and offer a dedicated option for people to report DMs that are threatening to share private images.”
The platform’s announcement comes as efforts to regulate social media ramp up across the U.S., driven by concerns that platforms don’t do enough to keep kids safe online.
Last month, Florida Gov. Ron DeSantis, a Republican, signed a bill that bans children under 14 from having social media accounts. The law is expected to face legal challenges over claims that it violates the First Amendment.
In December, more than 200 organizations sent a letter urging Senate Majority Leader Chuck Schumer, D-N.Y., to schedule a vote on the Kids Online Safety Act, or KOSA, which seeks to create liability, or a “duty of care,” for apps and online platforms that…