Meta’s ‘Take It Down’ Tool Helps Minors Remove Explicit Photos From Web

(Bloomberg) — Meta Platforms Inc. has funded the creation of a tool that helps minors remove nude, partially nude or sexually explicit images of themselves from the internet in partnership with the National Center for Missing & Exploited Children, a nonprofit child-protection group.

Take It Down is a free service, announced Monday, that lets minors or their parents anonymously submit photos or videos they fear might be uploaded to the internet or that have already been distributed online. The photos can be submitted to a web-based tool that will convert the images into digital fingerprints known as hashes, which will be sent to NCMEC and shared with platforms. The social media sites will use hash-matching technology to find and block any attempts to upload the original images.

Participating platforms include Facebook, Instagram, Pornhub, OnlyFans and Yubo.

The tool has been designed to combat the rising problem of ‘sextortion,’ where children are coerced or deceived into sharing intimate images with another person online, then threatened or blackmailed with the prospect of having those images published on the web. Some offenders are motivated to extract even more explicit images from the child, while others seek money. Reports of sextortion to the nonprofit center, whose CyberTipline acts as a clearinghouse for child sexual abuse material distributed online, doubled from 2019 to 2021. And teenage boys have become the most common targets.

The introduction of the tool comes as Big Tech companies are facing increasing pressure from lawmakers in the US to better protect children online, with several child-focused safety bills looming. These follow the UK’s Online Safety Bill and an EU proposal that would require companies to detect grooming and child sexual abuse material. 

“Being threatened in this way puts people in a very vulnerable position and can have devastating consequences for them,” said Antigone Davis, Meta’s global head of safety. The tool gives some control back to people in a situation where they can otherwise feel desperate and helpless, Davis said.

Davis didn’t specify how much Meta invested in developing the tool, but the company donated more than $2 million in cash and in-kind gifts to the children’s center in 2021, making it one of the largest donors. 

While many of the images will be considered child sexual exploitation material, which means online platforms already have an obligation to remove them once reported, some of the images may not meet that threshold. Still, the distribution of those photos can be extremely traumatizing for the subject. 

Take It Down hashes the images in the browser, so they never leave the device of the child or parent. If the extortionist tries to upload the original images, the platform’s hash-matching technology will detect a match and send the newly uploaded image to a content moderator for review. This step ensures the photo actually violates the platform’s policies and that the tool isn’t being misused to remove other types of images, for example an unflattering but fully clothed picture of a minor.
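The matching flow described above can be illustrated with a minimal sketch. This is not Take It Down’s actual implementation: services like this typically rely on robust perceptual hashes (such as PDQ or PhotoDNA) that survive resizing and re-encoding, whereas the standard-library SHA-256 digest below only matches byte-identical files. The function names and the blocklist are hypothetical.

```python
import hashlib

def hash_image_bytes(data: bytes) -> str:
    """Fingerprint an image locally; only the digest is shared, never the photo.

    SHA-256 stands in here for the perceptual hashes real systems use,
    purely to illustrate the reporting-and-matching flow.
    """
    return hashlib.sha256(data).hexdigest()

# Hypothetical set of hashes distributed by NCMEC to participating platforms.
blocklist = {hash_image_bytes(b"reported-image-bytes")}

def should_flag_upload(upload: bytes) -> bool:
    # A match does not auto-delete the upload; it routes the image to a
    # human moderator, as the article describes.
    return hash_image_bytes(upload) in blocklist
```

Because only the fingerprint leaves the device, the minor never has to transmit the image itself to anyone, which is the privacy property the in-browser hashing is designed to provide.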

Meta said it will ingest new hashes multiple times a day, so it can block images quickly. Other companies may ingest them daily or weekly, Davis said, depending on resources.

“This is an incredibly good new tool but it’s not a panacea,” Davis said, adding that preventing grooming and sextortion requires a range of technical measures that restrict online interactions between minors and adults they don’t know.

In addition to the launch partners, NCMEC is working with about half a dozen other companies that plan to start scanning for hashed images in the coming months, said Gavin Portnoy, a spokesperson for the children’s center.

“With some positive peer pressure, I’m sure we’ll have more,” he added. 

Take It Down follows the launch of a similar tool, StopNCII.org, also backed by Meta, designed to stop the distribution of adult non-consensual intimate imagery, known as ‘revenge porn.’ 

©2023 Bloomberg L.P.