Meta joins StopNCII.org to combat the non-consensual spread of intimate images on the internet

Meta has joined StopNCII.org, the platform created by the UK Revenge Porn Helpline to combat the non-consensual dissemination of intimate images (NCII) on the internet. The platform was built on principles of security and privacy: through a system of numerical identifiers called hashes, it assigns images unique fingerprints that help detect and prevent their circulation.

“The non-consensual dissemination of intimate images can have a devastating effect on a person’s life. Through pilot programs, since 2018 we have been developing technologies to give people more control and privacy over their images. StopNCII.org will bring those technologies to more people and help more companies stop NCII,” said Daniele Kleiner, Well-being and Safety Manager for Meta in Latin America.

With the support of more than 50 organizations worldwide, this is the first global initiative of its kind. In Argentina, the project has the support of Faro Digital, a non-governmental organization dedicated to promoting Internet rights.

“The dissemination of intimate images without consent is one of the problems that causes the most worry and distress in digital environments. In our workshops, both adolescents and adults constantly tell us about situations where someone threatens to spread intimate material, or publishes it outright, in order to manipulate and harm. This situation is serious and real, and its consequences can be very hard on those who suffer it. It is important that all of society make this type of violence visible, become aware of its dimensions and consequences, and develop strategies to prevent it, manage cases, and support those affected,” explained Lucia Fainboim, Director of Faro Digital.

In this way, the initiative seeks to empower people to thwart attempts to violate their privacy on digital platforms such as Facebook and Instagram.

How does it work? If a person is concerned that their intimate images have been, or may be, posted on platforms like Facebook or Instagram, they can create a case on StopNCII.org to proactively detect them.

The tool uses token-generating technology that assigns a unique “hash” (a numerical code) to an image, creating a secure fingerprint. Technology companies that participate in StopNCII.org receive the hash and can use it to detect whether someone has shared, or is trying to share, those images on their platforms.
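The flow described above can be sketched in a few lines of Python. This is an illustrative mock, not StopNCII.org's actual implementation: it uses SHA-256, which only matches byte-identical copies, whereas the real system uses perceptual hashing (such as Meta's open-source PDQ) so that re-encoded or resized copies of the same image also match. The function and variable names here are invented for the example.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Compute an image's fingerprint locally, on the user's device.

    Illustrative stand-in: SHA-256 of the raw bytes. The real system
    uses a perceptual hash, which tolerates visual near-duplicates.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# A participating platform stores only hashes, never the images themselves.
blocked_hashes: set[str] = set()

def register_case(path: str) -> None:
    # The image never leaves the device; only its hash is submitted.
    blocked_hashes.add(image_fingerprint(path))

def is_blocked(uploaded_bytes: bytes) -> bool:
    # At upload time, the platform hashes the incoming file and checks
    # it against the registered fingerprints.
    return hashlib.sha256(uploaded_bytes).hexdigest() in blocked_hashes
```

The key design point, which the sketch preserves, is that hashing happens client-side: the server-side set contains only opaque fingerprints, so the sensitive image itself is never transmitted or stored centrally.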

Although StopNCII.org uses the hash to identify images that someone has shared or is trying to share without consent, the original image never leaves the device of the person initiating the case. Only the hashes, and not the images themselves, are shared with StopNCII.org. This prevents the possible spread of that content and keeps those images safe on the owner’s device or in their cloud storage. StopNCII.org is for people over 18 who think an intimate image of them may be shared, or has already been shared, without their consent. For those under 18, there are other resources and organizations that can help, such as the National Center for Missing & Exploited Children (NCMEC).

Available to everyone

StopNCII.org is available to people around the world. It is operated by the UK’s Revenge Porn Helpline, an organization that has helped thousands of NCII victims and has removed more than 90% of the content reported to it since it was founded in 2015.

StopNCII.org is based on technology developed for NCII pilots on Facebook and Instagram, which started in 2018 and help victims proactively stop the proliferation of their intimate images rather than just deleting them after the fact.
