New Facebook Tech Set to Curb ‘Revenge Porn’

Credit: Shutterstock

Facebook announced on March 15 that it will take a proactive step to protect users from non-consensual intimate images, using machine learning and AI to detect them.

Antigone Davis, Facebook’s Global Head of Safety, wrote that when someone’s intimate images are shared without their permission, the experience can be devastating.


“To protect victims, it’s long been our policy to remove non-consensual intimate images (sometimes referred to as revenge porn) when they’re reported to us — and in recent years we’ve used photo-matching technology to keep them from being re-shared,” Davis wrote. “To find this content more quickly and better support victims, we’re announcing new detection technology and an online resource hub to help people respond when this abuse occurs.”
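Photo-matching systems like the one Davis describes generally work by computing a perceptual hash of each confirmed image and comparing new uploads against that list. The sketch below illustrates the general technique using the open-source imagehash Python library; Facebook’s actual pipeline is not public, so the function names, storage and distance cutoff here are illustrative assumptions, not its implementation.

    from PIL import Image  # pip install pillow imagehash
    import imagehash

    # Perceptual hashes of images already confirmed as violating (hypothetical store).
    known_hashes = set()

    def register_known_image(path: str) -> None:
        # Record a confirmed image so near-duplicates can be caught later.
        known_hashes.add(imagehash.phash(Image.open(path)))

    def matches_known_image(path: str, max_distance: int = 5) -> bool:
        # Perceptual hashes change only slightly under resizing, re-encoding or
        # small edits, so uploads are compared by Hamming distance rather than
        # exact equality. The cutoff of 5 is an illustrative guess, not a tuned value.
        candidate = imagehash.phash(Image.open(path))
        return any(candidate - known <= max_distance for known in known_hashes)

In practice a blocked upload would simply never reach other users’ feeds; the value of hashing is that the platform can match re-shares without storing or re-inspecting the original image itself.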

Facebook wrote that the technology will be able to find non-consensually shared intimate images before anyone reports them, which the company says is important for two reasons.

“Often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” Davis wrote. “A specially-trained member of our Community Operations team will review the content found by our technology. If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission. We offer an appeals process if someone believes we’ve made a mistake.”
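The workflow Davis outlines, automated detection followed by a trained human reviewer who decides on removal and account action, with an appeals path behind it, can be sketched as a simple triage queue. Everything below, from the classifier score to the threshold and the action names, is hypothetical and meant only to make that described flow concrete.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Upload:
        upload_id: str
        score: float  # hypothetical classifier confidence that the content violates policy

    @dataclass
    class ReviewQueue:
        pending: List[Upload] = field(default_factory=list)

        def triage(self, upload: Upload, threshold: float = 0.9) -> None:
            # Only high-confidence detections are routed to a trained reviewer;
            # in this sketch the model never removes content on its own.
            if upload.score >= threshold:
                self.pending.append(upload)

        def review(self, upload: Upload, violates_standards: bool) -> str:
            # The human decision drives the outcome. Per the post, a violation
            # usually means removal plus disabling the sharing account, and an
            # appeals process exists if the decision was a mistake (not shown).
            self.pending.remove(upload)
            return "remove_and_disable_account" if violates_standards else "no_action"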

The company will also launch a support hub, “Not Without My Consent,” in its Safety Center, which it developed together with experts.

There, victims can find organizations and resources to support them, including steps they can take to have the content removed from the platform and to prevent it from being shared further. The company wrote that it wants to make it easier for victims to report these instances, and it is working with the Revenge Porn Helpline (UK), the Cyber Civil Rights Initiative (US), the Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-yeon (South Korea) to create a victim toolkit.

Radha Iyengar, Head of Product Policy Research, and Karuna Nain, Global Safety Policy Programs Manager, wrote that the sharing of intimate images online can have serious emotional and physical consequences for the people whose photographs were posted.

“Sometimes called ‘revenge porn,’ it’s really a form of sexual violence that can be motivated by an intent to control, shame, humiliate, extort and terrorize victims,” they wrote.

“Discovering intimate images of yourself online when you didn’t consent to them being shared is devastating. That’s why we’ve taken a careful, research-based approach that concentrates on the victims — what they experience and how we can better protect them.”

Facebook has conducted research and studies over the last year to review and improve its response to the sharing of what it calls non-consensual intimate images (NCII) anywhere on Facebook, Messenger or Instagram.

Iyengar wrote that Facebook tried to understand the experience of victims: how they reported what happened, what barriers arose when they made a report, and what support or tools they needed to feel safe on the platform.

“We interviewed victim support advocates and victims themselves from around the world, including Kenya, Denmark and the U.K.,” Iyengar wrote. “Last summer we brought together over 20 academics and non-profit leaders from 10 countries to improve our tools and understanding of how to support victims. This included educational information about NCII, information on where victims can go for help, and psychosocial support for those who had been victimized. For everyone, we instructed them on what precautions people can take on Facebook and other platforms to reduce their chances of being victimized.”

According to Facebook’s study, victims whose images were shared, or who were threatened with sharing, said they felt violated, angry and embarrassed, and feared that their family, friends and co-workers would see the images. Facebook opened a discussion with victims about what it could do to make reporting non-consensual intimate images easier, which surfaced several key themes: build clear, accessible tools to support victims in reporting a violation; develop prevention methods, such as tools to report and proactively block someone from sharing non-consensual images; and give victims the control they need over their online space to feel safe.
