Facebook will soon roll out new tools designed to prevent the sharing of images, videos and other content that includes child sexual abuse material (CSAM) on its platform.
These tools will warn users who attempt to share images containing potential CSAM, and will show a notification to those searching for such content.
“We don’t allow instances of child sexual abuse or the use of our platform for inappropriate interactions with minors. We actually go the extra mile. Say when parents or grandparents sometimes share innocent pictures of their children or grandchildren in the bathtub, we don’t allow such content. We want to make sure that given the social nature of our platform we want to reduce the room for misuse as much as possible,” explained Karuna Nain, Director, Global Safety Policy at Facebook.
Users who search for CSAM will see a pop-up offering them help from offender diversion organisations and warning them of the consequences of viewing illegal content.
There will also be a safety alert for users who attempt to share a viral meme containing child exploitative content. This notification will warn them that such content causes harm, violates the company's policies, and can carry legal consequences.