Online child sex abuse spiked by 31% in 2020, with at least 13 million disturbing images on Facebook and Instagram

One day Facebook and other social media companies will go after child sexual abuse imagery on their platforms and remove it the way they remove conservatives.

Melissa, a survivor of sexual abuse whose last name was not disclosed, speaks at the National Center for Missing and Exploited Children August 2, 2010 in Alexandria, Virginia.

  • There was a sharp increase in child sexual abuse imagery online in 2020, data shows.
  • Facebook said it detected 13 million images from July to September alone. 
  • Coronavirus lockdowns and livestreamed abuse fueled the increase, an expert told Insider. 

There was a sharp increase in child sex abuse imagery being posted and shared online during the coronavirus pandemic, much of it hosted on Facebook and Instagram, according to data shared exclusively with Insider. 

Figures from the National Center for Missing and Exploited Children (NCMEC) showed a 31% increase in the number of images of child sexual abuse reported to it in 2020.

The figure was up by around 5 million, from 16 million reports in 2019 to 21 million in 2020, said Yiota Souras, general counsel at the NCMEC.

In 2019, Facebook recorded more child sexual abuse material than any other tech company, accounting for around 94% of all reports to the NCMEC.

Though a breakdown of the 2020 NCMEC figures is not yet available, Facebook said that it detected 13 million images on Facebook and Instagram from July to September alone. The figure indicates that the problem is still rampant, and may be worsening.

The vast majority of material is hosted on Facebook’s platforms

The NCMEC data comes from its CyberTipLine, which collects reports of child abuse images, videos, and other material found online.

Some reports are one-offs from members of the public. Others are submitted in bulk by tech companies that have agreed to help fight the proliferation of sex abuse imagery on their platforms.

The NCMEC shares the data with law enforcement agencies to find those who make and share the material; both acts are crimes.

Souras said that the NCMEC expects Facebook to account for a similar share of the material reported in 2020, a product both of the scale of its platforms and of its proactive efforts to find and remove such material.

Google recorded the second-highest number in 2019, with around 450,000 reports. The full 2020 dataset is due to be published in February.

In a statement to Insider, a Google spokesperson said that the company uses “cutting-edge technology, supported by specialized teams of human reviewers, to detect, remove, and report such content to authorities.”

By Tom Porter

Read Full Article on BusinessInsider.nl

Coronavirus: Facebook admits pandemic meant child nudity spread on Instagram

The company has admitted that it failed to detect images of child nudity due to content moderators working from home.

Facebook has admitted that the coronavirus pandemic meant that images of child nudity and sexual exploitation have been spreading on its platforms.

The tech giant said moderation levels dropped when content moderators were sent to work from home in March during the height of the COVID-19 outbreak.

Harmful material on Facebook and Instagram involving child nudity and sexual exploitation, as well as suicide and self-harm, was not caught by the company’s automated systems.

“While our technology for identifying and removing violating content is improving, there will continue to be areas where we rely on people to both review content and train our technology,” Facebook said.

“For example, we rely heavily on people to review suicide and self-injury and child exploitative content, and help improve the technology that proactively finds and removes identical or near-identical content that violates these policies.

“With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram.”

Investigations into Facebook’s content moderation have found that moderators are being exposed to traumatic material without receiving adequate support.

Read Original Article on News.Sky.com
