The 2024 Dirty Dozen List

Posted by OUR Rescue
Published on August 14, 2024 | 2 min read

Each year, the National Center on Sexual Exploitation (NCOSE), a leading non-profit organization exposing the links between all forms of sexual abuse and exploitation, publishes the Dirty Dozen List. It identifies 12 “mainstream contributors” that it finds to be facilitating, enabling, or profiting from sexual abuse and exploitation.

Here are the contributors on this year’s list, with excerpts written by NCOSE.

OUR Rescue was not involved in publishing the list.

  • Apple

“A recent Wall Street Journal article reported on a bug in Apple’s parental controls that allowed a child’s device to easily circumvent web restrictions and access pornography, violent images, and drug content.”

  • Cash App

“The anonymity, quick transfer capabilities, and lack of meaningful age and identity verification allow illegal transactions to fester.”

Update: “After being named to the 2024 Dirty Dozen List, Cash App is now hiring an Anti-Human Exploitation and Financial Crimes Program Manager.”

  • Communications Decency Act Section 230

“What once seemed a necessary legislative underpinning for online business to thrive now stands as the greatest opus to shield technology companies from any and all accountability – especially when it comes to the proliferation of sexual exploitation.”

  • Cloudflare

“Between 10–20% of American men have bought someone to use for sex. And tech company Cloudflare provides the internet infrastructure for some of the most prolific prostitution sites – many of which have come under fire for sex trafficking.”

  • Discord

“NCOSE recommends that Discord ban minors from using the platform until it is radically transformed. Discord should also consider banning pornography until substantive age and consent verification for sexually explicit material can be implemented.”

  • LinkedIn

“LinkedIn has a major problem with sexual harassment, with one survey reporting 91% of women received romantic or sexual advances on LinkedIn.”

Update: “After the Daily Mail published an article featuring LinkedIn’s placement on the Dirty Dozen List for allowing promotion of ‘undressing apps’ used to create deepfake pornography, LinkedIn removed nudifying bot ads and articles from the platform.”

  • Meta

“Since October 2023, Attorneys General in all 50 states have taken action to hold Meta accountable. 42 states have taken legal action to sue Meta with 12 lawsuits for perpetuating harm to minors.”

  • Microsoft’s GitHub

“GitHub hosts codes and datasets used to create AI-generated CSAM, which experienced an unprecedented explosion of growth in 2023: evolving from a theoretical threat to a very real and imminent threat across the globe.”

  • Reddit

“Reddit’s policies—including a new policy clarifying cases of image based sexual abuse (sometimes referred to as “revenge porn”) and child safety – while pretty on paper, have not translated into the proactive prevention and removal of these abuses in practice.”

  • Roblox

“Roblox must make the platform safe by default and design, rather than continuously pushing more of the burden onto parents – and young children themselves – to try to monitor a platform with more than 50 million games.”

  • Spotify

“While Spotify did update its policies in October 2023 to prohibit ‘repeatedly targeting a minor to shame or intimidate’ and ‘sharing or re-sharing non-consensual intimate content, as well as threats to distribute or expose such content’ – words on paper mean nothing without enforcement.”

  • Telegram

“Telegram’s glaringly inadequate and/or nonexistent policies, coupled with adamant refusal to moderate content, directly facilitate and exacerbate sexual abuse and exploitation on the platform.”


Learn how each of these mainstream contributors was found to be facilitating, enabling, or profiting from sexual abuse and exploitation by viewing the list. You can also learn more about NCOSE by visiting its website.