Amazon reported hundreds of thousands of pieces of potential CSAM in AI training data to NCMEC in 2025; child safety officials say Amazon didn't give the source
The tech giant reported hundreds of thousands of cases of suspected child sexual abuse material, but won't say where it came from
An investigator says he reported 26 OnlyFans accounts suspected of containing CSAM to NCMEC and that all of the accounts were removed within a day of his report
US-based child safety group NCMEC, Canada-based C3P, and UK-based IWF say their outreach to Telegram to flag CSAM on the platform has largely been ignored
Telegram's CEO was arrested in connection with an investigation into an unnamed person over claims of “complicity” in distributing child sexual abuse material.
A look at the rise of financial sextortion of minors; the US-based NCMEC received an average of 812 sextortion reports per week between August 2022 and August 2023
A sweeping new report sheds light on how scammers are exploiting kids online for money—and what we all must do to help prevent it.
Meta sent 27M+ reports of suspected CSAM, or 84% of all tips, to the NCMEC in 2022; some prosecutors say the volume of AI-generated tips delays investigations
Stanford researchers: LAION-5B, a dataset of 5B+ images used by Stability AI and others, contains 1,008+ instances of CSAM, possibly helping AI to generate CSAM
Meta backs the NCMEC's Take It Down tool, letting minors anonymously submit a hash of intimate images or videos, to stop sextortion on Facebook and Instagram
Meta plans to roll out the NCMEC's Take It Down tool on Facebook and Instagram, letting minors anonymously submit a hash of intimate content, to stop sextortion
Meta is taking steps to crack down on the spread of intimate images of teenagers on Facebook and Instagram.
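The privacy-preserving point of the hash-based design above is that the image itself never leaves the minor's device: only a fixed-length digest is submitted, and platforms compare that digest against the content users upload. A minimal sketch of the idea, using SHA-256 purely as a stand-in for whatever matching scheme Take It Down actually uses (which, for images, is perceptual rather than cryptographic); the function names and sample bytes here are hypothetical:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """On-device step: derive a digest; only this string is ever submitted."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(upload_bytes: bytes, reported_hashes: set) -> bool:
    """Platform-side step: block an upload whose digest matches a reported hash."""
    return fingerprint(upload_bytes) in reported_hashes

# A minor submits the hash of an image, never the image itself:
reported = {fingerprint(b"\x89PNG...example-bytes")}

# Later, a byte-identical upload matches and can be blocked:
assert should_block(b"\x89PNG...example-bytes", reported)
```

One limitation this sketch makes visible: a cryptographic hash only matches byte-identical files, so production matching systems instead rely on perceptual hashes (such as PhotoDNA or PDQ) that tolerate resizing and re-encoding.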
Craig Federighi says introducing two similar features at the same time, iMessage protections for children and CSAM scanning of iCloud photos, caused confusion