West Virginia's AG sues Apple for allegedly violating consumer protection law by not implementing tools like PhotoDNA to detect CSAM stored and shared on iCloud
The state's attorney general said in a lawsuit filed on Thursday that the company declined to use tools that recognize the material stored on iCloud.
Stanford researchers: LAION-5B, a dataset of 5B+ images used by Stability AI and others, contains 1,008+ instances of CSAM, possibly helping AI to generate CSAM
most prominently, Stable Diffusion 1.5—to see to what degree CSAM itself might be present in the training data. https://purl.stanford.edu/...
Q&A with Discord VP of Trust and Safety John Redgrave on the content moderation challenges in the age of generative AI, exploring audio and video E2EE, and more
and easier - an interview with Discord's head of trust and safety (via @semafor) https://www.semafor.com/...
Stanford researchers: Twitter didn't stop uploads of 40+ known child sexual abuse images in recent months; the issue seemed fixed in May after staff were told
Social-media platform has now improved its detection system, Stanford Internet Observatory was told
Apple made an unforced error by trying to tackle CSAM and child safety issues inside the Apple Park vacuum while adhering to its annual iOS release schedule
Hello friends, and welcome back to Week in Review. — Last week, we dove into the truly bizarre machinations of the NFT market.
Tech companies have been failing to deal with child abuse images for years, and they still don't have a common standard for identifying illegal video content
How PhotoDNA Works — The uploaded image — in this instance a photograph of Dr. Farid — is turned into a square and colors are removed …
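The preprocessing described above — squaring the image and discarding color before computing a signature — can be sketched with a simple perceptual average-hash. This is an illustrative stand-in, not the actual (proprietary) PhotoDNA algorithm; all function names here are hypothetical, and images are represented as plain 2D lists of RGB tuples to keep the example dependency-free.

```python
# Illustrative sketch of PhotoDNA-style preprocessing (NOT the real,
# proprietary PhotoDNA algorithm): remove color, reduce the image to a
# small fixed square, and derive a compact signature that is robust to
# small changes. A simple average-hash stands in for the real signature.

def to_grayscale(pixels):
    """2D list of (r, g, b) tuples -> 2D list of luma values."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in pixels]

def downscale_square(gray, size=8):
    """Average-pool a grayscale image down to a size x size square."""
    h, w = len(gray), len(gray[0])
    out = []
    for i in range(size):
        row = []
        for j in range(size):
            # Block of source pixels feeding this output cell.
            y0, y1 = i * h // size, max((i + 1) * h // size, i * h // size + 1)
            x0, x1 = j * w // size, max((j + 1) * w // size, j * w // size + 1)
            block = [gray[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

def average_hash(pixels, size=8):
    """64-bit signature: one bit per cell, set if brighter than the mean."""
    small = downscale_square(to_grayscale(pixels), size)
    flat = [v for row in small for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits
```

Two copies of an image that differ only slightly (recompression, minor edits) produce signatures at a small Hamming distance, which is why matching is done on the signature rather than on an exact file hash.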