VOICE ARCHIVE

Sarah Jamie Lewis

@sarahjamielewis
40 posts
2022-05-11
If you want a vision of the future, imagine an endless line of do-nothing, jobsworth, bureaucrats demanding you use ever less secure forms of communication - forever.
2022-05-11 View on X
Politico

Leaked proposal: the European Union plans to release a draft law this week that requires tech companies to scan for CSAM and threatens end-to-end encryption

Brussels is bracing for one of its biggest and most emotional tech fights yet as companies face stringent new rules to clamp down on sexual abuse material.

2021-09-05
Took Apple a month from first publishing the proposal to move to “we have decided to take additional time [before rolling this out]”. In a world where so much is forgotten in a 24-hour news cycle, to see a surveillance issue stay relevant for that long is somewhat optimistic.
2021-09-05 View on X
9to5Mac

Apple delays the rollout of recently announced child safety features, says it will take more time to collect feedback from stakeholders and make improvements

Last month, Apple announced a handful of new child safety features that proved to be controversial, including CSAM detection features for iCloud Photos.

2021-08-19
Hashes generated using the instructions / script found here: https://github.com/...
2021-08-19 View on X
VICE

Apple says NeuralHash flaw found by researchers in CSAM detection system was in a generic version of the software and not in the final version it plans to use

Apple said the version of NeuralHash analyzed by researchers is not the final version that will be used for iCloud Photos CSAM detection.

Both these images have NeuralHash: 1e986d5d29ed011a579bfdea Just a reminder that visually similar images are not necessarily semantically similar images. https://twitter.com/...
2021-08-19 View on X

As I said, finding specific (funny) collisions is trivial for perceptual hashes. To be fair to Apple NeuralHash does seem at least somewhat resistant to -random- collisions (over the small set of tens of thousands of images I've thrown at it today...). https://twitter.com/...
2021-08-19 View on X

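The collision point above can be illustrated with a much simpler perceptual hash than Apple's NeuralHash. The sketch below uses a basic average hash ("aHash") on toy 8x8 grayscale grids — my own stand-in, not NeuralHash — to show why visually similar inputs land on the same digest even when they are not the same image.

```python
# Minimal average-hash ("aHash") sketch: a generic perceptual hash,
# NOT Apple's NeuralHash. "Images" here are 8x8 grids of 0-255 values.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is >= the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p >= mean else '0' for p in flat)
    return f"{int(bits, 2):016x}"  # 64 bits -> 16 hex digits

# A toy "image": a bright square on a dark background.
base = [[200 if 2 <= r <= 5 and 2 <= c <= 5 else 30 for c in range(8)]
        for r in range(8)]

# A visually similar copy: uniform brightness shift on every pixel.
similar = [[p + 5 for p in row] for row in base]

print(average_hash(base) == average_hash(similar))  # True: identical hash
```

Because the hash depends only on each pixel's relation to the mean, small global perturbations leave it unchanged — which is the desired robustness, and also exactly why crafted collisions between semantically unrelated images are feasible.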
The Apple system dedupes photos, but burst shots are semantically *different* photos with the same subject - and an unlucky match on a burst shot could lead to multiple match events on the back end if the system isn't implemented to defend against that.
2021-08-19 View on X

2021-08-15
Some quick calculations with the new numbers: 3-4 photos/day: 1 match every 286 days. 50 photos/day: 1 match every 20 days.
2021-08-15 View on X
MacRumors

Apple details its CSAM detection system, says it expects to set a match threshold of 30 known CSAM images before an iCloud account is flagged for manual review

Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week …

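The quick calculations above are consistent with an implied false-match rate of roughly one per 1,000 photos — my inference from the figures quoted, not a number Apple published. A sketch of the arithmetic:

```python
# Reproducing the back-of-the-envelope numbers in the post above.
# Assumption (inferred from the figures, not from Apple): roughly one
# false match per ~1,000 photos scanned.

FALSE_MATCH_RATE = 1 / 1000  # per photo, assumed

def days_per_match(photos_per_day, rate=FALSE_MATCH_RATE):
    """Expected days between false matches at a given upload rate."""
    return 1 / (photos_per_day * rate)

print(round(days_per_match(3.5)))  # 286 days at 3-4 photos/day
print(round(days_per_match(50)))   # 20 days at 50 photos/day
```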
Anyway keep up the pressure. The fact that Apple felt it necessary to do a PR blitz today along with releasing new slivers of information regarding parametrization is a good sign.
2021-08-15 View on X

Also the fact that they gave a single number for the threshold indicates that they are planning to use a single, global threshold. Which will result in worse privacy for heavy-use accounts, and will mean the obfuscation can be trivially broken as I explain in the article.
2021-08-15 View on X

Some more information about NeuralHash too. They state they did not train it on CSAM images (which makes one wonder what they *did* train it on). This 100 million number needs some inspection given that there are billions of images exchanged every day. https://twitter.com/...
2021-08-15 View on X

Apple's new threat model document contains some actual justification for the numbers! (https://www.apple.com/...) They are assuming a 1/100000 false acceptance rate for NeuralHash, which seems incredibly low. And assuming that every photo library is larger than the actual largest one. https://twitter.com/...
2021-08-15 View on X

Apple have given some interviews today where they explicitly state that the threshold t=30. Which means the false acceptance rate is likely an order of magnitude *more* than I calculated in this article. https://twitter.com/...
2021-08-15 View on X

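The interaction between the per-photo rate and the t=30 threshold can be sketched with a Poisson approximation: false matches in a library of n photos have mean n*p, and an account is flagged once t matches accumulate. The rate p=1/100,000 and threshold t=30 are the figures quoted in these posts; the library sizes are my illustrative choices.

```python
# Sketch of the threshold math. False matches in a library of n photos
# are approximately Poisson with mean lam = n * p; an account is flagged
# when t matches accumulate, so P(flagged) ~= P(X >= t).
# p = 1/100,000 and t = 30 come from the posts above; n is illustrative.

from math import exp, factorial

def p_flagged(n_photos, p=1e-5, t=30):
    """Poisson tail P(X >= t) with mean n_photos * p."""
    lam = n_photos * p
    head = sum(exp(-lam) * lam**k / factorial(k) for k in range(t))
    return 1 - head

# At the claimed p, even a million-photo library rarely crosses t=30...
print(p_flagged(1_000_000))            # lam = 10: tiny tail probability
# ...but the conclusion is extremely sensitive to p being right.
print(p_flagged(1_000_000, p=1e-4))    # lam = 100: flagging near-certain
```

This is the crux of the dispute: a single order of magnitude of error in p moves the outcome from "effectively never" to "effectively always" for large libraries.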
In 2017 WhatsApp said they were seeing 4.5 billion photos shared per day. You can't extrapolate a false acceptance rate from 100 million tests. https://twitter.com/...
2021-08-15 View on X

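The objection about extrapolating from 100 million tests can be made precise with the statistical "rule of three" (my framing, not from the posts): zero observed failures in n independent trials still leaves a 95% upper confidence bound of roughly 3/n on the true failure rate.

```python
# "Rule of three": zero failures in n independent trials gives a 95%
# upper confidence bound of about 3/n on the true failure rate.
# n = 100 million tests and 4.5 billion photos/day are from the posts
# above; the framing as a confidence bound is mine.

n_tests = 100_000_000
upper_bound = 3 / n_tests           # ~3e-8 per photo, 95% bound

photos_per_day = 4_500_000_000      # WhatsApp's 2017 figure
worst_case_daily = upper_bound * photos_per_day

print(upper_bound)                  # 3e-08
print(worst_case_daily)             # up to ~135 false matches/day
```

So even a perfect 100-million-image test run cannot, on its own, rule out dozens of false matches per day at global photo-sharing scale.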