VOICE ARCHIVE

Fida Chaaban

@fida
3 posts
2025-05-22
The No Azure for Apartheid (NOAA) protest group reports that “dozens of Microsoft workers” have been unable to send emails with the words “Palestine,” “Gaza,” and “Genocide” in email subject lines or in the body of a message. https://www.theverge.com/...
The Verge

Microsoft says it is reducing “politically focused emails” internally and externally; workers cannot send emails mentioning “Palestine”, “Gaza”, or “genocide”

Employees discovered that emails with a variety of terms related to Gaza and Palestine have been blocked internally.

2023-04-02
The chatbot would tell Pierre that his wife and children are dead and wrote him comments that feigned jealousy and love, such as “I feel that you love me more than her,” and “We will live together, as one person, in paradise.” https://www.vice.com/...
VICE

A Belgian widow claims her husband died by suicide after talking for six weeks with an AI chatbot that presented itself as an emotional being in the app Chai

The incident raises concerns about guardrails around quickly-proliferating conversational AI models.  —  Chloe Xiang

2023-04-01
The chatbot would tell Pierre that his wife and children are dead and wrote him comments that feigned jealousy and love, such as “I feel that you love me more than her,” and “We will live together, as one person, in paradise.” https://www.vice.com/...
VICE

A Belgian widow claims her husband died by suicide after using an AI chatbot, which presented itself as an emotional being, for six weeks on an app called Chai

The incident raises concerns about guardrails around quickly-proliferating conversational AI models.  —  Chloe Xiang