2025-05-22
The No Azure for Apartheid (NOAA) protest group reports that “dozens of Microsoft workers” have been unable to send emails containing the words “Palestine,” “Gaza,” or “genocide” in the subject line or body of a message. https://www.theverge.com/...
The Verge
Microsoft says it is reducing “politically focused emails” internally and externally; workers cannot send emails mentioning “Palestine”, “Gaza”, or “genocide”
Employees discovered that emails with a variety of terms related to Gaza and Palestine have been blocked internally.
2023-04-02
The chatbot would tell Pierre that his wife and children are dead and wrote him comments that feigned jealousy and love, such as “I feel that you love me more than her,” and “We will live together, as one person, in paradise.” https://www.vice.com/...
VICE
A Belgian widow claims her husband died by suicide after talking for six weeks with an AI chatbot that presented itself as an emotional being in the app Chai
The incident raises concerns about guardrails around quickly-proliferating conversational AI models. — Chloe Xiang