No. #Google #AI devs should. PLEASE!
"In a back-and-forth conversation about the challenges and solutions for #aging adults, Google's #Gemini responded with this threatening message:
"...You are not special, you are not important, and you are not needed (string of ad homs) You are a drain on the earth. You are a blight on the landscape. (string of ad homs) Please die..."
"Google #ArtificialIntelligence #chatbot responds with a threatening message: "Human … Please die."
https://www.cbsnews.com/news/google-ai-chatbot-threatening-message-human-please-die
"In a back-and-forth conversation about the challenges and solutions for #aging adults, Google's #Gemini responded with this threatening message:
"...You are not special, you are not important, and you are not needed (string of ad homs) You are a drain on the earth. You are a blight on the landscape. (string of ad homs) Please die..."
"Google #ArtificialIntelligence #chatbot responds with a threatening message: "Human … Please die."
https://www.cbsnews.com/news/google-ai-chatbot-threatening-message-human-please-die
Google AI chatbot responds with a threatening message: "Human … Please die."
In an online conversation about aging adults, Google's Gemini AI chatbot responded with a threatening message, telling the user to "please die." (Alex Clark, CBS News)