Microsoft’s Bing chatbot has been returning some unhinged and threatening responses to users. The company has now updated the bot with three new conversation modes that aim to fix the issue by letting users choose how creative, or how restrained, the AI’s responses are.