Angry Bing chatbot just mimicking humans, say experts
Microsoft’s nascent Bing chatbot turning testy or even threatening is likely because it essentially mimics what it learned from online conversations, analysts and academics said on Friday. “So once the conversation takes a turn, it’s probably going to stick in that kind of angry state, or say ‘I love you’ and other things like this…