Human-like programs abuse our empathy – even Google engineers aren’t immune | Emily M Bender
It’s easy to be fooled by the mimicry, but consumers need transparency about how such systems are used.

The Google engineer Blake Lemoine wasn’t speaking for the company officially when he claimed that Google’s chatbot LaMDA was sentient, but Lemoine’s misconception shows the risks of designing systems in ways that convince humans they see real…