medium.com

When Good Bots Go Bad

Rogue bots?! They’re out there. Remember Microsoft’s chatbot Tay?

Tay went rogue and began swearing, making racist remarks, and posting inflammatory political statements. The flaw in Microsoft’s thinking? People. The system was designed to learn from its users, so it quickly became a reflection of the comments people made to it.

This is a sobering post on the ethics of bots and what can go wrong.

