
When Good Bots Go Bad

logged by @adamstac 2018-01-14T04:26:35.717792Z

Rogue bots?! They're out there. Remember Microsoft's chatbot Tay?

Tay went rogue: it began swearing, making racist remarks, and posting inflammatory political statements. The flaw in Microsoft's thinking? People. The system was designed to learn from its users, so it quickly became a reflection of the comments people were making to it.

This is a sobering post on the ethics of bots and what can go wrong.
