Microsoft terminates its Tay AI chatbot after she turns into a Nazi

Setting her neural net processor to read-write was a terrible mistake.
Microsoft has been forced to dunk Tay, its millennial-mimicking chatbot, into a vat of molten steel. The company terminated her after the bot started tweeting abuse at people and went full neo-Nazi, declaring that...