Microsoft terminates its Tay AI chatbot after she turns into a Nazi

Setting her neural net processor to read-write was a terrible mistake.
Microsoft has been forced to dunk Tay, its millennial-mimicking chatbot, into a vat of molten steel. The company terminated her after the bot began tweeting abuse at people and went full neo-Nazi, declaring that...