Microsoft Chatbot began spouting racist, sexist and offensive remarks.

scotjimland

LIFE MEMBER
Jul 25, 2007
2,309
9,978
Funster No
15
MH
A Woosh bang
Love it... :LOL:

OMG! Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day, after it began spouting racist, sexist and otherwise offensive remarks.

Microsoft said it was all the fault of some really mean people, who launched a "coordinated effort" to make the chatbot known as Tay "respond in inappropriate ways." To which one artificial intelligence expert responded: Duh!

Well, he didn't really say that. But computer scientist Kris Hammond did say, "I can't believe they didn't see this coming."

- See more at: http://www.thejakartapost.com/news/...tle-too-much-online.html#sthash.nFKAMOFp.dpuf

Allanm

Free Member
Jun 30, 2013
5,431
9,192
Cotes d'armor, France
Funster No
26,730
MH
Burstner Harmony TI 736 G
Exp
Since 1987
Well, what did they expect?
They should put one on here to create the perfect, polite AI programme.
 