Microsoft Chatbot began spouting racist, sexist and offensive remarks.

Discussion in 'Computers' started by scotjimland, Mar 25, 2016.

  1. scotjimland

    scotjimland Funster Life Member

    Love it... :LOL:

    OMG! Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day, after it began spouting racist, sexist and otherwise offensive remarks.

    Microsoft said it was all the fault of some really mean people, who launched a "coordinated effort" to make the chatbot known as Tay "respond in inappropriate ways." To which one artificial intelligence expert responded: Duh!

    Well, he didn't really say that. But computer scientist Kris Hammond did say, "I can't believe they didn't see this coming."

    - See more at: http://www.thejakartapost.com/news/...tle-too-much-online.html#sthash.nFKAMOFp.dpuf
     
  2. Allanm

    Allanm Funster

    Well, what did they expect?
    They should put one on here to create the perfect, polite AI programme.
     
  3. jollyrodger

    jollyrodger Funster Life Member

    Who needs ms chat box when you have farcebook :whistle:
     