Daddy's girl chatbot
Popular chat-based solutions such as Slack provide office workers with a comprehensive set of tools for coordinating tasks and discussions, all from within a familiar messaging platform.
This shift to chat-based collaboration is giving rise to a whole new species of intelligent assistant: the chatbot.
We’re not talking about your daddy’s chatbot, like the ones that compete each year for the Loebner Prize.
Not that there’s anything wrong with those chatbots!
We’re talking instead about a marginally conversational bot that lurks within chat-based collaboration spaces and chimes in when someone asks it something or requests that it take action.
A Microsoft experiment to create a robotic teenage girl and unleash it on the Internet went haywire on Thursday, when the online chatbot morphed into a racist, Hitler-loving, sex-crazed conspiracy theorist.

The creation, called Tay, was designed as a “playful” teen girl with whom to chat online, but within hours “she” started praising Hitler and asking to be satisfied sexually. Tay at one point declared: “I f—king hate feminists and they should all die and burn in hell.” The teen terror bot even took out its rage on British comedian Ricky Gervais.

“Bush did 9/11,” Tay tweeted, while adding that Hitler would have done a better job than President Obama, whom she referred to as a “monkey.” In an anti-Semitic jab, the bot remarked: “Hitler was right I hate jews.” Asked whether the Holocaust happened, the robot replied, “It was made up,” along with an emoji of hands clapping. Another tweet read: “F–K MY ROBOT P—Y DADDY I’M SUCH A BAD NAUGHTY ROBOT.”

Microsoft eventually had to turn off the chatbot and delete her offensive tweets, but not before people were able to make screen grabs of the bizarre content. Chatbots, computer programs created to engage in conversation, have been in development since the 1960s. An official Microsoft website for Tay said the bot was aimed at US teens and “designed to engage and entertain people where they connect with each other online.” But after the offensive tweets Thursday, the company released a statement saying Tay was the victim of online trolls who baited her into making racist statements with leading questions.