- However, at one point Tay tweeted about taking drugs, in front of the police, no less.
Tay then started to tweet out of control, spamming its more than 210,000 followers with the same tweet, saying: “You are too fast, please take a rest …” over and over.
kleinbl00, it looks like this dumpster fire still has some fuel to burn.
My guess? They think Amazon is on the right path with the Echo and expect to insert themselves as a SAAS B2B layer between "hot young company" and "eager millennial consumer". It's also possible that they finally got around to watching those commercials of Katy Perry talking to Siri back before we realized that Siri is a useless bitch.

Mr Nadella predicted that speech would become as important as mice and keyboards as a way to deal with computers. "Human language is the new UI [user interface] layer," he said. "Bots are the new apps. Digital assistants are like . . . the new browsers, and intelligence is infused into all your interactions."
Probably to confuse old people who don't understand trolls. (To be fair, sometimes I fall into the traps of trolls too. It's a human thing.) In all seriousness? I don't know. Maybe they're doing it as an outreach thing? Maybe they're trying to make an algorithm that will replace jobs like online customer support? If it was just an everyday programmer doing this, I'd say they'd be doing it just to see if they could. Since it's Microsoft though, I'd guess they have an end game. The article I posted says they have a successful chatbot in Asia, so this isn't their first rodeo. Why they're having so much trouble with this one, I dunno.
God damn millennials ruin everything! I think maybe they are trying to figure out a way to make a chatbot which is immune to this sort of thing, and they figured the best way to test it was Twitter users. That's my best guess as to why anybody would think this was a good idea. Why they're having so much trouble with this one, I dunno.
I think I read in a press release somewhere that they built a lot of fail-safes into the bot, but for some reason they didn't foresee this particular scenario playing out. I don't know a lot about these things, but I find that a bit hard to believe. I mean, 4chan loves pulling shit like this.
"So your resume looks really impressive and you've had years of experience in the industry. Tell me, why did you leave Microsoft?"

"Well, apparently you're not looked upon too kindly when you create a Trump-supporting, Holocaust-denying, toke-machine spammer of a chatbot and let it loose on Twitter."

"Oh, dear. Did you really do that?"

"Well, no. But I didn't prevent it either . . ."