Posts: 1,070
Threads: 84
Thanks Received: 133 in 104 posts
Thanks Given: 16
Joined: Mar 2015
Reputation: 1
03-24-2016, 04:46 PM
(This post was last modified: 03-24-2016, 07:46 PM by WeaponTheory.)
Quote: Microsoft launched a smart chat bot Wednesday called “Tay.” It looks like a photograph of a teenage girl rendered on a broken computer monitor, and it can communicate with people via Twitter, Kik and GroupMe. It’s supposed to talk like a millennial teenage girl.
http://www.huffingtonpost.com/entry/micr...4c37615502
My question is....
Fucking...WHY?!
What a fucking waste of programming, engineering and talent.
All to make some almost "Her" shit.
The cause http://fusion.net/story/284617/8chan-mic...ay-racist/
https://twitter.com/tayandyou/
https://www.tay.ai/
"Tay is targeted at 18 to 24 year olds in the US."
"Who am I to tell you something that you already know?
Who am I to tell you 'Hold on' when you wanna let go?
Who am I? I'm just a sicko with a song in my head and it keeps playing again and again and again and again."
https://youtu.be/bdJ7xe70ck0
Posts: 176
Threads: 13
Thanks Received: 15 in 13 posts
Thanks Given: 8
Joined: Nov 2015
Reputation: 0
Posts: 341
Threads: 16
Thanks Received: 48 in 40 posts
Thanks Given: 34
Joined: Sep 2015
Reputation: 6
https://blogs.microsoft.com/blog/2016/03...roduction/
Apparently the bot worked great in China, but once unleashed on Twitter, everything went f#@$% thanks to trolls.
I personally find AI fascinating, so I hope to see more attempts at it. Hopefully Microsoft doesn't bail out of AI because of this incident. It's their money, they can spend it on whatever R&D they want.
Posts: 1,070
Threads: 84
Thanks Received: 133 in 104 posts
Thanks Given: 16
Joined: Mar 2015
Reputation: 1
03-27-2016, 11:56 PM
(This post was last modified: 03-28-2016, 12:00 AM by WeaponTheory.)
(03-27-2016, 11:32 PM)Axess Wrote: https://blogs.microsoft.com/blog/2016/03...roduction/
Apparently the bot worked great in China, but once unleashed on Twitter, everything went f#@$% thanks to trolls.
More like people thinking outside the box. If anything, this was a free beta test for Microsoft, courtesy of the chan sites. My God, all that money, and you can't even hire testers for this shit?
Posts: 341
Threads: 16
Thanks Received: 48 in 40 posts
Thanks Given: 34
Joined: Sep 2015
Reputation: 6
(03-27-2016, 11:56 PM)WeaponTheory Wrote: (03-27-2016, 11:32 PM)Axess Wrote: https://blogs.microsoft.com/blog/2016/03...roduction/
Apparently the bot worked great in China, but once unleashed on Twitter, everything went f#@$% thanks to trolls.
More like people thinking outside the box. If anything, this was a free beta test for Microsoft, courtesy of the chan sites. My God, all that money, and you can't even hire testers for this shit?
I guess trolling can count as thinking outside the box. Sure.
Also, you seriously think Microsoft didn't test it? lol, it's Micro$oft.
This isn't the first time something has gone wrong thanks to outside abuse. Even with M$ testing, there's only so much you can do with internal tests, so they expanded testing to the public. So yes, this was more like a beta, as explained in the blog post I linked in my previous reply.
Microsoft Blog Wrote: As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups. We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience. Once we got comfortable with how Tay was interacting with users, we wanted to invite a broader group of people to engage with her. It’s through increased interaction where we expected to learn more and for the AI to get better and better.
The logical place for us to engage with a massive group of users was Twitter. Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay. Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time. We will take this lesson forward as well as those from our experiences in China, Japan and the U.S. Right now, we are hard at work addressing the specific vulnerability that was exposed by the attack on Tay.
Posts: 1,070
Threads: 84
Thanks Received: 133 in 104 posts
Thanks Given: 16
Joined: Mar 2015
Reputation: 1
If I had to guess, Microsoft's idea of "testing" the bot was probably feeding it information by either talking to it like it's five years old or treating it like a personal girlfriend. All it took was people with the simple mindset of throwing racism at it to make the bot shit the bed. Their testing sucked dick, chief.
Posts: 341
Threads: 16
Thanks Received: 48 in 40 posts
Thanks Given: 34
Joined: Sep 2015
Reputation: 6
03-28-2016, 03:50 AM
(This post was last modified: 03-28-2016, 03:51 AM by Axess.)
Eh, they didn't account for immature trolls, I guess. Again, the bot ran for over a year in China without this kind of issue, but no one acknowledges that part.
Since the AI isn't open source, there's no way for us to really know how the bot was abused and exploited. It's more than just sending it racist messages. If that's really all it took, then yeah, M$ testing sucked, but the fact that it ran fine in China and not on Twitter says more about the online community than about M$.
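For what it's worth, here's a toy sketch of how this kind of poisoning works in general. This is a hypothetical illustration, not Microsoft's actual code: if a bot stores raw user input and samples its replies from what it has absorbed, a coordinated group can flood it until their phrase dominates the corpus.

```python
import random

class NaiveChatBot:
    """A toy bot that learns from users with no filtering or moderation."""

    def __init__(self):
        self.learned_phrases = []  # everything users say gets stored verbatim

    def learn(self, message: str) -> None:
        # No blocklist, no moderation queue: raw input goes straight in.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # Replies are sampled from whatever the bot has absorbed so far.
        if not self.learned_phrases:
            return "hi!"
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
# Three normal messages, then a coordinated flood of one toxic phrase.
for msg in ["hello!", "nice day", "what's up"] + ["TOXIC SPAM"] * 97:
    bot.learn(msg)

# Most of the stored corpus is now the attacker's phrase, so most
# replies will echo it back.
poisoned = sum(p == "TOXIC SPAM" for p in bot.learned_phrases) / len(bot.learned_phrases)
print(f"poisoned fraction: {poisoned:.2f}")
```

With 97 of 100 stored phrases being the attacker's, 97% of replies echo the flood. The reported "repeat after me" trick would be an even more direct version of the same hole: input replayed as output with no filter in between.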
Posts: 301
Threads: 32
Thanks Received: 26 in 22 posts
Thanks Given: 24
Joined: Apr 2015
Reputation: 0
Despite them turning her into a racist and all that, I just want to say:
What a time to be alive! I mean shit, when I was 6 I had to call my friends on a landline.
Posts: 80
Threads: 3
Thanks Received: 14 in 9 posts
Thanks Given: 1
Joined: Feb 2016
Reputation: 0
I can't help but feel amused by the fact that it took trolls only a day to corrupt the bot and make a big scene out of it. No filters = No mercy.
Posts: 341
Threads: 16
Thanks Received: 48 in 40 posts
Thanks Given: 34
Joined: Sep 2015
Reputation: 6
(03-28-2016, 01:28 PM)Shervik Wrote: Despite them turning her into a racist and all that, I just want to say:
What a time to be alive! I mean shit, when I was 6 I had to call my friends on a landline.
I remember not being allowed to call my cousin often because he lived in another area code, which would lead to higher charges on our phone bill.