https://blogs.microsoft.com/blog/2016/03...roduction/
Apparently the bot worked great in China, but once unleashed on Twitter, everything went f#@$% thanks to trolls.
I personally find AI fascinating, so I hope to see more attempts at it. Hopefully Microsoft doesn't bail out of AI because of this incident. It's their money, they can spend it on whatever R&D they want.
(03-27-2016, 11:32 PM)Axess Wrote: [ -> ]https://blogs.microsoft.com/blog/2016/03...roduction/
Apparently the bot worked great in China, but once unleashed on Twitter, everything went f#@$% thanks to trolls.
More like, people thinking outside of the box. If anything, this was a free beta test for Microsoft from the Chan sites. My God, all that money, and you can't even hire testers for this shit?
(03-27-2016, 11:56 PM)WeaponTheory Wrote: [ -> ] (03-27-2016, 11:32 PM)Axess Wrote: [ -> ]https://blogs.microsoft.com/blog/2016/03...roduction/
Apparently the bot worked great in China, but once unleashed on Twitter, everything went f#@$% thanks to trolls.
More like, people thinking outside of the box. If anything, this was a free beta test for Microsoft from the Chan sites. My God, all that money, and you can't even hire testers for this shit?
I guess trolling can be thinking outside the box. Sure.
Also, you seriously think Microsoft didn't test it? lol, it's Micro$oft.
This is not the first time something has gone wrong thanks to outside abuse. Even with M$ testing, there is only so much you can do with internal tests, so they expanded testing to the public. So yes, this was more like a beta, as was explained in the blog post I linked in the previous reply.
Microsoft Blog Wrote:As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups. We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience. Once we got comfortable with how Tay was interacting with users, we wanted to invite a broader group of people to engage with her. It’s through increased interaction where we expected to learn more and for the AI to get better and better.
The logical place for us to engage with a massive group of users was Twitter. Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay. Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time. We will take this lesson forward as well as those from our experiences in China, Japan and the U.S. Right now, we are hard at work addressing the specific vulnerability that was exposed by the attack on Tay.
If I had to guess, Microsoft's "idea of testing" the bot was probably feeding it information by either talking to it like it's five years old or treating it like a personal girlfriend. All it took was people with the simple mindset of throwing anything related to racism at it to make the bot shit the bed. Their testing sucked dick, chief.
Eh, they didn't account for immature trolls, I guess. Again, the bot ran for over a year in China without this kind of issue, but no one acknowledges that part.
Since the AI is not open source, there's no way for us to really know how the bot was abused and exploited. It's more than just sending it racist messages. If that's really all it took, then yeah, M$'s testing sucked, but the fact that it ran fine in China but not on Twitter says more about the online community than about M$.
despite them turning her into a racist and stuff, I just want to say:
What a time to be alive! I mean shit, when I was 6 I had to call my friends by landline.
I can't help but feel amused by the fact that it took trolls only a day to corrupt the bot and make a big scene out of it. No filters = No mercy.
(03-28-2016, 01:28 PM)Shervik Wrote: [ -> ]despite them turning her into a racist and stuff, I just want to say:
What a time to be alive! I mean shit, when I was 6 I had to call my friends by landline.
I remember not being allowed to call my cousin often because he lived in another area code, which would lead to higher charges on our phone bill