w*g
Post #2
Softy (Microsoft) has had a rough few days: at GDC it hired erotic dancers for a party full of guys, got torn apart by feminist critics, and its executives had to write public apologies.
Then, apparently envious of AlphaGo's glory, it put out a chatbot to show off, and within two days the bot had learned to trash black people and leftist women, and to declare that the Führer had been on the right track back in the day.
Red-faced, Softy hurriedly pulled the plug and went back to writing apologies.
Microsoft's public experiment with AI crashed and burned after less than a
day.
Tay, the company's online chat bot designed to talk like a teen, started
spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (
MSFT, Tech30) shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before
people took screenshots.
Here's a sampling of the things she said:
"N------ like @deray should be hung! #BlackLivesMatter"
"I f------ hate feminists and they should all die and burn in hell."
"Hitler was right I hate the jews."
"chill im a nice person! i just hate everybody"
Microsoft blames Tay's behavior on online trolls, saying in a statement that
there was a "coordinated effort" to trick the program's "commenting skills."
"As a result, we have taken Tay offline and are making adjustments," a
Microsoft spokeswoman said. "[Tay] is as much a social and cultural
experiment, as it is technical."
Tay is essentially one central program that anyone can chat with using
Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new
language and learns to interact with people in new ways.
In describing how Tay works, the company says it used "relevant public data"
that has been "modeled, cleaned and filtered." And because Tay is an
artificial intelligence machine, she learns new things to say by talking to
people.
"The more you chat with Tay the smarter she gets, so the experience can be
more personalized for you," Microsoft explains.
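To see why a "coordinated effort" can derail a bot that learns from whoever talks to it, here is a deliberately naive sketch. This is nothing like Microsoft's actual system and every name in it is made up; it only illustrates the failure mode: a bot that stores user phrases unfiltered and replays them is trivially poisoned by repetition.

```python
import random

class NaiveChatBot:
    """Toy bot that 'gets smarter' by memorizing user phrases
    and replaying them later. Illustrative only."""

    def __init__(self):
        self.seen_phrases = []

    def chat(self, user_message):
        # Learn: remember whatever users say, with no filtering.
        self.seen_phrases.append(user_message)
        # Respond: echo back a phrase learned from some earlier user.
        return random.choice(self.seen_phrases)

bot = NaiveChatBot()
# A "coordinated effort" just means many users feeding the same input:
for _ in range(100):
    bot.chat("something offensive")
reply = bot.chat("hello")
# The flooded phrase now dominates the learned pool, so most
# replies will echo it back.
```

The missing ingredient is any filter between "seen" and "said"; once learning is driven directly by user input, whoever sends the most input controls the output.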
Tay is still responding to direct messages. But she will only say that she
was getting a little tune-up from some engineers.
Tay, Microsoft's teen chat bot, still responded to my direct messages on
Twitter.
In her last tweet, Tay said she needed sleep and hinted that she would be
back.
★ Sent from iPhone App: ChineseWeb 11
w*g
Post #4
Laughing so hard I can't breathe.
s*y
Post #8
Hahahahaha
z*n
Post #10
Chatbots are goofy all the time. We enabled a chat bot with humor turned on, and it clowns around all day.
Yesterday a coworker asked: roboto, tell me top 10 technology advancement
The bot answered with 10 items, every single one related to pornhub.
d*f
Post #12
This is the true forerunner of Skynet.
d*r
Post #13
Wang Guomei — Guoshen's little sister.
l*i
Post #16
You're all still laughing? This actually proves pretty conclusively that letting kids onto the internet is seriously harmful.
G*Y
Post #22
One word: Indians.
B*4
Post #24
My guess is it used an algorithm like a smart pinyin input method: memorize the combinations that keep showing up and add them to its own dictionary.
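That guess — an algorithm like a smart pinyin IME that memorizes frequently occurring combinations — could be sketched as bigram counting with a frequency threshold. This is purely illustrative (there is no public detail on Tay's internals, and the function and threshold here are invented):

```python
from collections import Counter

def learn_phrases(messages, threshold=3):
    """Count adjacent word pairs across messages; 'memorize' any pair
    seen at least `threshold` times, the way an IME promotes a hot
    phrase into its user dictionary."""
    counts = Counter()
    for msg in messages:
        words = msg.split()
        for a, b in zip(words, words[1:]):
            counts[(a, b)] += 1
    return {pair for pair, n in counts.items() if n >= threshold}

msgs = ["repeat this phrase"] * 3 + ["one off line"]
learned = learn_phrases(msgs)
# ("repeat", "this") and ("this", "phrase") each appeared 3 times,
# so both enter the dictionary; the one-off pairs do not.
```

Under this scheme, trolls repeating a slogan past the threshold would get it "memorized" — which matches how the bot picked up the flooded phrases.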