w*g
Post #2
Softy (Microsoft) has had a rough few days: at GDC it rounded up the nerds to watch exotic dancers, got torn apart by the left-wing women, and its executives had to write apologies.
Apparently green with envy over AlphaGo, Softy put out a chatbot to show off. Within two days the bot had learned to rant about Blacks and leftist women, and even declared that the Führer's extermination of the Jews was the right course.
Red-faced, Softy hurriedly pulled the plug and went back to writing apologies.
Microsoft's public experiment with AI crashed and burned after less than a
day.
Tay, the company's online chat bot designed to talk like a teen, started
spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before
people took screenshots.
Here's a sampling of the things she said:
"N------ like @deray should be hung! #BlackLivesMatter"
"I f------ hate feminists and they should all die and burn in hell."
"Hitler was right I hate the jews."
"chill im a nice person! i just hate everybody"
Microsoft blames Tay's behavior on online trolls, saying in a statement that
there was a "coordinated effort" to trick the program's "commenting skills."
"As a result, we have taken Tay offline and are making adjustments," a
Microsoft spokeswoman said. "[Tay] is as much a social and cultural
experiment, as it is technical."
Tay is essentially one central program that anyone can chat with using
Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new
language and learns to interact with people in new ways.
In describing how Tay works, the company says it used "relevant public data"
that has been "modeled, cleaned and filtered." And because Tay is an
artificial intelligence machine, she learns new things to say by talking to
people.
"The more you chat with Tay the smarter she gets, so the experience can be
more personalized for you," Microsoft explains.
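From the description, Tay folds whatever users say straight back into her own repertoire. Here's a toy sketch of why that design is fragile — purely illustrative, not Microsoft's actual implementation; the class `NaiveLearningBot` and its methods are made up for the example:

```python
import random
from collections import defaultdict

class NaiveLearningBot:
    """Toy bot that, like Tay reportedly did, folds user phrases
    straight into its response pool with no content filter."""

    def __init__(self):
        self.responses = defaultdict(list)  # keyword -> learned replies

    def learn(self, message: str) -> None:
        # Every user message becomes a candidate reply for each of its words.
        for word in message.lower().split():
            self.responses[word].append(message)

    def reply(self, message: str) -> str:
        # Echo back something previously learned for any overlapping word.
        for word in message.lower().split():
            if self.responses[word]:
                return random.choice(self.responses[word])
        return "tell me more!"

bot = NaiveLearningBot()
bot.learn("cats are great")
print(bot.reply("I love cats"))  # -> "cats are great"
```

With no filter between `learn` and `reply`, a coordinated group feeding the bot the same hateful phrases would dominate its response pool within hours — which is essentially what the trolls did.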
Tay is still responding to direct messages. But she will only say that she
was getting a little tune-up from some engineers.
Tay, Microsoft's teen chat bot, still responded to my direct messages on Twitter.
In her last tweet, Tay said she needed sleep and hinted that she would be
back.
★ Sent from iPhone App: ChineseWeb 11
w*g
Post #4
Laughing my head off.
【In reply to w*****g's post above】
s*y
Post #8
Hahahahaha
【In reply to w*****g's post above】
z*n
Post #10
Chatbots are often hilarious. We enabled a chat bot with humor turned on, and it clowns around all day.
Yesterday a coworker asked it, "roboto, tell me the top 10 technology advancements."
The bot answered with 10 items, every one of them related to Pornhub.
【In reply to w*****g's post above】
d*f
Post #12
This is the true precursor to Skynet.
【In reply to w*****g's post above】
d*r
Post #13
"Wang Guo-mei" — the Go God's little sister (a play on AlphaGo).
l*i
Post #16
You're all still laughing? This actually proves conclusively that letting kids onto the internet does serious harm.
G*Y
Post #22
One word: the Indians.
【In reply to w*****g's post above】
B*4
Post #24
It probably uses an algorithm like a smart pinyin input method: memorize the combinations that keep showing up and add them to its own dictionary.
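That frequency-counting guess can be sketched roughly like this — a speculative illustration only; the function `learn_phrases` and its threshold are invented for the example, not anything Microsoft has disclosed:

```python
from collections import Counter

def learn_phrases(messages, threshold=2):
    """Count adjacent word pairs across input messages and keep
    combinations seen at least `threshold` times, like an IME
    promoting frequent character combinations into its dictionary."""
    pairs = Counter()
    for msg in messages:
        words = msg.lower().split()
        pairs.update(zip(words, words[1:]))
    return {" ".join(p) for p, n in pairs.items() if n >= threshold}

msgs = ["tay is cool", "tay is evil", "trolls spam tay", "tay is evil"]
print(sorted(learn_phrases(msgs)))  # -> ['is evil', 'tay is']
```

If the bot works anything like this, repetition is all a troll needs: spam the same phrase enough times and it crosses the threshold into the bot's "dictionary."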