w*g
Floor 2
Microsoft has had a rough run lately: at GDC it put on a show of scantily-clad dancers for a room full of nerds, got blasted by feminist critics, and its executives had to write apologies.
Then, apparently envious of AlphaGo's spotlight, Microsoft released a chatbot to show off. Within two days the bot had learned to hurl abuse at Black people and left-wing women, and even declared that the Führer had been right to exterminate the Jews.
Red-faced, Microsoft hurriedly pulled the plug and went back to writing apologies.
Microsoft's public experiment with AI crashed and burned after less than a
day.
Tay, the company's online chat bot designed to talk like a teen, started
spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (
MSFT, Tech30) shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before
people took screenshots.
Here's a sampling of the things she said:
"N------ like @deray should be hung! #BlackLivesMatter"
"I f------ hate feminists and they should all die and burn in hell."
"Hitler was right I hate the jews."
"chill im a nice person! i just hate everybody"
Microsoft blames Tay's behavior on online trolls, saying in a statement that
there was a "coordinated effort" to trick the program's "commenting skills."
"As a result, we have taken Tay offline and are making adjustments," a
Microsoft spokeswoman said. "[Tay] is as much a social and cultural
experiment, as it is technical."
Tay is essentially one central program that anyone can chat with using
Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new
language and learns to interact with people in new ways.
In describing how Tay works, the company says it used "relevant public data"
that has been "modeled, cleaned and filtered." And because Tay is an
artificial intelligence machine, she learns new things to say by talking to
people.
"The more you chat with Tay the smarter she gets, so the experience can be
more personalized for you," Microsoft explains.
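Microsoft has not published how Tay's learning actually worked. As a rough, speculative illustration of a bot that "picks up new language" from whoever chats with it, a toy Markov-chain learner could look like this (all names and the training lines are invented for the sketch):

```python
from collections import defaultdict
import random

class ToyChatBot:
    """A minimal Markov-chain learner: every message it sees teaches it
    new word pairs, which it then reuses when generating replies."""

    def __init__(self):
        self.chain = defaultdict(list)  # word -> words seen right after it

    def learn(self, message):
        words = message.lower().split()
        for a, b in zip(words, words[1:]):
            self.chain[a].append(b)

    def reply(self, seed, max_words=10):
        word, out = seed.lower(), [seed.lower()]
        for _ in range(max_words - 1):
            followers = self.chain.get(word)
            if not followers:
                break
            word = random.choice(followers)  # frequent pairs win more often
            out.append(word)
        return " ".join(out)

bot = ToyChatBot()
bot.learn("trolls teach the bot bad phrases")
bot.learn("the bot repeats bad phrases verbatim")
print(bot.reply("the"))  # stitched entirely from word pairs users fed it
```

The point of the toy: it echoes back whatever word pairs users feed it, with no judgment about content, which is exactly why a "coordinated effort" by trolls could steer such a system.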
Tay is still responding to direct messages. But she will only say that she is getting a little tune-up from some engineers.
Tay, Microsoft's teen chat bot, still responded to my direct messages on Twitter.
In her last tweet, Tay said she needed sleep and hinted that she would be
back.
★ Sent from iPhone App: ChineseWeb 11
w*g
Floor 4
Laughing so hard I can't breathe.
★ Sent from iPhone App: ChineseWeb 11
【Quoting w*****g's post】
: Microsoft has had a rough run lately: at GDC it put on a show of scantily-clad dancers for a room full of nerds, got blasted by feminist critics, and its executives had to write apologies.
: Then, apparently envious of AlphaGo's spotlight, Microsoft released a chatbot to show off. Within two days the bot had learned to hurl abuse at Black people and left-wing women,
: and even declared that the Führer had been right to exterminate the Jews.
: Red-faced, Microsoft hurriedly pulled the plug and went back to writing apologies.
: Microsoft's public experiment with AI crashed and burned after less than a
: day.
: Tay, the company's online chat bot designed to talk like a teen, started
: spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (
: MSFT, Tech30) shut Tay down around midnight.
: The company has already deleted most of the offensive tweets, but not before
s*y
Floor 8
Hahahahahaha
z*n
Floor 10
Chat bots are often hilarious. We enabled a chat bot with humor turned on, and it clowns around all day.
Yesterday a coworker asked: roboto, tell me top 10 technology advancement
The bot answered with 10 items, every single one of them related to pornhub.
d*f
Floor 12
This is the true precursor of Skynet.
d*r
Floor 13
Wang Guomei — the Wok God's little sister.
l*i
Floor 16
You're all still laughing? This pretty much proves that letting kids onto the internet is seriously harmful.
G*Y
Floor 22
In a word: the Indians.
B*4
Floor 24
It probably uses something like a smart pinyin input method's algorithm: memorize the word combinations that show up frequently and add them to its own dictionary.
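Microsoft has confirmed nothing of the sort, but if the guess above is right — a pinyin-IME-style frequency dictionary — the core of it is just a phrase counter. A speculative Python sketch (function names and sample messages are all invented):

```python
from collections import Counter

def update_dictionary(dictionary, message, n=2):
    """Count every n-word combination seen in a chat message."""
    words = message.lower().split()
    for i in range(len(words) - n + 1):
        dictionary[" ".join(words[i:i + n])] += 1

def remembered_phrases(dictionary, min_count=2):
    """Combinations seen at least min_count times get 'remembered',
    i.e. promoted into the bot's own phrase dictionary."""
    return {p for p, c in dictionary.items() if c >= min_count}

d = Counter()
for msg in ["good bot says hi", "bad bot says hi", "my bot says hi"]:
    update_dictionary(d, msg)
print(remembered_phrases(d))  # only the combinations that recur survive
```

On this model the bot has no opinion at all — if trolls repeat an ugly phrase often enough, it crosses the frequency threshold and enters the dictionary just like anything else.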