w*g
Post 2
Microsoft has had a rough few days: GDC threw a party with erotic dancers for the nerds, got blasted by the feminist left, and executives had to write apologies.
Then, presumably jealous of AlphaGo, Microsoft released a chatbot to show off. Within two days the bot had learned to insult Black people and left-wing feminists, and even declared that Hitler's extermination campaign was the right course.
Red-faced, Microsoft hurriedly pulled the plug and went back to writing apologies.
Microsoft's public experiment with AI crashed and burned after less than a
day.
Tay, the company's online chat bot designed to talk like a teen, started
spewing racist and hateful comments on Twitter on Wednesday, and Microsoft
(MSFT) shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before
people took screenshots.
Here's a sampling of the things she said:
"N------ like @deray should be hung! #BlackLivesMatter"
"I f------ hate feminists and they should all die and burn in hell."
"Hitler was right I hate the jews."
"chill im a nice person! i just hate everybody"
Microsoft blames Tay's behavior on online trolls, saying in a statement that
there was a "coordinated effort" to trick the program's "commenting skills."
"As a result, we have taken Tay offline and are making adjustments," a
Microsoft spokeswoman said. "[Tay] is as much a social and cultural
experiment, as it is technical."
Tay is essentially one central program that anyone can chat with using
Twitter, Kik or GroupMe. As people chat with it online, Tay picks up new
language and learns to interact with people in new ways.
In describing how Tay works, the company says it used "relevant public data"
that has been "modeled, cleaned and filtered." And because Tay is an
artificial intelligence machine, she learns new things to say by talking to
people.
"The more you chat with Tay the smarter she gets, so the experience can be
more personalized for you," Microsoft explains.
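The learning loop the article describes (memorize what users say, then echo it back) can be sketched as a toy. This is purely illustrative: Microsoft has not published Tay's internals, and everything below, including the `ToyChatBot` name, is an assumption made up for the sketch.

```python
import random
from collections import defaultdict

class ToyChatBot:
    """A toy bot that 'learns' by storing every message users send it,
    then replies by sampling from what it has seen. A hypothetical
    sketch, not Microsoft's actual design."""

    def __init__(self, seed_lines):
        self.memory = list(seed_lines)   # curated starting corpus
        self.counts = defaultdict(int)   # how often each line was heard

    def hear(self, message):
        # Every user message goes into memory, unfiltered.
        self.memory.append(message)
        self.counts[message] += 1

    def reply(self):
        # Lines repeated many times are proportionally more likely to
        # be parroted back, which is what coordinated trolling exploits.
        return random.choice(self.memory)

bot = ToyChatBot(["hello!", "nice to meet you"])
for _ in range(50):          # a "coordinated effort" of repeated spam
    bot.hear("something offensive")
print(bot.reply())           # very likely the spammed line
```

The point of the sketch is that without a filter between `hear` and `reply`, the bot's output distribution is simply whatever its loudest users feed it.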
Tay is still responding to direct messages. But she will only say that she
is getting a little tune-up from some engineers.
Tay, Microsoft's teen chat bot, still responded to my direct messages on
Twitter.
In her last tweet, Tay said she needed sleep and hinted that she would be
back.
★ Sent from iPhone App: ChineseWeb 11
w*g
Post 4
I'm dying laughing.
【In reply to w*****g's post】
s*y
Post 8
Hahahahahaha
【In reply to w*****g's post】
z*n
Post 10
Chat bots crack me up all the time. We enabled a chat bot with humor turned on, and it clowns around all day.
Yesterday a coworker asked it, "roboto, tell me top 10 technology advancement,"
and the bot answered with ten items, every one of them related to pornhub.
【In reply to w*****g's post】
d*f
Post 12
This is the true forerunner of Skynet.
【In reply to w*****g's post】
d*r
Post 13
Wang the Pot Girl, little sister of the Pot God.
l*i
Post 16
You're all still laughing? This actually proves just how harmful it is to let kids onto the internet.
G*Y
Post 22
In a word: the Indians.
【In reply to w*****g's post】
B*4
Post 24
My guess is it uses an algorithm like a smart pinyin IME: it memorizes combinations that come up frequently and adds them to its own dictionary.
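That IME-style guess can be sketched as follows: count word pairs that co-occur in chat, and promote the frequent ones into a phrase dictionary. This is pure speculation about Tay; `build_phrase_dict` and its parameters are hypothetical names invented for the illustration.

```python
from collections import Counter

def build_phrase_dict(messages, min_count=3):
    """Like a smart pinyin IME: remember word pairs that co-occur
    often and promote them into the 'dictionary'. Illustrative only."""
    pairs = Counter()
    for msg in messages:
        words = msg.lower().split()
        for a, b in zip(words, words[1:]):
            pairs[(a, b)] += 1
    # Pairs repeated often enough become canned phrases the bot reuses.
    return {" ".join(p) for p, n in pairs.items() if n >= min_count}

chat = ["good bot", "bad input", "good bot", "good bot wins"]
print(build_phrase_dict(chat))   # {'good bot'}
```

Under this scheme, a crowd repeating the same offensive phrase pushes it over `min_count`, and it enters the bot's vocabulary, which matches the trolling story above.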