Scammed out of 4.3 million yuan in 10 minutes? Here are the common "AI fraud" tactics!
On May 22, a typical case of telecom fraud carried out with artificial intelligence (AI) became a trending topic on Sina Weibo, drawing wide public attention.
According to the "Ping An Baotou" WeChat public account, on May 8 the telecom and cyber crime investigation bureau of the Baotou Public Security Bureau disclosed a case of telecom fraud committed with AI technology. The victim, surnamed Guo, the legal representative of a technology company in Fuzhou, East China's Fujian province, was cheated out of 4.3 million yuan ($611,000) in 10 minutes, according to the police.
According to the police bulletin from "Ping An Baotou", at noon on April 20 Guo received a WeChat video call from someone who appeared to be a friend of his; the scammer had used AI face-swapping and voice-cloning technology to impersonate the friend. During the call, the "friend" said that an acquaintance of his was bidding on a project in another city and needed a deposit of 4.3 million yuan that had to be transferred between corporate accounts, so he asked to route the money through the account of Guo's company. The scammer asked for Guo's bank card number, claimed he had already transferred 4.3 million yuan into Guo's account, and sent Guo a screenshot of a bank transfer receipt via WeChat. Trusting his friend, and believing he had already verified his identity over video, Guo transferred 4.3 million yuan to the designated bank card in two payments without checking whether the money had actually arrived. Only when he phoned his friend afterward did Guo learn he had been scammed.
"From start to finish he never mentioned borrowing money. He said he would transfer the money to me first and then have me transfer it on to his friend's account. He also made a video call, and I confirmed the face and the voice in the video, so I let my guard down," Guo said.
Fortunately, after the case was reported, police and banks in Fuzhou and Baotou quickly activated a payment-stop mechanism and successfully intercepted 3.3684 million yuan. The remaining 931,600 yuan had already been transferred and is now subject to retrieval efforts.
Photo: "Ping An Baotou" WeChat public account
According to the Global Times, China is drafting relevant legislation. On April 11, the Cyberspace Administration of China released draft management measures for generative AI services for public comment. The draft requires providers of generative AI products or services to comply with laws and regulations and to respect social morality and public order, and it prohibits the illegal acquisition, disclosure, or use of personal information, privacy, and trade secrets.
Tactic 1: Voice synthesis
Scammers extract samples of a person's voice from sources such as recorded spam calls, then synthesize speech from those samples so that the faked voice can fool the person on the other end.
Tactic 2: AI face swapping
A familiar face wins trust more easily than a voice alone. Using AI face-swapping technology, scammers can impersonate anyone and then "confirm" their identity over a video call. They first analyze the information people post online, use AI to screen for suitable targets based on the intended scheme, and then deploy the swapped face during the video call to gain the victim's trust.
Tactic 3: Forwarding WeChat voice messages
After stealing a WeChat account, scammers "borrow money" from the owner's contacts. To win their trust, they forward voice messages the account owner previously sent. Although WeChat has no built-in voice-forwarding feature, scammers manage it by extracting the voice files or by installing unofficial versions of the app (plugins).
Tactic 4: Screening victims with AI programs
Scammers use AI to analyze the information people post online and screen for victims suited to the intended scheme. Within a short time they can produce customized scam scripts and carry out precisely targeted fraud.
Police reminder: Before transferring money online, verify the other party's identity through multiple channels, such as a phone call. If you are scammed or encounter anything suspicious, preserve the evidence and call the 96110 anti-fraud hotline immediately.
Editor: Shang Zhen
Intern: Ge Jianuo
Sources: Global Times, Ping An Baotou, China News Service, China Youth Daily